The University of Pretoria, South Africa, in collaboration with researchers from Utrecht University and the University of New Mexico, used Lumivero's (previously Palisade) DecisionTools Suite software to create a low-cost, easily implementable model for estimating foodborne contamination and human infection from the avian influenza H5N1 virus in Egypt and Nigeria. The output of this surveillance model, combined with data generated by several international health organizations, will enable other African countries to better predict, monitor and intensify efforts to halt the spread of this highly contagious disease from animals to humans. The work is described in the article, Development of Disease-specific, Context-specific Surveillance Models: Avian Influenza (H5N1)-related Risks and Behaviours in African Countries, published in April 2015.
The avian influenza virus – or avian flu – is a fast-spreading infection that affects poultry and potentially people worldwide. While the virus has already adapted to other mammals, including ferrets, guinea pigs and cats, the risk to humans is still not completely understood. This makes monitoring and decreasing the rate of contact between infected poultry and humans critical – in particular, stopping exposure to the virus through the production and preparation processes of contaminated food. According to Dr. Folorunso Oludayo Fasina, a senior lecturer at the University of Pretoria’s Department of Production Animal Studies, it is critical to understand “how the virus gets into the food system, how it spreads and how it can be managed. To do this, we need risk assessment and exposure assessment, as well as a response model. Once we have this information, we can implement measures to stop the risks.”
The University’s Department of Production Animal Studies has significant expertise in disease modelling and risk prediction as part of its epidemiological work, which enabled Dr. Fasina and his colleagues to create a model for foodborne contamination specific to Africa, where the virus has already infected 12 countries. The team studied both biological and cultural aspects, including food processing, trade and cooking-related practices, and collected data from more than 375 Egyptian and Nigerian sites including homes, local producers, live bird markets, village and commercial abattoirs and veterinary agencies. According to Dr. Fasina, “We took a ‘from the farm to the fork’ approach, and considered farms as well as livestock markets.”
“Risk mitigation and risk prediction remain some of the most useful tools with which to effectively control the continuous perpetuation of outbreaks in newer territories and eradicate it where it currently exists,” explained Dr. Fasina. Building this new model wasn’t an easy task, however, taking nearly two years to complete. Most of the existing information was qualitative, which made it difficult to set quantitative parameters, and the quantitative data the team did find was inconsistent: it was often out of date, available only for other types of influenza, or had been censored by the government. After attending a training session for DecisionTools Suite in 2013, Dr. Fasina decided to use the software to generate the quantitative values the team needed.
The team considered several factors in their model, from the concentration levels of the virus in infected meat and the likelihood of contamination between infected and non-infected meat, to differences between genders and age groups with regard to risk exposure. “We asked a lot of questions to generate this data,” explained Dr. Fasina. “This generated a significant amount of output, which required sensitivity analysis and some triangulation.” As a first step, the team used the TopRank tool, part of the DecisionTools Suite, to analyze the sensitivity of the overall risk to each of the identified contributors. This helped the team understand which of the contributors were the most important.
Next, the team moved to the @RISK tool in the DecisionTools Suite to help predict the different ways the virus could spread. Using Monte Carlo simulation, @RISK can quantify the probabilities of different outcomes – or infection rates – occurring, as well as determine the optimal preventive measures to mitigate the risk of animal-to-person infection. The team used six statistical probability distributions within @RISK to represent different inputs – or risk factors – for their model. They combined the simulated outputs from @RISK with statistical analysis to complete the model, drawing on social data and outbreak information, including human demographic structures in Africa, socio-cultural and behavioral economics, and knowledge, attitudes and perceptions of risk within the countries being investigated.
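To make the mechanics concrete, a minimal Monte Carlo sketch of this kind of exposure model is shown below in Python. The distributions, parameter values and variable names (viral load, cross-contamination fraction, cooking reduction, dose-response constant) are illustrative assumptions only, not the study's actual inputs or @RISK's implementation.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo iterations

# Illustrative input distributions (hypothetical parameters, not the study's values)
viral_load = rng.lognormal(mean=2.0, sigma=0.8, size=N)    # virus concentration in infected meat
cross_contam = rng.beta(2, 8, size=N)                      # fraction transferred to clean meat
cooking_reduction = rng.uniform(0.90, 0.999, size=N)       # fraction of virus inactivated by cooking
handling_exposures = rng.poisson(3, size=N)                # contacts during preparation
dose_response = 1e-3                                       # hypothetical per-unit infection probability

# Residual dose reaching a consumer or handler, and the resulting infection risk
dose = viral_load * cross_contam * (1 - cooking_reduction) * handling_exposures
p_infection = 1 - np.exp(-dose_response * dose)            # exponential dose-response form

print(f"mean risk: {p_infection.mean():.2e}, 95th percentile: {np.percentile(p_infection, 95):.2e}")
```

In a model like this, sensitivity analysis amounts to asking which of the sampled inputs moves `p_infection` the most, which is the role TopRank and @RISK's tornado output play in the actual workflow.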
The results revealed numerous opportunities for the avian influenza virus to be spread, and found that the estimated risk for humans was higher than previously reported. “It is very easy for us to miss the influence of viral infections on a community, due to lack of awareness and under-reporting, so people may be more at risk than we’re aware of,” explained Dr. Fasina. “@RISK is a valuable tool to investigate these problems and do risk predictions either prospectively or retrospectively. Utilizing the outputs from models like this can help health policy planners and public health officials to take anticipatory measures to prevent future disasters associated with infectious diseases like the avian flu.”
Originally posted: Feb. 8, 2017
Updated: June 7, 2024
In new ventures there is often very little known about the market that is to be exploited. There is no sales history and often no experience delivering the product or service in question. Sometimes it might not even be known if the product that is the center of the business will work. Many of these unknowns can be resolved to some extent through market research and by trying less risky versions of technology before committing to a full blown venture.
However, the lack of business history is paramount. With no history, it is hard to be sure a business model is suitable. A chosen business model might be overly sensitive to typical random fluctuations in the assumed parameters. Compound this with the above sources of ambiguity and it becomes clear how the risk and uncertainty in new ventures differ from those in many other ventures.
It will always be the case that there are unknown events that could propel a new venture to high levels of success. There will also be other unknowns that can kill a new venture. The economist Frank Knight called these factors “uncertainties” to distinguish them from “risks,” to which probabilities can be attached.
Until the sensitivity of a model to randomness is known, the business model itself is an uncertainty. This adds further difficulty to new venture management. Researchers Dr Clint Steele and Kourosh Dini of Australia’s Swinburne University of Technology decided to use Lumivero's (previously Palisade) @RISK software to resolve this issue. With Darcy Naunton of the Australian venture capital firm Adventure Capital they put this idea to the test. The plan: use @RISK to apply probabilistic design to a new venture.
Of the many ventures going on at Adventure Capital, a new mobile app called Omny by 121Cast was selected for the @RISK analysis. Omny by 121Cast allows users to manage their online listening of internet radio and other audio services for total customization. The business plan contained a number of unknowns that first had to be identified.
Identifying the sources of potential randomness was just the start. The key to probabilistic design is modelling the flow of variances. That is – the model needs to be set up in such a way that when random fluctuations in the inputs occur, they have the correct effect upon the outputs.
This is perhaps obvious to many users of @RISK. However, many business plans and their financial models are not put together in such a way. Often, fixed numbers that “seem right” to the entrepreneur are entered into each cell of a spreadsheet. The model will balance, but if one changes a cell value for, say, the market size, the other cells are unlikely to change much. If they do, it is unlikely to be in a logical manner. An increase in sales, for example, may not cause a corresponding increase in operations costs, asset purchases, or administration costs.
Creating accurate models for these relationships was the first step. This was easy for the typical issues. The relationship between market size and the cost of customer support is a good example. However, other relationships are trickier to model. For instance, how does an increase in sales affect the position of an app in an app store listing?
Given the risks and uncertainties with new ventures, it would be expected that entrepreneurs would be accustomed to providing some insight into the nature of the expected randomness. However, until someone really needs to do this, it’s not a skill that is developed much.
Fortunately, the ability to specify a distribution by percentiles in @RISK makes this challenge much easier. A distribution in @RISK is simply a range of values that describes the different possibilities of a particular unknown. Distributions can be defined either with statistical parameters such as minimum, maximum, mean, or standard deviation, or by using percentiles. Percentiles are often much more intuitive for people to understand. Asking for a range that is as likely to contain the actual value as a roulette wheel biased 90% towards red is to come up “red” makes it easier for people to draw on their experience. This method will sound familiar to those who have read Douglas Hubbard’s book How to Measure Anything. Including the expected value (treated as the 50th percentile) made it easier to capture skew.
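As an illustration of the percentile approach, the sketch below fits a lognormal distribution to an elicited 10th/50th/90th percentile range, roughly analogous in spirit to @RISK's alternate-parameter distributions. The elicited numbers are hypothetical, and a lognormal is only one plausible choice of shape.

```python
import numpy as np
from scipy import stats

def lognormal_from_percentiles(p10, p50, p90):
    """Fit a lognormal to elicited 10th/50th/90th percentiles.
    Exact for the median; the spread is averaged if the elicited
    range is not symmetric in log space."""
    z90 = stats.norm.ppf(0.90)                       # ~1.2816
    mu = np.log(p50)
    sigma = (np.log(p90) - np.log(p10)) / (2 * z90)
    return stats.lognorm(s=sigma, scale=np.exp(mu))

# Hypothetical elicitation: monthly active users in year one
dist = lognormal_from_percentiles(p10=5_000, p50=20_000, p90=120_000)
samples = dist.rvs(size=100_000, random_state=1)
print(f"median: {np.median(samples):,.0f}  mean: {samples.mean():,.0f} (right-skewed)")
```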
With the model done and the distributions specified, it should have been time to understand the risks that were once uncertainties. However, the application of probabilistic design to new business ventures with @RISK offered a new benefit.
One of the hallmarks of design is something called coevolution. This occurs when, in the process of trying to solve a design problem, that problem becomes better understood. In other words, sometimes you need to try solving a problem even if you do not fully understand it. This process then enables you to understand it.
New business ventures are ideal for coevolution, and @RISK can help an entrepreneur “design” a business plan.
“Because you have to create a model that allows for the flow of variances, a lot more thought needs to go into how the proposed business will run. This makes the entrepreneur think more about the specifics of the business and removes even more unknowns again,” notes venture capital manager Darcy Naunton. “I now know a lot more about this business than I ever would have otherwise.”
This extra insight into a business plan that @RISK forces upon an entrepreneur provides for a much deeper understanding. This is an understanding that also allows an investor to have more faith in an entrepreneur and their plan.
When it comes to new ventures, there will always be unknowns. That’s just something the entrepreneur and the investor need to deal with. It is what entrepreneurs do – deal with uncertainty. However, @RISK can be used to eliminate uncertainty about the sensitivity of the business model to expected randomness. This is different from removing uncertainty, or randomness, entirely – that’s impossible. But by mitigating some of the guesswork around the sensitivity of a business plan to various external fluctuations, an entrepreneur can focus uncertainty management skills on a smaller area and apply those skills more intensely.
“The defining feature of an entrepreneur is dealing with unknowns, and using @RISK allows an entrepreneur to really focus this skill on fewer unknowns,” says researcher Clint Steele after reflecting upon the process of applying @RISK to the Omny by 121Cast business plan. “The greater understanding of the business that comes from this process is an extra unforeseen benefit.”
Both Naunton and Steele agree that this is new territory for the venture capital industry and entrepreneurialism. “We are still thinking about the best way to present the information to investors so that they understand the extra knowledge we have, but without confusing anyone who needs to look at a lot of proposals in a short period,” says Naunton. “The variety of succinct graphs and reports in @RISK really aid in this communication effort.”
The best way to incorporate probabilistic design with @RISK will be the next research project for Clint Steele and Kourosh Dini. Notes Dini: “This is a very new approach, and it might take some time, but I think that this new standard of due diligence in business investment, from start-ups to corporate ventures, once properly developed, will likely become the norm. It’s good that there are companies out there like Adventure Capital that are willing to try this.”
Originally published: Oct. 6, 2022
Updated: June 7, 2024
What do banks, bond-rating agencies and homeowners in places like Las Vegas have in common? They all grossly misjudged risk and, as a result, made bad decisions during the recent housing bubble.
That’s why Dr. Arnoldo Camacho, a professor at the highly regarded INCAE Business School in Alajuela, Costa Rica, incorporates Lumivero’s (previously Palisade) @RISK software in his MBA courses – Finance I and Financial Institutions and Capital Markets.
“The Finance course focuses on the creation of value through efficient decision making, which involves risk analysis,” says Dr. Camacho, who has taught at INCAE for 22 years. “In the Financial Institutions and Capital Markets courses, an in depth analysis of credit risk requires the estimation of the probability of default of issuers of debt.
“In both courses, @RISK is used for simulation and sensitivity analysis.”
Camacho was introduced to @RISK through INCAE, where his courses typically attract 60 to 65 students.
When asked why it is important to expose his students to @RISK, Camacho says, “It is easy to handle, and it allows students to move from uncertainty to risk analysis, which requires critical thinking.”
Originally published: Oct. 7, 2022
Updated: June 7, 2024
When people talk about leverage, they are most likely talking about financial instruments or mechanical advantage. But when Phil Rogers talks about leverage, he could very well be referring to the use of powerful PC software. Phil teaches Managerial Decision Making to MBA candidates at the University of Houston’s C. T. Bauer College of Business where most of his students are managers with industry experience—and this is where leverage from software comes in.
These students need analytical tools that offer both a short learning curve and the ability to accurately model their real-life problems. Phil uses Lumivero's DecisionTools Suite in the classroom because “the leverage you get by using these tools in Excel is phenomenal. My students can very quickly learn and apply the techniques they see in class to difficult decision-making problems faced by their firms.”
To complement the quantitative analysis provided by the DecisionTools Suite, Phil uses a stand-alone tool called Expert Choice, which is excellent at dealing with the more qualitative objectives and criteria that go into the decision-making process.
Given their industry experience, students in Phil’s decision-making classes have current problems they are dealing with on which they can use the newly learned tools and techniques. One team of students developed a model to find the optimal allocation of 1,000 turbines to five wind farms. Students working for a major oil company optimized the frequency of scheduled plant shutdowns. Yet another student, working for a giant natural gas enterprise, determined the most cost-effective use of its inventory in the face of uncertain demand for gas and the changing spot price for gas. And another student, working at one of Houston’s largest hospitals, developed a model to determine the optimal deposit for the hospital to collect from patients undergoing organ transplant operations given uncertainties in insurance coverage and the ultimate cost of the procedures.
Perhaps the students in greatest need of the “leverage” this software offers are the managers from Sinopec and CNPC, the two largest petrochemical companies in China. As part of their Executive MBA program offered in Beijing through the C. T. Bauer College of Business, these students have three days and evenings to learn the material normally taught over a semester back in Houston and, on the fourth day, to present the results of their application of what they have learned to decision-making problems they currently face. Phil reports, “They are able to do it, solving significant business problems.” And, he points out, that couldn’t happen without real leverage.
Originally published: Oct. 12, 2022
Updated: June 7, 2024
Many California produce farm operations use a rule-of-thumb to determine a hedge ratio for their seasonal productions. They often aim to contract 80% of their crop in advance to buyers at set prices, leaving the remaining 20% to be sold at spot prices in the open market. The rationale for this is based on many years of experience that indicates costs and a reasonable margin can be covered with 80% of production hedged by forward contracts. The hope is the remaining 20% of production will attract high prices in favorable spot markets, leading to substantial profits on sales. Of course, it is understood spot prices might not be favorable, in which case any losses could be absorbed by the forward sales.
Since the Recession of 2008, agricultural lenders and government regulators have recognized that many farm operators need to manage the risks to their margins and free cash flows, rather than simply focusing on revenue risks. A more quantitative analysis is needed to determine risks in the agricultural industry.
Agribusiness experts from Cal Poly conducted a risk management analysis using @RISK, and found the 80% hedge ratio rule-of-thumb is not as effective as assumed. Growers do not profit from spot market sales over the long run. The analysis shows growers are better off in the long-term selling as much of their product as possible using forward contracts.
Agriculture in California is big business. In 2013, nearly 80,000 farms and ranches produced over 400 commodities – the most valuable being dairy, almonds, grapes, cattle, and strawberries – worth $46.4 billion. Almost half of this value came from exports. The state grows nearly half of the fruits, nuts, and vegetables consumed in the United States. Yet agriculture is traditionally one of the highest risk economic activities.
Steven Slezak, a Lecturer in the Agribusiness Department at Cal Poly, and Dr. Jay Noel, the former Agribusiness Department Chair, conducted a case study on an iceberg lettuce producer that uses the rule-of-thumb approach to manage production and financial risks. The idea was to evaluate the traditional rule-of-thumb method and compare it to a more conservative hedging strategy.
The grower uses what is known as a ‘hedge’ to lock in a sales price per unit for a large portion of its annual production. The hedge consists of a series of forward contracts between the grower and private buyers which set in advance a fixed price per unit. Generally, the grower tries to contract up to 80% of production each year, which stabilizes the grower’s revenue stream and covers production costs, with a small margin built in.
The remaining 20% is sold upon harvest in the ‘spot market’ – the open market where prices fluctuate every day, and iceberg lettuce can sell at any price. The grower holds some production back for spot market sales, which are seen as an opportunity to make large profits. “The thinking is, when spot market prices are high, the grower can more than make up for any losses that might occur in years when spot prices are low,” says Slezak. “We wanted to see if this is a reasonable assumption. We wanted to know if the 80% hedge actually covers costs over the long-term and if there are really profits in the spot market sales. We wanted to know if the return on the speculation was worth the risk. We found the answer is ‘No’.”
This is important because growers often rely on short-term borrowing to cover operational costs each year. If free cash flows dry up because of operational losses, growers become credit risks, some cannot service their debt, agricultural lending portfolios suffer losses, and costs rise for everybody in the industry. Is it a sound strategy to swing for the fences in the expectation of gaining profits every now and then, or is it better to give up some of the upside to stabilize profits over time and to reduce the probability of default resulting from deficient cash flows?
Slezak and Noel turned to @RISK to determine an appropriate hedge ratio for the grower.
For inputs, they collected data on cultural and harvest costs. Cultural costs are the fixed costs “necessary to grow product on an acre of land,” such as seeds, fertilizer, herbicides, water, fuel etc., and tend to be more predictable. The researchers relied on the grower’s historical records and information from county ag commissioners for this data.
Harvest costs are much more variable, and are driven by each season’s yield. These costs include expenses for cooling, palletizing, and selling the produce. To gather data on harvest costs for the @RISK model, Slezak and Noel took the lettuce grower’s average costs over a period of years along with those of other producers in the area, and arrived at an average harvest cost per carton of iceberg lettuce. These costs were combined with overhead, rent, and interest costs to calculate the total cost per acre. Cost variability is dampened due to the fact that fixed costs are a significant proportion of total costs, on a per acre basis.
The next input was revenue, defined as yield per acre multiplied by the price of the commodity. Since cash prices vary, the grower’s maximum and minimum prices during the previous years were used to determine an average price per carton. Variance data were used to construct a distribution based on actual prices, not on a theoretical curve.
To model yield, the grower’s minimum and maximum yields over the same period were used to determine an average. Again, variance data were used to construct a distribution based on actual yields.
StatTools, included in DecisionTools Suite, was used to create these distribution parameters. @RISK was used to create a revenue distribution and inputs for the model. With cost and revenue simulation completed, the study could turn next to the hedge analysis.
Since the question in the study is about how best to manage margin risk – the probability that costs will exceed revenues – to the point where cash flows would be insufficient to service debt, it was necessary to compare various hedge ratios at different levels of debt to determine their long-term impact on margins. @RISK was used to simulate combinations of all costs and revenue inputs using different hedge ratios between 100% hedging and zero hedging. By comparing the results of these simulations in terms of their effect on margins, it was possible to determine the effectiveness of the 80% hedging rule of thumb and the value added by holding back 20% of production for spot market sales.
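The structure of that comparison can be sketched as follows; the prices, yields and costs below are invented for illustration and are not the grower's actual figures.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 50_000

# Illustrative per-acre assumptions (not the study's actual data)
forward_price = 9.00                                 # $/carton locked in by forward contract
spot_price = rng.lognormal(np.log(9.00), 0.35, N)    # volatile open-market price, $/carton
yield_cartons = rng.triangular(600, 800, 950, N)     # cartons per acre
total_cost = rng.normal(6_500, 400, N)               # cultural + harvest + overhead, $/acre

def margin(hedge_ratio):
    """Margin per acre when hedge_ratio of production is sold forward and the rest at spot."""
    revenue = yield_cartons * (hedge_ratio * forward_price + (1 - hedge_ratio) * spot_price)
    return revenue - total_cost

for h in (0.0, 0.8, 1.0):
    m = margin(h)
    print(f"hedge {h:>4.0%}: mean margin ${m.mean():7,.0f}/acre, P(loss) = {(m < 0).mean():.1%}")
```

Running the margin distribution across the full range of hedge ratios, rather than three points, is what allows the loss probabilities of the 80% rule of thumb to be compared against fuller hedging.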
Unsurprisingly, with no hedge involved and all iceberg lettuce being sold on the spot market, the simulation showed that costs often exceeded revenues. When the simulation hedged all production, avoiding spot sales completely, costs rarely exceeded revenues. Under the 80% hedge scenario, revenues exceeded costs in most instances, but the probability of losses significant enough to result in cash flows insufficient to service debt was uncomfortably high.
It was also discovered that the 20% of production held back for the purpose of capturing high profits in strong markets generally resulted in reduced margins. Only in about 1% of the simulations did the spot sales cover costs, and even then the resulting profits were less than $50 per acre. Losses due to this speculation could be as large as $850 per acre. A hedging strategy designed to yield home runs instead resulted in a loss-to-gain ratio of 17:1 on the unhedged portion of production.
Slezak and his colleagues are reaching out to the agribusiness industry in California and throughout the Pacific Northwest to educate it on the importance of margin management in an increasingly volatile agricultural environment. “We’re trying to show the industry it’s better to manage both revenues and costs, rather than emphasizing maximizing revenue,” he says. “While growers have to give up some of the upside, it turns out the downside is much larger, and there is much more of a chance they’ll be able to stay in business.”
In other words, the cost-benefit analysis does not support the use of the 80% hedged rule-of-thumb. It’s not a bad rule, but it’s not an optimal hedge ratio.
Professor Slezak is a long-time user of @RISK products, having discovered them in graduate school. In 1996, “a finance professor brought the software in one day and said, ‘if you learn this stuff you’re going to make a lot of money,’ so I tried it out and found it to be a very useful tool,” he says. Professor Slezak has used @RISK to perform economic and financial analysis on a wide range of problems in industries as diverse as agribusiness, energy, investment management, banking, interest rate forecasting, education, and in health care.
Originally published: Oct. 13, 2022
Updated: June 7, 2024
Students in corporate finance classes at Illinois State University’s College of Business rely on Lumivero’s @RISK (previously Palisade). Dr. Domingo Castelo Joaquin teaches Advanced Corporate Finance in a special MBA program for executives and managers from Archer Daniels Midland and other neighboring firms such as Caterpillar, and he also teaches undergraduate seniors in the University’s College of Business. Both groups of students use @RISK to learn advanced financial analysis techniques using Monte Carlo simulation.
Dr. Joaquin holds advanced degrees in theoretical statistics and business administration, and he is an advocate of Monte Carlo simulation and @RISK because they offer a pragmatic approach to dealing with the uncertainties that are inherent in any business decision: “Since the future is unknown,” he says, “it would be implausible to use point estimates to represent future values of critical success factors. Range estimates are an improvement, but @RISK is even better because it is able to reflect the important fact that some values within the plausible range may be more likely than others. More than that, the correlation capability of @RISK allows the analyst to model the fact that co-movements of variables in the same direction may not be as likely as co-movements in the opposite direction.”
Dr. Joaquin’s teaching with @RISK takes place in a computer lab, which allows each student hands-on opportunities to follow his lectures and demonstrations. He guides them in using @RISK to address an array of thorny problems in real options analysis––his area of research specialization––as well as in capital budgeting, cash budgeting, capacity planning, and international portfolio diversification.
Whether in the undergraduate class or in the MBA class, certain questions frequently arise, such as “How do I know I have the right distribution? What happens if somebody else at the company gets a different answer?” He responds to these concerns by explaining that variation in results is normal. “What is important is to come up with credible distributions and credible conclusions. Arriving at this requires a mixture of objective analysis, subjective projections based on experience, and informed judgment from the experts.”
Another frequently asked question is “How do I convince my boss that the simulation is credible?” Responding to this, Dr. Joaquin offers a strong piece of strategic advice: “Enlist the participation of different departments in the creation of the simulation from the very beginning. Get their input in making the original assumptions. Early on, be sure to involve the higher-ups in painting the big picture and in evaluating the general structure of the model––you’ll find, for instance, that using the tornado diagram to evaluate the sensitivity of the outcomes to each of the input variables in your simulation is particularly useful in focusing their attention on the factors that matter most to their organizations.”
Dr. Joaquin’s students at both levels are eager to explore all the capabilities of @RISK. He attributes this to the software’s user friendliness. “Just like Excel itself, @RISK does its work without involving you in its computational complexities. So students who may have been scared of stats and data analysis before can feel perfectly at ease creating their simulations––and so, they want to do more.”
When exam time comes, Dr. Joaquin cooks up a zinger of a business case and asks the students to fire up their computers and send him the deliverables—their executive memos with recommendations and the spreadsheet work in @RISK that led to these conclusions.
His cases are complex, just like situations in everyday corporate life. But Dr. Joaquin believes that @RISK puts the resolutions of the exam case within reach of most of his students. “@RISK represents a major advance in the business analyst’s toolkit. By sparing the analyst the trouble of having to reinvent simulation in an Excel environment, @RISK allows the analyst to focus on structuring problems that make managerial sense and on interpreting results for the purpose of supporting executive decisions.”
Updated: May 2, 2024
Originally Published: Oct. 28, 2022
Power plants can estimate short-term demand for electricity with a fair degree of confidence when dealing with traditional, controllable energy sources, such as water and natural gas. However, this is not the case when dealing with solar- and wind-generated power, which can vary significantly on a day-to-day basis. Maintaining system stability where solar and wind play a significant role in generating electricity is a growing challenge facing utility operators. Lumivero’s (previously Palisade) risk analysis software was used to create a planning methodology to assist utilities that are making major investments in these power sources, which is detailed in the minicourse Integrating Renewables with Electricity Storage, presented by Roy Nersesian, a professor at the Leon Hess Business School at Monmouth University and author of the book, Energy Risk Modeling.
Solar power is inherently unreliable, fluctuating with time of day and degree of cloudiness, and wind power is a victim of weather patterns. Because solar and wind outputs are uncertain, they are uncontrollable. This means they require virtually 100% backup with fossil, nuclear, and hydro sources of power to prevent blackouts. Consider the repercussions for renewable energy output of a solar eclipse combined with calm winds, as occurred in Europe in 2015.
Solar and wind can be transformed into more controllable sources of power if there is enough electricity storage to compensate for periods when there is too much cloud cover, or when wind speeds drop significantly or are too high for wind turbine operation. “Electricity storage can be compared to an inventory of products – products are stored when demand is low, then become available when demand goes back up,” explained Nersesian. In a similar way, if unpredictable solar and wind output can be directed to and from electricity storage of sufficient capacity, these energy sources can be transformed into a more controllable and reliable power supply.
Unfortunately, conventional electricity storage batteries cannot handle a utility-sized system of both conventional and renewable power supplies. However, pumped storage plants – or gravity batteries – can store and supply electricity to cover the mismatch between electricity supply and demand.
A pumped storage plant consists of two water reservoirs at different heights, fitted with reversible pump-turbines, which shift water between these reservoirs. Electricity is generated by the gravity flow of water, moving from the upper reservoir to the lower reservoir. At times of high electrical demand, water is released into the lower reservoir through a turbine, which generates electricity. At times of low electrical demand, surplus electricity is then used to pump water back to the higher reservoir.
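As a rough, back-of-the-envelope illustration of how much energy such a gravity battery can hold, the calculation below applies the standard potential-energy relationship E = ρ·V·g·h·η to a hypothetical reservoir; all dimensions and the round-trip efficiency are assumptions, not figures from the minicourse.

```python
# Illustrative estimate of the energy a pumped-storage reservoir can deliver.
rho = 1000            # water density, kg/m^3
g = 9.81              # gravitational acceleration, m/s^2
head = 300            # height of the upper reservoir above the turbines, m
volume = 4_000_000    # usable water volume, m^3 (e.g. 2 km^2 surface x 2 m drawdown)
round_trip_eff = 0.75 # fraction of pumping energy recovered on generation

energy_joules = rho * volume * g * head * round_trip_eff
energy_mwh = energy_joules / 3.6e9   # 1 MWh = 3.6e9 J
print(f"deliverable energy ≈ {energy_mwh:,.0f} MWh")   # ≈ 2,500 MWh for these assumptions
```

This is why reservoir depth, surface area and the height of the upper reservoir appear among the storage design inputs described below.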
With @RISK, utility operators can now more easily determine if – and how – they can leverage pumped storage plants, and transform wind and solar into reliable sources of power.
To create the methodology, Nersesian used @RISK simulation software and its Fit Distribution feature to determine the output of a system of solar and wind farms. He modelled different sizes of farms, located at different sites, to obtain a probability distribution of the power supply. For the purposes of his research, he started with three 1-megawatt solar sites and three 1-megawatt wind sites, then expanded to farms with larger outputs to model a utility-sized enterprise. The wind sites were statistically linked with the Copula function in @RISK 7 to correlate variables, while the solar sites were statistically linked using conditional probabilities. However, a single fitted distribution didn’t adequately model the various outputs. “So, I got creative and segmented the data so that each provided a ‘best fitting’ curve, then added them together to create a single, ‘best fitting’ curve,” said Nersesian.
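The wind-site linkage can be sketched with a Gaussian copula, one common way to correlate variables that each follow their own marginal distribution; the Weibull marginals and correlation matrix below are hypothetical and are not drawn from Nersesian's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
N = 10_000

# Hypothetical Weibull marginals for three 1 MW wind sites (output as a fraction of capacity)
marginals = [stats.weibull_min(c=2.0, scale=0.40),
             stats.weibull_min(c=2.2, scale=0.35),
             stats.weibull_min(c=1.8, scale=0.45)]

# Gaussian copula: correlated normals -> uniforms -> each site's own marginal distribution
corr = np.array([[1.00, 0.70, 0.60],
                 [0.70, 1.00, 0.65],
                 [0.60, 0.65, 1.00]])
z = rng.multivariate_normal(np.zeros(3), corr, size=N)
u = stats.norm.cdf(z)
site_output = np.column_stack([m.ppf(u[:, i]) for i, m in enumerate(marginals)])

combined = site_output.clip(0, 1).sum(axis=1)   # MW from the three 1 MW sites
print(f"mean combined output: {combined.mean():.2f} MW, 5th percentile: {np.percentile(combined, 5):.2f} MW")
```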
Once the variables for wind and solar power output were understood, Nersesian needed to model the fluctuations of electricity supply and demand for fossil fuels and nuclear plants, using data inputs including both base and peak power loads, as well as seasonal adjustments. He also used RISKOptimizer to understand the desired capacity of natural gas plants. However, this still wasn’t enough to create a reliable planning methodology.
For the final step, Nersesian needed to model electricity storage requirements to determine sufficient storage for utilities to transform solar and wind to a reliable source of energy. This required data relating to the design of a pumped storage plant, including the height of the upper reservoir above the turbines, as well as the depth and surface area of the reservoirs.
The final results of this research demonstrated that uncontrollable, renewable energy sources such as solar and wind can be successfully integrated into large power systems without impacting system stability, if appropriate electricity storage via pumped storage plants is available.
“Maintaining system stability where solar and wind play a significant role in generating electricity is a growing challenge facing utility operators,” said Nersesian. “Hopefully this will spur interest in the described methodology for planning purposes when utilities think about making major investments in renewable power sources.”
Updated: May 2, 2024
Originally Published: Oct. 28, 2022
The US Government spent $24 billion on farm programs in 2000. The benefits were distributed through a variety of income-enhancing and risk-reducing policies. For instance, some programs provided price protection, others a direct subsidy, and others subsidized the purchase of crop insurance.
How do these programs individually and collectively reduce agricultural risk? Are these programs having the desired effects? Is the money helping producers reduce risk and thereby providing a disincentive to purchase crop insurance? And are there better ways to address agricultural risk?
Researchers at Cornell University and Purdue University recently completed a study that assesses the impacts of US farm programs on farm returns and answers some of the above questions. The researchers used Lumivero's (previously Palisade) @RISK to create a simulation model that illustrates the effects of government policy tools on farm incomes.
To assess the impacts of government policy tools on farmland risk and return the researchers developed an economic model for farm income and expenses based on a representative parcel of land and crop mix. @RISK allowed the researchers to model the uncertainties associated with crop yield and price. After running base-line simulations, the researchers added the individual farm programs into the model to determine their impacts. Finally they combined all the programs and crop insurance into the model. They compared the simulation outcomes to determine the impact of the various payment/subsidy mechanisms.
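A minimal sketch of that baseline-versus-programs comparison might look like the following; the parcel, price, yield, cost and program parameters are invented for illustration and do not reproduce the researchers' model.

```python
import numpy as np

rng = np.random.default_rng(11)
N = 50_000

# Hypothetical parcel: yield and price are negatively related, as in most crop markets
yield_bu = rng.normal(150, 25, N).clip(min=0)                                     # bu/acre
price = (4.0 - 0.006 * (yield_bu - 150) + rng.normal(0, 0.4, N)).clip(min=1.0)    # $/bu
cost = 450.0                                                                      # $/acre

baseline = yield_bu * price - cost

# Stylized program features (illustrative only, not actual policy parameters)
loan_rate = 3.80          # price floor acting like a loan deficiency payment
direct_payment = 25.0     # $/acre fixed subsidy
with_programs = yield_bu * np.maximum(price, loan_rate) - cost + direct_payment

for name, r in (("baseline", baseline), ("with programs", with_programs)):
    print(f"{name:>14}: mean ${r.mean():6.0f}/acre, std ${r.std():5.0f}, P(loss) {(r < 0).mean():.1%}")
```

Comparing the two return distributions, rather than just their means, is what lets the analysis show both the income boost and the halving of risk described next.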
The @RISK simulation model demonstrated that a combination of all government programs would raise average farm incomes by almost 45%. Additionally, the programs would reduce the economic risks associated with farming by half. Most importantly, the model allowed researchers to examine how the programs interact with one another to alter the return distribution.
Producers must make decisions regarding cash rental bids, crop mixes, and even farmland purchases based upon expected returns. These decisions are based on the expectations for market prices, yields and costs – all uncertain elements.
Assistant Professor Brent Gloy of Cornell University’s Applied Economics and Management Department was one of the researchers. “@RISK was vital to the simulation model. It allowed us to incorporate uncertainties and run random simulations on the various scenarios.” He adds, “@RISK’s ability to correlate distributions of random variables was essential to the model. Additionally, we used @RISK’s output statistics to compare the various model scenarios.”
The study quantifies how government programs impact each other and subsidized crop insurance. According to Dr. Gloy, “Our results indicate that the risk reduction provided by the standard programs significantly reduce the value that risk averse producers derive from crop insurance programs.” He adds, “@RISK was instrumental to the simulation model. It allowed us to incorporate uncertainties and correlations, and to systematically evaluate each of the farm policy tools.”
The study was recently published in the Review of Agricultural Economics. For more information about the study, contact Brent Gloy at 607-255-9822 or bg49@cornell.edu.
Updated: May 2, 2024
Originally Published: Oct. 28, 2022
AgustaWestland designs, develops and manufactures helicopters. The process for bringing a new product from initial idea to market is long and complex, and therefore involves many uncertainties. The company uses @RISK to develop financial business cases and feasibility studies to produce in-depth analysis so that the senior management team can make informed decisions about which products to develop.
AgustaWestland is an Anglo-Italian helicopter company owned by Italy’s Finmeccanica. It provides rotorcraft systems design, development, production and integration capabilities, along with in-depth training and customer support, to military and commercial operators around the world, offering the widest range of advanced rotorcraft available.
The Risk Assessment, Feasibility Analysis and Business Cases department is responsible for supporting the decision-making process for key company initiatives and opportunities from the initial stages of verifying their feasibility. This requires the development of structured financial business cases to justify investment against economic returns, and to monitor results.
The department develops AgustaWestland’s risk assessment methodology, procedures and tools for the new products in line with international best practices. It then ensures that these are applied and used consistently to determine all possible outcomes when the company is evaluating opportunities.
New helicopters require large investments in order to design, develop, test, certify and bring the product to market – a process that can last three to five years. The Risk Assessment, Feasibility Analysis and Business Cases team uses @RISK (part of the DecisionTools Suite risk and decision analysis toolkit), to undertake risk analysis to determine the financial feasibility of developing any new product, preparing a financial business case for approval by the company and its shareholders.
Previously the department worked only with what it calls a ‘deterministic’ Excel model. The model’s inputs include: non-recurring costs such as engineering studies for the design and development of the new product; prototype manufacture; flight tests; and certification. Other inputs considered include: the number of helicopters planned for manufacture per annum over a 20-year period; recurring helicopter unit costs per system / subsystem; unit prices for different helicopter configurations; the elasticity curve (to show how a change in price affects demand); and the spare parts business model (the purchaser of a new helicopter will also need to buy replacement parts during its lifecycle, estimated at around 20 years, and this is built in to profitability and cash flow). Financial parameters such as inflation, the weighted average cost of capital, bank interest rates, and exchange rates in different currencies are also accounted for.
The deterministic model provides financial outputs such as revenue, Earnings Before Interest and Taxes (EBIT), net profit, Net Present Value (NPV), Internal Rate of Return (IRR), financial break-even, etc., and shows how these vary depending on specific input values.
The deterministic model can help to predict a single set of results for the main model outputs. However the economic situation cannot be predicted with any great accuracy, especially when the business cases are based on a period of 20 years. In these situations, after evaluating the achievable results with the deterministic model, it is crucial to also take uncertainty into account. This makes it possible to evaluate in advance how changes to the inputs will impact key financial outputs and, with this insight, implement mitigation actions.
@RISK allows the team at AgustaWestland to apply different probability curves, including triangular, PERT and normal, to the inputs of the model. The Monte Carlo analysis enabled by @RISK provides a better view of the model itself – that is, it enables the team to determine the accuracy of the forecasts and how to improve the business, both in terms of feasibility and of financial results.
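A simplified Monte Carlo NPV sketch of this kind of business-case model is shown below; the PERT and triangular inputs, the discount rate and all monetary figures are hypothetical and bear no relation to AgustaWestland's actual numbers.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 20_000
years = 20
discount_rate = 0.09   # illustrative weighted average cost of capital

def pert(low, mode, high, size):
    """PERT distribution via its underlying Beta(alpha, beta) form (lambda = 4)."""
    alpha = 1 + 4 * (mode - low) / (high - low)
    beta = 1 + 4 * (high - mode) / (high - low)
    return low + rng.beta(alpha, beta, size) * (high - low)

# Illustrative inputs in millions per programme - not real figures
non_recurring = pert(400, 500, 700, N)                       # design, prototypes, tests, certification
units_per_year = rng.triangular(15, 25, 40, (N, years))      # helicopters sold per year over 20 years
margin_per_unit = rng.normal(3.0, 0.6, (N, 1))               # unit price minus recurring unit cost

cash_flows = units_per_year * margin_per_unit                # annual operating cash flow
discount = (1 + discount_rate) ** -np.arange(1, years + 1)
npv = (cash_flows * discount).sum(axis=1) - non_recurring

print(f"mean NPV: {npv.mean():.0f}M, P(NPV < 0): {(npv < 0).mean():.1%}")
```

Ranking which of these inputs drives most of the spread in NPV is exactly what the tornado graphs discussed next are used for.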
Using the tornado graphs generated by @RISK, AgustaWestland can see which inputs have the greatest effect on the financial outputs, and therefore require more attention. For example, specific discussions with the engineering department to determine mitigating actions could potentially keep recurring costs under control if the tornado graphs show that these have the greatest impact on the financial outputs presented in the business case.
“Working with such long timeframes is a key challenge because it is not possible to know for certain many of the parameters (inputs) that we use to determine the financial business case of a new product,” explains Vittorio Maestro, Head of Risk Assessment, Feasibility Analysis and Business Cases at AgustaWestland.
“Our use of the risk analysis element of Palisade’s DecisionTools Suite has enhanced our ability to assess, control and drive company decisions. We can now focus on the key activities that enable us to pursue the best product within the most appropriate financial timeframe,” adds his colleague Francesca Schiezzari, who uses @RISK to build similar financial business cases for a variety of company projects, including a Helicopter Training Centre and a Logistics Support Centre.
Infectious disease is an important cause of lost production and profits to beef cow-calf producers each year. Beef producers commonly import new animals into their herds, but often do not properly apply biosecurity tools to economically decrease risk of disease introduction. Dr. Michael Sanderson, a professor of Beef Production and Epidemiology at Kansas State University’s (KSU) College of Veterinary Medicine, wanted to address this issue by developing a risk management tool for veterinarians and beef cow-calf producers to assist in identifying biologically and economically valuable biosecurity practices, using @RISK.
The college was established in 1905, and has granted more than 5,000 Doctor of Veterinary Medicine degrees. Departments within the College of Veterinary Medicine include anatomy and physiology, clinical sciences, diagnostic medicine, and pathobiology. The college's nationally recognized instructional and research programs provide the highest standards of professional education. A rich, varied, and extensive livestock industry in the region, a city with many pets and a zoo, and referrals from surrounding states provide a wealth of clinical material for professional education in veterinary medicine.
Reproductive disease is an important cause of lost production and economic return to beef cow-calf producers, causing estimated losses of $400 to $500 million dollars per year. Because of the complex nature of the production system, the biologically and economically optimal interventions to control disease risk are not always clear. Dr. Sanderson and his team (including Drs. Rebecca Smith and Rodney Jones) utilized @RISK to model the probability and economic costs of disease introduction and the cost and effectiveness of management strategies to decrease that risk.
“For this project, @RISK was essential to model variability and uncertainty in risk for disease introduction and impact following introduction, as well as variability and uncertainty in effectiveness of mitigation strategies,” said Dr. Sanderson. “Further, @RISK was crucial for sensitivity analysis of the most influential inputs to refine the model and to identify the most important management practices to control risk. It was also valuable to aggregate results into probability distributions for risk and economic cost over one-year and ten-year planning periods.”
The project modelled the risk of introduction of the infectious disease Bovine Viral Diarrhea (BVD) into the herd, the impact of disease on the herd (morbidity, mortality, abortion, culling, lost weight) and economic control costs. These risks were aggregated over ten years to identify the optimal management strategy to minimize cost from BVD, accounting for both production costs and control costs.
Probability distributions included:
Target probabilities were used to produce the probability of exceeding a certain cost over one and ten years, to provide this figure as a single number for each management option, and to generate descending cumulative probability distributions for exceeding any particular cost value.
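The sketch below illustrates how such target probabilities and descending cumulative (exceedance) curves can be computed from simulated cost outcomes; the cost distributions and the management options named are hypothetical, not results from the KSU model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical simulated ten-year BVD cost per cow for two management options ($)
costs = {
    "no biosecurity":   rng.lognormal(np.log(40), 0.9, 100_000),
    "test + vaccinate": rng.lognormal(np.log(25), 0.6, 100_000) + 10,  # +$10 fixed control cost
}

threshold = 60.0   # illustrative cost level of interest, $/cow
for option, c in costs.items():
    exceed = (c > threshold).mean()                  # target probability: P(cost > threshold)
    print(f"{option:>16}: P(cost > ${threshold:.0f}) = {exceed:.1%}")

# Descending cumulative curve for one option: P(cost > x) over a grid of cost values
x = np.linspace(0, 200, 41)
curve = [(costs["test + vaccinate"] > xi).mean() for xi in x]
```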
As a result of the risk identification insight gained from the research, Dr. Sanderson and his team were able to improve disease management and controls by identifying:
“Our utilization of @RISK gave us the ability to account for complex aggregation of inputs and their variability and uncertainty to produce full-outcome probability distributions for more informed decision making,” said Dr. Sanderson. “Further, the ability to use research data from multiple parts of the beef production system and combine those results into a model that accounts for the complexity of the production systems allows recognition of emergent phenomena and decision making based on the full system, rather than only one part. The flexibility to customize outputs provided the most valuable information for decision making.”
The University of Victoria (UVic), a national and international leader in many areas of critical research, participated in a study funded by Health Canada that looked at the human exposure to carcinogens in various demographics. The UVic team used @RISK, Palisade’s risk analysis software, to model the differences in Lifetime Excess Cancer Risk (LECR) for Canadians based on contaminants found in food and beverages. The results revealed notable differences in cancer risks for several different demographics, and are detailed in the thesis, Geographic Exposure and Risk Assessment for Food Contaminants in Canada, by Roslyn Cheasley, a Master’s student with the Department of Geography at UVic.
The University of Victoria is a public research university in British Columbia, Canada. Ranked one of the top 250 universities in the world, UVic is a national and international leader in many areas of critical research, offering students education that is complemented by applied, clinical and work-integrated learning opportunities. The University participated in a study funded by Health Canada that looked at the human exposure to carcinogens in various demographics.
While news headlines regularly report on acute health issues relating to food and beverages, such as E. coli outbreaks and salmonella poisoning, very little is known about the adverse health issues caused by the longer-term intake of contaminants in those foods and beverages – including carcinogens. The CAREX Canada Project, funded by the Canadian Partnership Against Cancer, was launched to better understand the environmental and occupational exposures to substances associated with cancer, and subsequently provide support for exposure reduction strategies and cancer prevention programs. "The goal of the Project was to analyze all publicly available data and build a website that provided local and regional communities with tools to help determine if their geographic areas were at risk," explains Roslyn Cheasley, a Master’s student with the Department of Geography at the University of Victoria. "While the site was launched in 2012, they were concerned that by 2014, the data was already out of date. The University of Victoria made up part of the team that undertook a new study to update the information, and ensure that health officials and other decision makers had all the information they might need to indicate if there could be future health problems."
The UVic team focused on the environmental aspects of the study, looking at potential exposure to carcinogens via air, dust, water, food and beverages. They reviewed 92 different substances that were considered carcinogenic, probably carcinogenic, or potentially carcinogenic. These were then narrowed down to five substances specifically for the food and beverage study: arsenic, benzene, lead, PCB (polychlorinated biphenyls) and PERC (tetrachloroethylene). "Up to this point in time, all analysis had been done from a deterministic point of view, which wasn’t particularly helpful as it didn’t enable us to understand the full range of potential contamination and which populations were more or less at risk," said Cheasley. "We decided to take things up a notch when we updated the data, and upgrade to a probabilistic analysis model based on Monte Carlo simulation. We wanted to estimate the range and frequency of possible daily contaminant intakes for Canadians, as well as associate these intake levels with lifetime excess cancer risk. This is where @RISK came into the equation."
Palisade’s @RISK enabled the team to easily and effectively determine the concentration of carcinogenic elements in the identified food and beverage products, as well as learn if certain demographics were more at risk from dietary patterns than others.
The first challenge to building the new model was pulling together all existing information, as elements of the data were in different formats (e.g. Excel, Access and Stata), as well as in different physical (offline) locations. Then the team had to manage the vast quantity of that information: the resulting 1.5 million rows of data was too much to easily manipulate, sort and manage without corrupting the results.
The next challenge related to the data for the food and beverage types. The team had analyzed the dietary patterns of approximately 35 thousand Canadians, using three different categories: geographic location, gender and income levels. They’d also identified 60 whole foods for the model, from eight food groups: meat, fish, dairy, fruit, vegetables, rice/cereals, grain/nuts and beverages. However, the data for these specified foods came from three different sources, with each using a different form of measurement. According to Cheasley, “The problem we had was how to bring all of these components together in a way that would provide a comprehensive but usable outcome. We needed to be able to filter the data into different dietary patterns as well as different demographics, then marry it each time with the five different carcinogenic substances."
Palisade's @RISK software solved these problems, enabling the team to use PERT distributions to easily determine the minimum, mean and maximum concentration of the five carcinogenic elements in the identified food and beverage products. They were also able to see the output of the different dietary patterns and determine if certain demographics were more at risk than others. “I really appreciated how easy @RISK was to use – I didn’t need to be a statistician to understand it," said Cheasley. "Plus I loved the instantaneous flexibility. If I needed to run a new simulation, the results were immediately visible – and easily understandable – in a graph or chart.” For this study, each of the 125 different simulations was run 50 thousand times, to ensure the most accurate results (and smoothest possible graphs).
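A stripped-down sketch of this kind of PERT-based intake model appears below; the contaminant concentrations, consumption amounts, body weights and cancer slope factor are placeholders for illustration, not values from the CAREX or UVic data.

```python
import numpy as np

rng = np.random.default_rng(8)
N = 50_000   # matching the 50,000 iterations per simulation mentioned above

def pert(low, mode, high, size):
    """PERT distribution via its underlying Beta(alpha, beta) form (lambda = 4)."""
    alpha = 1 + 4 * (mode - low) / (high - low)
    beta = 1 + 4 * (high - mode) / (high - low)
    return low + rng.beta(alpha, beta, size) * (high - low)

# Hypothetical inputs for one demographic and one food/contaminant pairing
concentration = pert(0.02, 0.08, 0.25, N)    # mg of contaminant per kg of food
daily_intake = pert(0.05, 0.15, 0.40, N)     # kg of that food eaten per day
body_weight = rng.normal(75, 12, N)          # kg

daily_dose = concentration * daily_intake / body_weight   # mg per kg body weight per day
slope_factor = 1.5                                         # hypothetical cancer slope factor
lecr = daily_dose * slope_factor                           # lifetime excess cancer risk proxy

print(f"median LECR: {np.median(lecr):.2e}, 95th percentile: {np.percentile(lecr, 95):.2e}")
```

Repeating a run like this for each combination of contaminant, food group and demographic is what produces the grid of simulations whose results are summarized next.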
The outputs of the @RISK model revealed to the UVic team that of the five tested contaminants, arsenic showed the greatest difference between urban and rural estimated Lifetime Excess Cancer Risk (LECR). In addition, LECR was estimated to be higher for men vs. women in Canada for all five contaminants, with an emphasis on males in British Columbia from the dietary intake of arsenic. When based on income level, the model predicted LECR being higher for low and middle incomes from the dietary intake of arsenic, benzene, lead and PERC. However, high-income populations were more likely to have higher LECR from the dietary intake of PCBs.
"I hope that local health officials will be able to use the results of this model to determine if they should do a more detailed study in their own particular regions. For example, what are males eating in British Columbia that impacts their dietary intake of arsenic, and is there a real risk of arsenic in specific foods," added Cheasley.
At first glance, the Fort McMurray Airport in Fort McMurray, Alberta, Canada would appear ordinary; it is a small airport authority, comprising a single runway and a small terminal, and serving a rural region removed from major metropolitan centers. However, this airport is the gateway to Canada’s oilsands and faces a number of unique challenges related to servicing one of the largest industrial construction projects in the world today. These include unprecedented passenger growth, staffing constraints, infrastructure constraints, pressures from various stakeholder groups, the introduction of daily international flights, shifts in politics and even risks posed by the potential development of oil reserves beneath the airport site. Thus, the Fort McMurray Airport Authority (FMAA) turned to Revay and Associates Ltd. to assess the potential risks it faces as an organization as it tries to keep pace with the growth of the region, and Revay turned to @RISK to help with this analysis.
Dr. Mark Krahn, consultant at Revay and Associates, knew that the Fort McMurray Airport (FMA) was an unusual case when the Revay risk team tackled it. “It’s a small town airport in a city that has doubled in population in the past decade.” This jump in population is thanks to the Athabasca Oilsands, the second-largest oil deposit in the world after Saudi Arabia’s. These oilsands represent recoverable reserves of 170 billion barrels and estimated total reserves of 1.8 trillion barrels of this essential energy source. Most major oil companies, along with the accompanying industries and contractors, have rushed to take advantage of this opportunity. As a result, Fort McMurray has become a boomtown with skyrocketing house prices, low unemployment rates, accommodation shortages and high salaries.
The oil boom has accelerated air traffic into the area through increased “fly-in-fly-out” traffic of camp-based workers as well as increasing local traffic. As a result, the FMA is the fastest-growing airport in Canada. In 2012, the FMA had a record throughput of approximately 1 million passengers. With an annual capacity of only 250,000 passengers in the existing terminal, the FMA desperately needed to expand. Plans for a new terminal, with a capacity of 1.5 million passengers, began in 2010. The new $250M terminal is currently wrapping up construction and is set to open in Spring 2014.
Due to the unique context of the FMA and its future direction, a number of risk factors needed to be considered around the expansion project, as well as around the success of the organization as a whole.
Revay and Associates Ltd. is a consulting firm that specializes in risk management, contract strategy, conflict management and overall project management. Initially Revay was engaged to lead the project risk assessment for the FMA expansion project.
The project risk assessment was focused specifically on the capital cost and schedule uncertainty of the new terminal construction project. Subsequently, Revay was asked to lead the enterprise risk management (ERM) assessment of the FMA. According to Krahn, formal ERM is a relatively new management discipline that has evolved over the past few years and includes methods and processes used by organizations to manage risks and seize opportunities related to the achievement of their objectives and corporate strategy. “ERM is much broader than project risk,” says Krahn. “Clients must first identify what the strategy of the organization is, what their mission is, and what their key success drivers (KSDs) and objectives are. In order to assess the risk, it is imperative that the organization be clear on what these KSDs and objectives are, and then we can determine the risks impacting their success.”
The identified enterprise risks and opportunities are often categorized according to a number of key areas, including:
*Operational risk
*Reputational risk
*Strategic risk
*Personnel Safety and Health risk
*Financial risk
*Environmental / Containment risk
*Productivity / Morale risk
“One of the biggest challenges of ERM is the many different categories or ‘buckets of risk’,” says Krahn. “Senior management needs to understand what the top overall risks are in order to implement effective mitigation actions and to understand the overall risk exposure. This poses an apples-to-oranges conundrum, as there may be several high-level risks in various categories, making it difficult to draw comparisons between them. Being able to compare risks across the different categories is critical to understanding what the organization’s top risks truly are and to focusing the organization on mitigating them.”
To address this dilemma, Revay’s approach to the ERM assessment at FMAA had two novel aspects:
1. Martin Gough, Revay’s Risk Practice Lead, developed a methodology to allow for direct comparison between risks of different categories. Although risks could still be classified in various categories, a common impact currency, termed “Utils” or Utility, was used in the risk evaluation to allow for direct comparison between risks.
2. Rather than the more limited descriptive or qualitative approach typical of ERM, Revay applied quantitative techniques using @RISK to produce more informative probabilistic risk details.
The FMAA has a strong leadership team and had developed a comprehensive strategic plan prior to completing the ERM assessment, including vision, mission, values, Key Success Drivers (KSDs) and 5-year rolling goals. A collateral document, the annual Corporate Business Plan, outlines one-year corporate objectives and performance indicators. The FMAA has identified four KSDs, each assigned a weighting (percentage) factor:
*Optimized Customer Experience (40%)
*To Lead a High Performing Airport Team (25%)
*To Achieve Environmentally Responsible, Sustainable and Profitable Growth (20%)
*To Foster Effective Stakeholder Relationships (15%)
Each of the four KSD areas has a series of specific and related objectives, each objective with its own sub-weighting.
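As a rough sketch of how such a weighted hierarchy might be represented, the snippet below encodes the four KSDs and their published weights; the objective names and sub-weights are hypothetical placeholders, not the FMAA's actual objectives.

```python
# Sketch of the weighted KSD hierarchy. KSD names and weights are from the
# case study; objective names and sub-weights are hypothetical.
ksds = {
    "Optimized Customer Experience": {
        "weight": 0.40,
        "objectives": {"Reduce terminal wait times": 0.6, "Improve wayfinding": 0.4},
    },
    "To Lead a High Performing Airport Team": {
        "weight": 0.25,
        "objectives": {"Fill key operational roles": 0.7, "Deliver staff training": 0.3},
    },
    "To Achieve Environmentally Responsible, Sustainable and Profitable Growth": {
        "weight": 0.20,
        "objectives": {"Open new terminal on schedule": 1.0},
    },
    "To Foster Effective Stakeholder Relationships": {
        "weight": 0.15,
        "objectives": {"Hold quarterly stakeholder forums": 1.0},
    },
}

# Sanity checks: KSD weights and each set of sub-weights should sum to 100%
assert abs(sum(k["weight"] for k in ksds.values()) - 1.0) < 1e-9
for k in ksds.values():
    assert abs(sum(k["objectives"].values()) - 1.0) < 1e-9
```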
In order to assess the enterprise risk and opportunity around each of the KSDs and their specific objectives, Revay facilitated off-site workshops with attendance by all of the key stakeholder groups, including the FMAA board of directors, FMAA administration and operations personnel, local government, provincial government, airlines, funders, insurer, various FMAA consultants, and representation from the expansion project management team. “Having good representation from all key stakeholder groups is critical to the success of the ERM assessment,” Krahn explains.
As part of the workshop, Revay presented and developed the risk scoring matrix with the attendees. The attendees therefore had direct input into the matrix and learned how it would be applied in the risk evaluation, ensuring consistency of process. Because the methodology was non-standard, this step proved invaluable.
Two variables are evaluated for each risk and opportunity: probability and impact. In this application of ERM, however, impact is measured on a single scale of Utils rather than the various category-specific impact descriptors common in traditional ERM. Each KSD area was provided with an initial credit of 10,000 Utils. In the workshop, the teams then protected this balance through risk assessment, reduction and mitigation, and improved it through opportunity identification and capitalization planning. The impact evaluation, expressed as a percentage or in Utils, translates directly into a reduction of the weighting of the individual corporate objectives and KSDs. Risks with a higher detrimental impact on the objectives are scored with a higher impact than those with a lower detrimental impact.
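The Util ledger described above can be sketched in a few lines of Python. The 10,000-Util initial credit per KSD comes from the case study; the register entries and their Util values below are entirely hypothetical.

```python
# Minimal sketch of a Util-based balance for one KSD area.
# Risks reduce the balance; opportunities improve it. All entries are made up.
INITIAL_CREDIT = 10_000  # Utils granted to each KSD area (from the case study)

risks = [
    ("Terminal opening delayed by staffing shortage", 1_800),
    ("Stakeholder opposition to fee increases", 600),
]
opportunities = [
    ("New daily international route adds passenger revenue", 900),
]

balance = INITIAL_CREDIT
for description, impact_utils in risks:
    balance -= impact_utils
for description, benefit_utils in opportunities:
    balance += benefit_utils

print(f"Remaining Util balance for this KSD: {balance:,}")  # 8,500 in this sketch
```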
@RISK was used to model the uncertainty in both the probability scale and impact scale as determined by the scoring of each risk / opportunity. Instead of a single probability x impact result for each identified risk, @RISK allowed for the probabilistic range of outcomes to be determined for each identified risk (S-curve). Revay applied the “trigen” distribution (modified triangular) to model the range of both probability and impact for each risk. This quantitative information is much more informative for comparing risks, comparing KSDs, comparing categories of risk and assessing the overall ERM risk register.
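For illustration, the sketch below runs that probability-times-impact calculation as a Monte Carlo simulation. @RISK's "trigen" (modified triangular) distribution is approximated here with a plain triangular distribution for simplicity, and all three-point estimates are hypothetical rather than values from the FMAA assessment.

```python
# Monte Carlo sketch of the probabilistic risk scoring step.
# A plain triangular distribution stands in for @RISK's trigen distribution;
# all numeric inputs are hypothetical.
import numpy as np

rng = np.random.default_rng(seed=7)
n_iterations = 10_000

# Hypothetical three-point estimates for a single identified risk
probability = rng.triangular(0.10, 0.30, 0.60, n_iterations)    # chance the risk occurs
impact_utils = rng.triangular(500, 1_500, 4_000, n_iterations)  # impact in Utils if it occurs

# Risk exposure per iteration: probability x impact
exposure = probability * impact_utils

# The S-curve view: exposure at selected confidence levels
for p in (10, 50, 90):
    print(f"P{p}: {np.percentile(exposure, p):,.0f} Utils")
```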
Two key results come from this unique approach to ERM. First, using quantitative Monte Carlo assessment produces probabilistic risk results. This information is important because it ties the uncertainty to a probability or confidence level: in addition to the mean (or P50) result, the entire range of results is known with the associated probability values (e.g., at the 90% or 10% confidence level). Second, assessing risk on a single Util scale allows for direct comparison between risks and between KSDs. This is critical because it allows management to focus mitigation and response actions on the risks that are rated highest overall.
Two figures are typical outputs of this type of assessment:
1. A probabilistic S-curve showing the risk profile (pre- vs. post-mitigation action) against the probability value (percent confidence). This comparison shows the benefit of risk mitigation.
2. A risk dashboard created by Revay to track and communicate the change in risk profile at quarterly intervals. This high-level view quickly allows an understanding of risk trends, areas of highest concern and the impact of mitigation.
In conclusion, the key benefits of this ERM methodology used to assess the risk around FMAA’s corporate strategy are (a) the confidence and understanding around the top individual risks impacting the organization, (b) an awareness of the top objectives and KSDs that are at highest risk, (c) an understanding of the risk trends over time, with and without mitigation, and in a probabilistic manner, and (d) a dynamic process that can be easily adjusted as an organization changes course in tune with the external environment in which it exists. Only through the use of @RISK and applying novel quantitative techniques was this achievable.