Application of @RISK’s Monte Carlo Distribution in Excel for Renewable Energy Risk Modeling

When building a manufacturing facility, a reliable and resilient power source is key. As social and governmental pressure to decarbonize the manufacturing sector intensifies, more companies are moving away from fossil fuel-powered plants – for example, a 2024 article in Automotive Manufacturing Solutions notes that major automobile manufacturers in Europe, Asia, and North America “continue to innovate and adopt green energy practices,” such as plants powered in whole or in part by solar, wind, and hydroelectric energy.

In addition to addressing sustainability, renewable energy can also result in more cost-effective operation in a time of global energy supply turbulence. A March 2024 article in the Engineering Science & Technology Journal observes that “[t]he integration of renewable energy into the manufacturing sector is not just a step towards mitigating environmental impact but also a strategic move that opens up a wealth of opportunities for innovation, competitiveness, and growth.”

However, making the switch to sustainable energy sources comes with its own set of risk factors. In a recent Lumivero webinar, “Illuminating Probabilistic Risk in Renewable Energy,” energy industry consultant Manuel Carmona walked attendees through modeling methods that can help evaluate the different types of operational and financial risks for renewable energy projects in manufacturing. In this case study, we’ll discuss the highlights of Carmona’s presentation, define and describe how to use Monte Carlo simulation, and present the risk modeling examples used to make better decisions in the field of renewable energy.

Key Points

"[@RISK] significantly enhances decision-making by enabling experts, practitioners, and professionals to swiftly construct probabilistic models using Monte Carlo simulation, effectively analyzing risk, exposure, and the impact of uncertainties in every business decision."

Manuel Carmona, MBA, PMI-RMP®

Using Monte Carlo Simulation to Determine the Feasibility of Construction of a Renewable Energy Plant for a Manufacturing Company

Background

Manuel Carmona is a certified Project Management Institute Risk Management Professional (PMI-RMP) with more than 25 years of experience in managing projects within the energy and technology sectors. As a trainer with EdyTraining, he has helped manufacturers utilize @RISK and XLSTAT to run Monte Carlo simulations using Excel – simulations that can be used for various types of probabilistic analysis.

Probabilistic analyses can be used to answer a wide range of questions raised at the outset of a renewable energy project, including how much power a plant can be expected to generate, how equipment failures affect its reliability, and whether the investment will pay off over time.

To generate these analyses, Carmona recommends building models using @RISK by Lumivero, a probabilistic risk analysis tool that lets you create Monte Carlo simulations while using Excel spreadsheets.

What Is Monte Carlo Simulation?

Monte Carlo simulation is a statistical analysis technique first developed by scientists working on the Manhattan Project during World War II. It’s used to create probabilistic forecasts that account for risk and random chance within complex systems. Finance, meteorology, insurance and defense are just a few of the industry sectors that make use of Monte Carlo simulations to inform decision making.

Powered by software such as @RISK, Monte Carlo simulation can quickly generate thousands of simulations using random numbers that account for a wide range of variables, generating many different outcomes along with the probability of their occurrence.

Slide from Carmona’s presentation. The text reads ’Monte Carlo Simulation. We use probability distributions to drive the collection of random number samples. Thousands of possible scenarios and their probability of occurrence are calculated in just a few seconds. We get advanced analytics and features like tornado plots, scatter plots, input-output sensitivities, correlation effect, stress analysis, scenario analysis etc.’
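As a minimal illustration of the mechanics (not Carmona’s actual model), the Python sketch below draws random values for a few hypothetical inputs over thousands of iterations and summarizes the resulting output distribution. Every variable name and distribution parameter here is an assumption for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # number of Monte Carlo iterations

# Hypothetical uncertain inputs (illustrative parameters only)
capex = rng.triangular(left=9e6, mode=10e6, right=12e6, size=N)   # build cost, EUR
daily_output = rng.normal(loc=10.0, scale=1.5, size=N)            # MWh per day
energy_price = rng.uniform(60, 120, size=N)                       # EUR per MWh

# One output per iteration: first-year revenue minus 10% of the build cost
annual_revenue = daily_output * 365 * energy_price
result = annual_revenue - 0.10 * capex

print(f"Mean outcome: {result.mean():,.0f} EUR")
print(f"5th-95th percentile: {np.percentile(result, 5):,.0f} to {np.percentile(result, 95):,.0f} EUR")
print(f"P(outcome < 0): {(result < 0).mean():.1%}")
```

Each run of the loop-free simulation above produces one full set of input values per iteration, which is the same pattern a spreadsheet-based Monte Carlo model follows cell by cell.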

Advantages of Monte Carlo Simulation Over Other Techniques

Creating probabilistic analysis models with a Monte Carlo add-in for Microsoft Excel is typically a simple process, and unlike traditional deterministic modelling techniques, it generates a complete range of possible values.
Most analysts use single-point estimates (also known as mean values or most likely values), then perform a series of best- and worst-case scenario calculations using formulas to determine the impact of a specific variable on a project.

For example, an analyst might begin their calculations by setting the cost of building an energy plant as high as estimates indicate it will go, generate an output, and then work in increments to gradually define potential impacts of a project. Manually adjusting the parameters for each calculation allows for refinement of the outcomes, but it cannot produce a complete range of potential outcomes.

With Monte Carlo simulation, analysts can develop comprehensive risk analyses more quickly – analyses that project risk into the future to determine whether an investment is worth making. These analyses can also be adjusted to model many different types of risk or uncertainty including cost assessments across the life of a project.

@RISK – Monte Carlo Simulation in Excel Spreadsheets

With @RISK, project managers can build models and run thousands of simulations under different scenarios – allowing them to quickly model probabilities across a wide range of variables. Plus, the interface allows for rapid generation of graphics that help stakeholders visualize findings. Options include tornado graphs showing advanced sensitivity analysis, stress testing, scatter plots, and more.

Carmona notes that because @RISK integrates with Microsoft Excel, creating a probabilistic analysis is as simple as selecting any other Excel pre-programmed function – making the creation of models a straightforward process. By integrating these various uncertainties into a comprehensive @RISK model, project managers can perform Monte Carlo simulations, running thousands of iterations to assess a project's financial performance under different conditions and scenarios.

This approach provides project stakeholders with valuable insights into the range of possible outcomes, the probability of meeting certain financial targets, and the critical risk factors that may significantly impact the project’s success and objectives.

The Project – A Hybrid Energy Plant for a Manufacturing Facility

Carmona demonstrated how @RISK could be used to analyze uncertainties and costs for building a renewable power plant for a manufacturing facility. The model plant would utilize solar panels and wind turbines to generate energy and would need to reliably produce eight to 12 megawatt-hours (MWh) of energy per day.

For the purposes of this exercise, Carmona assumed that the plant was well-sited and that its solar panels and turbines were appropriately sized. The first question to answer was: based on probabilistic analysis, how much power would the plant usually generate in a given day?

Resource Uncertainty Assessment

To begin answering this question, it was necessary to develop models that incorporated different types of uncertainty. The analysis began with the solar plant and the variables that could affect its energy generation, such as solar irradiation (cloud cover) and ambient temperature.

Using power output data from the solar panel manufacturer and weather data for the city in which the plant was to be built (Madrid, Spain), Carmona used @RISK to generate a distribution curve for power output based on solar irradiation. On a completely cloudless day, the plant could be expected to produce 12–13 MWh of power during daylight hours. Given typical weather conditions at the site, what would the power output of the plant most likely be?

Slide from Carmona’s presentation showing the @RISK distribution for the estimation of solar output.

Carmona used the @RISK add-in to run a Monte Carlo simulation of a dynamic environment in which cloud cover changed throughout the day. Before running the simulation, he defined cloud cover with a normal distribution, which required some adjustment to ensure that the model did not generate cloud cover values greater than 100%.

Carmona’s simulation in @RISK of a dynamic environment with cloud cover.
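The webinar does not show the underlying worksheet formulas. As a rough illustration of the same idea, the Python sketch below draws cloud-cover values from a normal distribution and rejects samples outside the physically valid 0–100% range; the mean, standard deviation, clear-sky output, and cloud penalty factor are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_cloud_cover(n, mean=0.45, sd=0.25):
    """Sample cloud-cover fractions from a normal distribution,
    truncated (by resampling) to the physically valid range [0, 1]."""
    out = np.empty(n)
    filled = 0
    while filled < n:
        draws = rng.normal(mean, sd, size=n - filled)
        valid = draws[(draws >= 0.0) & (draws <= 1.0)]
        out[filled:filled + valid.size] = valid
        filled += valid.size
    return out

cover = sample_cloud_cover(10_000)
clear_sky_output = 12.5                                   # MWh on a cloudless day (assumed)
solar_output = clear_sky_output * (1.0 - 0.75 * cover)    # assumed cloud penalty factor

print(f"Max sampled cloud cover: {cover.max():.1%}")
print(f"Mean simulated solar output: {solar_output.mean():.2f} MWh")
```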

Cloud cover was not the only variable to account for, however. Temperature affects the power output of a solar cell as well: the higher the temperature, the lower the output. While a perfectly cloudless day should result in maximum power output, an exceptionally hot day can actually impair generation. The model therefore needed to account for temperature correction.

Carmona’s simulation in @RISK of a dynamic environment with cloud cover and temperature correction.
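The exact correction used in Carmona’s workbook is not shown in the case study. A common engineering approximation, sketched below, derates output linearly by a temperature coefficient, often around -0.4% per °C above the 25 °C rating point; both the coefficient and the example temperature here are assumptions.

```python
def temperature_corrected_output(raw_output_mwh, cell_temp_c,
                                 temp_coeff=-0.004, rated_temp_c=25.0):
    """Apply a linear temperature derating: output falls by |temp_coeff|
    per degree Celsius above the rated cell temperature."""
    factor = 1.0 + temp_coeff * (cell_temp_c - rated_temp_c)
    return raw_output_mwh * max(factor, 0.0)

# A hot afternoon: a cell temperature of 55 C reduces output by about 12%
print(temperature_corrected_output(12.5, 55.0))  # -> ~11.0 MWh
```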

Advanced Techniques and Strategies for Using Monte Carlo Simulation

Since @RISK utilizes the Latin hypercube method, a stratified sampling technique that cuts down on computer processing time, Carmona was able to quickly run 100,000 Monte Carlo iterations. A plot of the results was generated, producing a probability distribution curve for the daily production of electricity. The simulation projects that on 47.9% of days, energy production would be within the desired range of 11 to 13 MWh.

Probability distribution in @RISK showing the daily production of electricity.
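For readers curious how Latin hypercube sampling differs from plain random sampling, the sketch below uses SciPy’s generic implementation (not @RISK’s own engine) to show that stratified samples reproduce a target distribution with less noise for the same sample count; the distribution parameters are illustrative.

```python
import numpy as np
from scipy.stats import norm, qmc

n = 1_000
rng = np.random.default_rng(1)

# Plain Monte Carlo: independent uniform draws mapped through the inverse CDF
plain_u = rng.random(n)

# Latin hypercube: one draw per equal-probability stratum
lhs_u = qmc.LatinHypercube(d=1, seed=1).random(n).ravel()

# Map both to the same normal distribution (e.g., daily output in MWh)
plain = norm.ppf(plain_u, loc=10.0, scale=1.5)
lhs = norm.ppf(lhs_u, loc=10.0, scale=1.5)

# The stratified sample reproduces the target mean and spread with less noise
for name, x in [("plain", plain), ("latin hypercube", lhs)]:
    print(f"{name:16s} mean={x.mean():.3f} sd={x.std():.3f}")
```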

Probability of Equipment Performance and Reliability Analysis

The next variable to account for was equipment. Power generated by renewable means still needs to travel from where it is generated to the facility it powers, and sometimes external equipment, such as electrical transformers, may fail. There are also other types of failure events to consider, such as damage to solar panels from hailstorms, module soiling, and vegetation overgrowth. The next stage of the probabilistic analysis was to model these types of external failure events.

Carmona allowed for four failure events, occurring on average once every 250, 500, 750, and 1,000 days of operation, respectively. In every instance, a failure event means that the plant does not produce any power for the rest of the day.

@RISK simulation showing the four failure event scenarios.
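One simple way to represent such events, consistent with the description above but with the mechanics assumed rather than taken from Carmona’s workbook, is to treat each failure type as an independent daily Bernoulli draw whose probability is the reciprocal of its mean return period, and to zero out that day’s production when any of them fires:

```python
import numpy as np

rng = np.random.default_rng(7)

return_periods_days = [250, 500, 750, 1000]   # mean days between each failure type
daily_probs = [1.0 / p for p in return_periods_days]

def apply_failures(daily_output_mwh):
    """Zero out production for any simulated day on which at least one
    of the four independent failure events occurs."""
    n = daily_output_mwh.shape[0]
    any_failure = np.zeros(n, dtype=bool)
    for p in daily_probs:
        any_failure |= rng.random(n) < p
    return np.where(any_failure, 0.0, daily_output_mwh)

output = rng.normal(10.0, 1.5, size=100_000)   # assumed pre-failure daily output, MWh
adjusted = apply_failures(output)

# sum of the daily probabilities slightly overstates the rate because overlaps are ignored
print(f"Approximate expected failure frequency: {sum(daily_probs):.2%} of days")
print(f"Simulated failure frequency: {(adjusted == 0).mean():.2%} of days")
```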

Using @RISK, Carmona ran a second series of simulations that accounted for external failure events, then generated a new distribution graph to show how these new variables could further impact the probability that the plant would produce enough power on a given day. The red bars indicate the original simulation; the blue the second simulation. This second visualization clearly shows that failure events were likely to reduce the probability of the plant producing enough energy by 3%.

@RISK Plant Availability Comparison model showing that most days the solar plant would be able to produce around 9 MWh of power.

The model shows that on most days the solar plant would be able to produce around 9 MWh of power.

Modeling Combined Wind and Solar Energy Using Random Variables

So far, the probabilistic models indicated that the solar plant would not be able to produce enough electricity to meet demand on most days. However, adding wind generation would allow the factory to charge storage overnight when the plant was not producing.

Modeling the power outputs for the wind plants followed a similar process to modeling solar output. This process included gathering weather data about average wind speeds in the area where the plant was to be built and manufacturer data about how much power the turbines generate at a given wind speed. The chart below shows what percentage of output the wind turbines generate (Y-axis) given a specific wind speed (X-axis). Note that faster wind speeds will actually hinder power generation after a certain point, just as a day that is sunny but also very hot will reduce the effectiveness of solar panels.

Model showing the percentage of output the wind turbines generate given a specific wind speed.
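A power curve of the shape described, with output rising with wind speed, holding at rated power, and cutting out at very high speeds, can be sketched as a piecewise function. The cut-in, rated, and cut-out speeds below are typical illustrative values, not the manufacturer data used in the webinar.

```python
def turbine_output_fraction(wind_speed_ms,
                            cut_in=3.0, rated=12.0, cut_out=25.0):
    """Fraction of rated power produced at a given wind speed (m/s).

    Below cut-in and above cut-out the turbine produces nothing; between
    cut-in and rated speed output rises roughly with the cube of the speed;
    between rated and cut-out it stays at 100%."""
    if wind_speed_ms < cut_in or wind_speed_ms >= cut_out:
        return 0.0
    if wind_speed_ms >= rated:
        return 1.0
    return (wind_speed_ms**3 - cut_in**3) / (rated**3 - cut_in**3)

for v in (2, 6, 12, 20, 26):
    print(f"{v:>2} m/s -> {turbine_output_fraction(v):.0%} of rated power")
```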

After running simulations for the wind plant, Carmona was able to demonstrate that the combination of both generation methods – wind and solar – had a high probability of meeting electricity demand on most days.

Graph showing the power output and wind speed in addition to the combined output of solar and wind for 12 hours which totals 11.898 MWh.

That accounts for the actual power generation issues. What about the costs of operating and maintaining the plant?

Using @RISK Results to Determine Net Present Value (NPV)

Carmona decided to conduct an NPV analysis comparing the lifetime cost of operating the renewable energy plant with and without performance-monitoring software. With the software, the plant would produce approximately 6% more energy. Would licensing and operating the monitoring software result in actual savings over time?

Table showing the cost analysis of licensing and operating performance-monitoring software for the renewable energy plants.

The table below was used to generate a 13-year forecast that also accounted for estimated plant inspection and maintenance costs, which would take place every three to four years.

Table showing CAPEX cash flow for the estimated plant inspection and maintenance costs.

Then, Monte Carlo analysis was performed to generate an average NPV. This showed that the average NPV of the plant without the monitoring software would be €134,000, while the average NPV with the monitoring software would be approximately €169,000 over the same period.

Graph and simulation models showing NPV of the plant with and without the monitoring software.

The result: running the plant with monitoring software would yield an average gain in NPV of approximately €35,300.
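The cash-flow assumptions behind those figures live in Carmona’s workbook and are not reproduced in the case study, but the structure of the comparison is straightforward: simulate annual energy revenue with and without the roughly 6% uplift from monitoring, subtract the software cost, discount over the forecast horizon, and compare average NPVs. The sketch below follows that structure with invented numbers.

```python
import numpy as np

rng = np.random.default_rng(3)
N, years, rate = 10_000, 13, 0.06

def simulate_npv(with_monitoring):
    npv = np.zeros(N)
    for t in range(1, years + 1):
        energy_mwh = rng.normal(3200, 300, size=N)   # assumed annual output, MWh
        if with_monitoring:
            energy_mwh *= 1.06                        # ~6% uplift from monitoring
        price = rng.uniform(60, 90, size=N)           # assumed EUR per MWh
        cash = energy_mwh * price
        if with_monitoring:
            cash -= 12_000                            # assumed annual license cost
        npv += cash / (1 + rate) ** t
    return npv

base, monitored = simulate_npv(False), simulate_npv(True)
print(f"Mean NPV uplift from monitoring: EUR {np.mean(monitored - base):,.0f}")
```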

What about the risks and costs involved with building the plant? Fortunately, DecisionTools Suite’s ScheduleRiskAnalysis allows project managers to assess time and cost uncertainty. The program can import details of projects that have been scheduled in either of two popular project management tools: Microsoft Project or Primavera P6 from Oracle. Project managers can use @RISK to import their project schedules into Excel and carry out Monte Carlo simulation to determine the impact of construction delays or cost overruns.

One Program, Endless Solutions

For renewable energy projects, @RISK empowers project managers and decision makers to make informed choices by generating Monte Carlo distributions in Excel. From determining power output to evaluating the value of investing in add-ons like monitoring software, @RISK can help you develop robust probabilistic analyses for a variety of risks and provide clear visualizations of results.

Get Started with @RISK in Microsoft Excel

Find out how you can generate better risk analyses for your renewable energy projects – or any other projects – within Microsoft Excel. Request a free trial of @RISK today. You can also watch the full webinar on-demand!

SOURCES

Energy Risk Modelling, Roy Nersesian.

Manuel Carmona, Edytraining Ltd.

To address complicated debt portfolio and capital investment problems, Dr. Emmanuel Donkor uses Lumivero software @RISK and RISKOptimizer. His research has led to new and better methods for addressing financial statistical problems, using @RISK.

When multiple sources of debt with different financing terms are available to the decision maker – for example, a ten-year loan with 5% interest and a 5-year loan with 10% interest – the environment is fraught with uncertainty. “The challenge is to determine an appropriate or optimal mix of these different debt sources that simultaneously maximizes project value and reduces the risk of default,” explains Dr. Emmanuel Donkor, a quantitative analyst and faculty member of the School of Engineering and Applied Sciences (SEAS) at George Washington University.

Dr. Donkor used Lumivero software tools @RISK and RISKOptimizer to conduct two separate research projects published in the Engineering Economist. The first tackled the improvement of debt portfolios for financing capital investment plans, while the other empirically tested stochastic dominance relationships in risky capital investments.

Diversifying Debt

In his first project published in the Engineering Economist, Donkor, along with Associate Professor Michael Duffey, addressed the question of how project promoters, who wish to use project finance as a procurement vehicle, can choose an optimal debt portfolio. The aim for these promoters is to develop a portfolio that maximizes project value but minimizes default risk when project cash flows are uncertain, and when debt with different financing terms can be sourced from multiple sources. For example, the promoter may face the previously-mentioned situation of multiple sources of debt with different financing terms—such as a ten-year loan with 5% interest, and a 5-year loan with 10% interest. “It is a difficult problem because the environment is fraught with uncertainty,” says Dr. Donkor.

To address this decision problem, Dr. Donkor and Dr. Duffey used @RISK and RISKOptimizer to help analyze and then recommend an appropriate mix of different debt instruments for financing a capital investment project. “RISKOptimizer allows the analyst to incorporate what’s known as probabilistic or chance constraints—this ensures that the risk of defaulting on loan payments, in any given period of the loan tenure is limited to say 5%,” says Dr. Donkor. They first developed a stochastic financial model in Excel, and used RISKOptimizer’s simulation optimization capability to select an optimal mix of fixed-rate debt instruments such that default occurred no more than 5% of the time. They then used @RISK simulation to evaluate the performance of the debt policy prescribed by the optimization model.
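Donkor and Duffey’s published model is not reproduced in this case study, but the idea of a chance constraint can be illustrated in a few lines: simulate cash flows for a candidate debt mix, compute the frequency with which debt service is not covered in each period, and reject mixes whose worst-period default probability exceeds the 5% tolerance. The loan mix, cash-flow distribution, and thresholds below are assumptions for illustration, not their data or code.

```python
import numpy as np

rng = np.random.default_rng(11)
N, periods = 20_000, 10

def annuity_payment(principal, rate, n_periods):
    """Level payment that amortizes a loan over n_periods at the given rate."""
    return principal * rate / (1 - (1 + rate) ** -n_periods)

def default_probabilities(share_long):
    """Per-period probability that simulated cash flow fails to cover debt
    service for a mix of a 10-year 5% loan and a 5-year 10% loan."""
    debt_total = 1_000_000.0
    pay_long = annuity_payment(share_long * debt_total, 0.05, 10)
    pay_short = annuity_payment((1 - share_long) * debt_total, 0.10, 5)
    probs = []
    for t in range(1, periods + 1):
        service = pay_long + (pay_short if t <= 5 else 0.0)
        cash = rng.lognormal(mean=np.log(300_000), sigma=0.3, size=N)  # assumed cash flow
        probs.append((cash < service).mean())
    return probs

for share in (0.4, 0.7, 1.0):
    worst = max(default_probabilities(share))
    verdict = "meets" if worst <= 0.05 else "violates"
    print(f"{share:.0%} long-term debt: worst-period default prob {worst:.1%} ({verdict} the 5% chance constraint)")
```

A simulation-optimization tool then searches over the mix automatically; the grid of three candidate shares above simply shows what the constraint is checking.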

This new approach improves on the traditional methods used in risk analysis in capital investment planning, in which analysts would “take the value of debt as given, and simulate it without taking into consideration the probability of default on debt service each year,” says Dr. Donkor. “Furthermore, analysts do not consider multiple sources of debt with different financing terms—you hardly ever see debt portfolios with different terms and interest rates—it’s usually homogenous, with debt having one tenure and one interest rate.” Dr. Donkor and Dr. Duffey’s new approach, which shows how to model chance constraints in a spreadsheet environment, and implement it with RISKOptimizer, has sparked interest in the field. The paper detailing their work has become one of the highest ranked articles in the Engineering Economist.

Figure 1: The profile of excess earnings after debt service, indicating that the chances of defaulting on debt service are well contained.

Figure 2: Plot of excess earnings after interest coverage.

"RISKOptimizer allows the analyst to incorporate what’s known as probabilistic or chance constraints. This ensures that the risk of defaulting on loan payments, in any given period of the loan tenure is limited."

Dr. Emmanuel Donkor
School of Engineering and Applied Sciences (SEAS), George Washington University

Ranking Opportunities

Dr. Donkor also addressed the problem that analysts face when they must choose between multiple, risky, mutually exclusive capital investments. He did this by creating a spreadsheet framework that uses @RISK to implement empirical tests of stochastic dominance—a term used in decision theory which describes the ranking of random prospects based on preferences regarding outcomes. As a result of Dr. Donkor’s work, analysts involved in comparing risky capital investments do not have to rely on qualitative and visually-based ‘best guesses’.

This solution benefits those who are faced with investment choices in which only one option can be selected. For example, a person owns a building, and has to decide whether to rent it out as residential apartments or as a factory. “You can’t do both at the same time,” says Dr. Donkor, “so you have to choose one option.”

Typically these kinds of opportunities are compared by using decision rules based on the mean-variance criterion (selecting portfolios based on the means and variances of their returns) or safety-first criterion (setting a minimum required return for a given level of risk). However, at times, the mean variance criterion and its variants result in an efficiency frontier in which more than one investment option offers maximal expected return for some given level of risk, and minimal risk for some given level of expected return. This can make it difficult to select only one option.

“The problem becomes complicated when you have opportunity A, which gives you the highest value, but it has a high risk, and opportunity B, which will give you lower value but a lower risk,” says Donkor. “As a decision maker, you want high value and low risk, but these qualities are not always enshrined in the same opportunity.” For such problems, stochastic dominance rules, typically implemented by visually inspecting the cumulative distribution functions (CDF) of the alternatives, are applied. However, for many practical applications, it is common for the distributions to cross tails, creating what’s known as the ‘tail problem’. In these circumstances, analysts apply what’s known as ‘almost stochastic dominance’ (ASD), which allows decision makers to ignore the crossing at the tails so that dominance inferences can be made.
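As a simplified picture of what an empirical stochastic dominance check involves (Dr. Donkor’s published framework is considerably richer), one can compare the empirical CDFs of two simulated NPV distributions on a common grid: option A first-order dominates option B if A’s CDF is never above B’s, meaning A offers at least as high a probability of exceeding every threshold. The distributions below are assumed for demonstration.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50_000

# Assumed NPV simulations for two mutually exclusive investments
npv_a = rng.normal(1_200_000, 400_000, size=n)   # higher mean, higher risk
npv_b = rng.normal(1_000_000, 150_000, size=n)   # lower mean, lower risk

def first_order_dominates(x, y, grid_points=500):
    """True if x first-order stochastically dominates y:
    F_x(t) <= F_y(t) for every threshold t on a common grid."""
    grid = np.linspace(min(x.min(), y.min()), max(x.max(), y.max()), grid_points)
    cdf_x = np.searchsorted(np.sort(x), grid, side="right") / x.size
    cdf_y = np.searchsorted(np.sort(y), grid, side="right") / y.size
    return bool(np.all(cdf_x <= cdf_y))

print("A dominates B:", first_order_dominates(npv_a, npv_b))
print("B dominates A:", first_order_dominates(npv_b, npv_a))
# Neither dominates here: the CDFs cross, which is exactly the situation where
# second-order or 'almost' stochastic dominance tests become necessary.
```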

These approaches are inexact and lack quantitative certainty; on top of these issues, Dr. Donkor says that most analysts do capital planning problems in Microsoft Excel, but are not able to make stochastic dominance inferences in that program. “A theory has been developed, but no one has turned that theory into a code in Excel where it can be used,” says Dr. Donkor. Thus, the majority of practitioners, researchers, and students who analyze alternative capital investment plans under uncertainty in Excel are limited to using either visual inspection or ASD without any empirical support.

Dr. Donkor has improved this process with his @RISK-enabled spreadsheet framework which empirically tests stochastic dominance. Now, instead of using a visual best guess, analysts can use an @RISK model to empirically test for the best option among many, allowing them to make empirically defensible decisions when comparing risky capital investments.

Originally published: June 16, 2021
Updated: June 7, 2024


Application

Modeling the frequency and magnitude of future debris flows to determine the optimum hazard mitigation strategy. Communicating risk to clients by displaying the probability of event paths for three decisions:

  1. Existing conditions
  2. Constructing a containment dam
  3. Relocating existing residences

Summary

Duncan Wyllie, a Principal of Wyllie & Norrish Rock Engineers, uses the Palisade software PrecisionTree for probabilistic modeling of debris flow protection measures.

When analyzing the optimum method of protecting an area at risk from debris flows, three decisions are compared – accepting existing conditions, constructing a containment dam with sufficient capacity to contain future flows, or relocating residences on the debris flow runout area. Creating probabilistic decision trees in PrecisionTree allows uncertainties in the frequency and magnitude of future debris flows to be analyzed, and for comparison of costs between constructing a dam and relocating the residences.

Background

Wyllie & Norrish Rock Engineers, with offices in Seattle and Vancouver, Canada, is a specialist engineering company working in the fields of landslides, tunnels, slopes, and foundations. Duncan Wyllie and Norman Norrish, the company principals, have a combined total of 80 years of experience in applied rock mechanics.

Since the 1990s, Wyllie and Norrish have been utilizing Palisade software to analyze natural hazards and select hazard mitigation procedures.

Using Palisade Products

When a potential debris flow hazard is located above a residential development, PrecisionTree can be used to create a probabilistic decision tree that maps out possible scenarios, the likelihood they will occur, and the estimated damage costs. Three decisions are compared – existing conditions, constructing a debris flow dam, or evacuating the debris flow runout area.


"If we use @RISK and PrecisionTree to present results, people can make rational decisions as to what structural protection to install."

Duncan Wyllie
Principal of Wyllie & Norrish Rock Engineers


Debris Flow Dam Decision Tree Example

With reference to the decision tree shown below, the components of the analysis are as follows:

For a closer look, download our free Debris Flow Containment Dam example model.

Analysis shows that the optimum decision is to construct a containment dam because the total cost of mitigation plus the expected cost (EV) of damage is lower for the dam construction (EVΣdam = $200,150) than for existing conditions (EVΣexisting = $360,000) or for relocating the houses (EVΣhouses = $2,000,600).
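The expected-value totals above come from the PrecisionTree model itself, but the arithmetic behind each branch is easy to restate: for every decision, add the upfront mitigation cost to the probability-weighted damage costs of the possible debris-flow outcomes, then pick the decision with the lowest total. The probabilities and damage costs in this sketch are placeholders chosen only to show the structure, not the values in Wyllie’s model.

```python
# Expected total cost of a decision = upfront mitigation cost
#   + sum(probability of outcome * damage cost of outcome)
# All probabilities and costs below are illustrative placeholders.

decisions = {
    "existing conditions": {
        "mitigation_cost": 0,
        "outcomes": [(0.10, 3_000_000), (0.90, 0)],   # (probability, damage cost)
    },
    "containment dam": {
        "mitigation_cost": 180_000,
        "outcomes": [(0.01, 2_000_000), (0.99, 0)],
    },
    "relocate residences": {
        "mitigation_cost": 2_000_000,
        "outcomes": [(1.00, 0)],
    },
}

for name, d in decisions.items():
    ev = d["mitigation_cost"] + sum(p * cost for p, cost in d["outcomes"])
    print(f"{name:20s} expected total cost: ${ev:,.0f}")

# The lowest expected total cost identifies the optimum decision,
# mirroring the EV comparison in the case study.
```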

Results

The use of PrecisionTree allows possible mitigation measures, along with the probability of event occurrence and cost, to be analyzed. The analysis unambiguously identifies the most cost-effective mitigation measure, and the decision process is clearly mapped out in the decision tree.

A Competitive Edge

The use of @RISK and PrecisionTree software to prepare decision trees modeling all potential outcomes enables Wyllie & Norrish Rock Engineers to quantitatively determine the optimum protection strategy and easily communicate the findings.

With Palisade’s products, Wyllie & Norrish Rock Engineers can:

By using probabilistic analysis, Wyllie & Norrish Rock Engineers ensure that the best decision is reached for each at-risk area and if necessary, effective debris flow dams are created to protect nearby structures.

Free Example Models

Download our free example model, Decision Trees in Geotechnical Engineering, to explore three decision tree examples from the geotechnical engineering field: debris flow containment dam, rock slope stabilization, and gravity dam reinforcement anchors.

Application

Modeling rock fall masses, trajectories, velocities, and energies to design rock fall protection structures. Communicating risk to clients by calculating impact energy probability so that the impact capacity of the protection structure can be matched to the consequence of an accident.

Summary

Duncan Wyllie, a Principal of Wyllie & Norrish Rock Engineers, uses @RISK and PrecisionTree for probabilistic modeling of rock fall hazards. The analyses incorporate uncertainty in values of the mass and velocity that are expressed as probability distributions, and Monte Carlo simulation in @RISK is used to calculate the probability distribution of the impact energy. The energy calculations are then used to design protection structures such as concrete sheds and wire rope fences with the appropriate impact energy capacity to suit the possible consequences of an accident. Decision analysis using PrecisionTree is then applied to determine the optimum mitigation strategy.

In the example, Wyllie explains how Palisade software is used to calculate potential rock fall impact energies and make recommendations for a fence to protect traffic at the base of a steep mountain slope.

Background

Wyllie & Norrish Rock Engineers, with offices in Seattle and Vancouver, Canada, is a specialist engineering company working in the fields of rock slopes, tunnels, blasting, foundations, landslides, and rock falls. Duncan Wyllie and Norman Norrish, the company principals, have a combined total of 80 years of experience in applied rock mechanics.

Since the 1990s, Wyllie and Norrish have been utilizing Palisade software to analyze rock fall risks and hazards, and design rock fall protection structures. They provide hazard mitigation services for rock falls in mountainous areas by identifying rock fall sources, modelling rock fall trajectories and energies, and designing customized protection structures. Projects have been undertaken for highways, railways, hydro-electric power plants, and residential developments.

Using Palisade Products

For most rock fall projects, very limited, or no, information is available from previous events. In these circumstances, uncertainty exists in the design parameters of rock fall frequency, mass, velocity and trajectory. These uncertainties can be quantified using @RISK to define probability distributions that account for the possible range of values, and the most likely values, based on judgement and experience.

Wyllie found that the BetaGeneral and Pert distributions incorporated in @RISK provide the optimum models for these conditions. Multiplying the mass distribution by the square of the velocity distribution (kinetic energy, E = ½mv²) gives an impact energy that is also defined by a probability distribution. This information can be used to design protection structures that reduce the hazard to an acceptable level of societal risk.
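The worksheet itself is not included in the case study; the sketch below illustrates the same workflow with placeholder numbers: encode judgement-based minimum, most-likely, and maximum estimates as PERT distributions (implemented here via the standard Beta-PERT construction rather than @RISK’s built-in function), propagate them through E = ½mv², and read off a high percentile as a candidate design energy.

```python
import numpy as np

rng = np.random.default_rng(9)

def pert(minimum, mode, maximum, size):
    """Sample from a (Beta-)PERT distribution, a common way to turn a
    minimum / most-likely / maximum judgement into a smooth distribution."""
    alpha = 1 + 4 * (mode - minimum) / (maximum - minimum)
    beta = 1 + 4 * (maximum - mode) / (maximum - minimum)
    return minimum + (maximum - minimum) * rng.beta(alpha, beta, size)

n = 100_000
mass_kg = pert(200, 1_500, 6_000, n)     # assumed rock mass range, kg
velocity_ms = pert(5, 15, 25, n)         # assumed impact velocity range, m/s

energy_kj = 0.5 * mass_kg * velocity_ms**2 / 1_000   # E = 1/2 m v^2, in kJ

print(f"Mean impact energy: {energy_kj.mean():.0f} kJ")
print(f"90th percentile (candidate design energy): {np.percentile(energy_kj, 90):.0f} kJ")
```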

Another component of risk management for rock fall projects is to implement decision analysis in which alternative courses of action, such as construction of a high strength containment fence or of a less expensive ditch, can be compared. This analysis can be carried out using PrecisionTree in which the sum of construction costs and expected value of an accident (i.e., the product of an accident cost and its probability) can be compared for each course of action. PrecisionTree allows rapid analysis of these alternatives and incorporates sensitivity analyses that show how uncertainty in values of the costs and probabilities influence the selection of the optimum action.

A particular value of analyses using @RISK and PrecisionTree is that it is possible to define low-risk but high-consequence events that have a low expected value, such as a large-scale landslide. Comparing such an event with more frequently occurring, but less costly, rock falls will show whether the optimum mitigation measure is to stabilize the landslide or contain the rock falls. The analyses often show that the residual risk posed by very rare events is acceptable.

"If we use @RISK and PrecisionTree to present results, people can make rational decisions as to what structural protection to install."

Duncan Wyllie
Principal of Wyllie & Norrish Rock Engineers

Rock Fall Modeling Example

When a rock fall source on a steep mountain slope was identified above a road, Wyllie used @RISK to calculate the probability distribution of the potential rock fall impact energies that would be used to design a protection structure. Because the site had no history of rock falls that could be used for design, @RISK was used to analyze the geology and empirical equations for velocity to develop BetaGeneral distributions for the mass and velocity, from which the distribution for the impact energy was calculated. These plots are shown below where the maximum, minimum, and mean values for mass, velocity, and energy are indicated.

These results were discussed with the owner to determine an appropriate design energy capacity for a fence to protect the road from rock falls.

“Because access to the road was restricted and traffic was infrequent, it was decided that a rock fall event with high energy but low probability was acceptable,” said Wyllie. “The design energy of 1250 kJ was selected such that about 90% of falls would be contained, with the understanding that the low probability of an event with an energy exceeding 1250 kJ was acceptable.”

The image below shows the installed Attenuator fence.

In comparison, if the fence was installed above a busy Interstate highway where the consequence of a high energy rock fall event could be severe, it is likely that the design energy would be about 2500 kJ to 3000 kJ to ensure that almost all rock falls would be contained.

“If we use @RISK and PrecisionTree to present results, people can make rational decisions as to what structural protection to install,” said Wyllie.

Results

Thanks to the probabilistic analysis conducted by Wyllie, the road has a fence in place that can withstand an impact energy of 1250 kJ and will contain about 90% of future rock falls. Those traveling on the road can have peace of mind knowing the hazard mitigation structure was designed through quantitative analysis.

A Competitive Edge

The use of @RISK and PrecisionTree software to prepare designs where many or all of the design parameters are uncertain allows Wyllie to quantitatively determine the best mitigation strategy.

With Palisade’s products, Wyllie & Norrish Rock Engineers can:

By using probabilistic analysis, Wyllie & Norrish Rock Engineers ensure that effective hazard mitigation structures are in place to protect people, facilities, and infrastructure.

Cynametrix, a financial analytics consulting firm based in Minnesota, uses the DecisionTools Suite to help integrate life insurance products into general financial analysis. Traditionally it has been difficult to evaluate life insurance within a broad financial portfolio; however, Palisade’s DecisionTools Suite solves this problem via Monte Carlo simulation and optimization, incorporating life insurance products along with all the other financial variables present in investment analysis. Thus, Cynametrix is able to demonstrate to clients precisely what the financial advantages of one decision are over another.

Background

Traditional financial analysis has focused heavily on various aspects of asset allocation decision and portfolio optimization. According to modern portfolio theory, adding an asset which has returns that are not tightly correlated to other assets in that portfolio can mitigate the risk of overall loss for that portfolio by reducing volatility. Thus, portfolio designers can optimize portfolios to yield either the highest expected return of a portfolio for a client’s given level of risk aversion, or the lowest level of risk for a client’s desired expected return.

Life insurance, or Insured Death Benefit (IDB), is a useful asset class for optimizing portfolios because its rate of return is driven primarily by when death occurs rather than by fluctuations in the capital markets, which means that IDB returns have a low correlation with those of other asset classes that depend on the stock market’s performance.

This idea is important not just for individuals and the assets they leave their families, but also for multi-life situations such as high-net worth families who pool their financial management, large charities such as colleges that have an insurable interest in their major donors, and businesses who must fund executive benefits such as deferred compensation plans.

Until recently, most analysts and scholars overlooked IDB as a usable product for a traditional investment portfolio. This was because IDB is difficult to compare to other traditional assets like securities or stocks, as IDB can manifest in two separate ways: either as cash value (the investment element of a policy that is paid to the owner if the policy is surrendered prior to death) or as the death benefit (the amount paid to the policy’s beneficiary at the insured’s death). “The problem is, when you evaluate an IDB it’s hard to compare it to securities,” says Robert Danielsen, CEO of Cynametrix. “What are you going to evaluate? The cash build-up or death benefit? Historically, insurance people have talked about the two characteristics separately.”

Optimizing portfolios with life insurance also presents a problem. Danielsen explains that, for portfolios with traditional financial products, analysts use the annual expected mean return and variance of an asset class to optimize the return or minimize portfolio instability. However, with life insurance products “there’s only one point of return—that’s the death of the insured, and we don’t know when that is—so how do you integrate that?”

A New Approach to Life Insurance

Danielsen tackled these problems with IDB by creating a proprietary mathematical process known as MAVsm (which stands for mortality-adjusted value). The process generates values and indexes for valuing the cash position of any life insurance product or strategy, incorporating mortality probability. Essentially, Danielsen conceived of the idea that, at any point in time, the future cash position of an IDB can only be one of three things:

Thus, Danielsen derives the following formula:

Where:
Using his proprietary MAVsm technique, Danielsen uses Monte Carlo simulation to determine which of these three options the IDB will be for any given year in the future. “The process necessarily requires Monte Carlo simulation—I don’t believe what we do with MAVsm can be done with a static formula—it’s all probability-based,” says Danielsen. “It’s a perfect example of what the Palisade products can do.”

Because there is no way to know for certain when the insured person will die, Danielsen incorporates mortality probability data into @RISK, and runs the model several thousand times. The MAVsm method also incorporates myriad other variables, including product issues, investment decision, and tax attributes—all of which requires a workbook with 12 worksheets with over 250,000 active cells to take all these variables into account. “There’s no way the human brain can see the relationships among all these variables, all of which are probabilistic,” says Danielsen. “That’s why Monte Carlo simulation is so necessary for this work.”

For example, Figure 1 is a histogram of the results of a Monte Carlo simulation of the age at death for a 54-year-old male non-smoker, using standard actuarial tables with certain adjustments. Because the age of death determines the MAVsm value at any point in the future, the probability of death at each age, as demonstrated in this simulation, can be combined with the other variables of the model to determine the statistical properties of the MAVsm value at any point in the future. These simulation results show that by year 20, for example, the probability is approximately 12 percent that the insured has died and that the death benefit has been received. They also show that there is roughly a 1% chance that the insured dies in that year. Finally, the likelihood that he has not yet died (and that the life insurance still exists as its cash value) is 88%.

Figure 1: Mortality Probability by Age at Death (Male, age 54, nonsmoker): Graph shows probability of death in Year 20 is only ~1%, and the probability of death by Year 20 is ~12%
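As a rough illustration of that kind of mortality simulation (not Danielsen’s MAVsm model or the actuarial table behind Figure 1), the sketch below simulates age at death from an assumed Gompertz-style mortality curve and classifies each iteration at a 20-year horizon into the three IDB states described above: death benefit already received, death occurring that year, or policy still held at cash value.

```python
import numpy as np

rng = np.random.default_rng(13)

def simulate_age_at_death(start_age, n, a=0.00002, b=0.085, max_age=120):
    """Simulate age at death year by year using a simple Gompertz-style
    annual mortality rate q(age) = a * exp(b * age); the parameters are
    illustrative, not an actuarial table."""
    ages = np.full(n, max_age)
    alive = np.ones(n, dtype=bool)
    for age in range(start_age, max_age):
        q = min(a * np.exp(b * age), 1.0)
        dies = alive & (rng.random(n) < q)
        ages[dies] = age
        alive &= ~dies
    return ages

start_age, horizon_year = 54, 20
ages = simulate_age_at_death(start_age, 100_000)
year_of_death = ages - start_age + 1          # policy year in which death occurs

died_before = (year_of_death < horizon_year).mean()    # death benefit already paid
dies_that_year = (year_of_death == horizon_year).mean()
still_alive = (year_of_death > horizon_year).mean()    # policy still held at cash value

print(f"By year {horizon_year}: already died {died_before:.1%}, "
      f"dies that year {dies_that_year:.1%}, still alive {still_alive:.1%}")
```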

Along with this IDB data, the final result of Danielsen’s MAVsm model also includes a distribution of thousands of possibilities for the securities portfolio for each year of the strategy. Using this projection of probable values allows Danielsen to see the overall risk-adjusted returns of the portfolio.

With this information in-hand, Danielsen can then test the effect of different allocations of the portfolio to IDB to determine whether IDB is improving the probable risk-adjusted returns of the entire portfolio. Using RISKOptimizer, he also determines the optimal allocation of the portfolio to IDB, as well as the optimal allocation among a number of potential insureds in a multi-life case.

This approach to incorporating life insurance into financial portfolios is unprecedented, says Danielsen. “No one else is doing this, to the best of my knowledge,” he says. “When we show it to clients, they say they’ve never seen anything like this.” The alternative approach only compares a selected number of scenarios, based on static variable assumptions. “There is no way to evaluate the relative probability of the selected scenarios, much less combine the probabilities of the many variables in order to see the interdependent effects among the variables,” says Danielsen. “This made much of the planning a shot in the dark.”

Thanks to Palisade software, Danielsen has been able to make planning as precise as possible. “Palisade products have really enhanced these optimizing methods for me, and helped me create a much more sophisticated product.”

As the public becomes more aware of environmental issues and global warming, consumers are asking more questions about the products they are buying, with the result that a growing number of companies are considering the move to green manufacturing processes and supply chains. However, 'going green' can be a complex transition from an operational perspective, and not all companies are convinced that improved environmental performance will lead to financial gains. "Many companies are concerned that changing their established processes and implementing a green supply chain could result in lower quality products, delayed shipments, or even a loss of business," explained Dr. Sachin K. Mangla of the Indian Institute of Technology Roorkee and Graphic Era University - Dehradun.

A company's supply chain includes everything from purchasing, planning and managing the use of materials, to shipping and distributing the final product. Any delay or disruption can cause the supply chain to break down, which in turn can have a catastrophic impact on the business. This makes a company's decision to transition to a green supply chain (GSC) daunting, as it needs to identify, understand and evaluate all potential risks and consequences before 'going green' at an operational level. "Supply chain management is a key area of focus for businesses wanting to modernize and optimize their profits," said Mangla. "We wanted to build a risk analysis model that would help the increasing number of manufacturing companies that might be considering the move to a green supply chain."

With its headquarters in Sendai City in Miyagi Prefecture, Hitachi Solutions East Japan Ltd. is responsible for data and risk analysis of supply chain management, production, sales and inventory planning and management. They are also responsible for research and development of analytical methods and the creation of new products (including new packages and cloud-based services). For Hitachi, risk management means the analysis and repetition of potential risk factors, identification of risk, prioritization, and follow up action and monitoring.

There are a wide variety of risks in the manufacturing industry, including fluctuating variables such as demand, price, and foreign exchange. For demand fluctuation, for example, the amount of supply (production and procurement) is decided based on demand forecasts. If the forecast is wrong and there is a supply shortage, then the company will miss opportunities to increase sales. If there is too much supply, then there are inventory or disposal costs. Price and currency fluctuations are also important factors for risk management in the manufacturing sector, and the improvement of forecast accuracy through risk analysis is essential to mitigate these risks.

"@RISK enabled us to not only predict the type of risks that could happen, but also anticipate what risks were most likely to happen. This enabled us to create a model that provides companies with visibility into the potential ecological-economic gains of a green supply chain, as well as recommendations to best manage the operational risks."

Dr. Sachin K. Mangla
Indian Institute of Technology Roorkee and Graphic Era University

A Multi-Stage Process

To create the green supply chain (GSC) model, Mangla and the team interviewed several senior managers, IT managers and supply chain professionals within plastic manufacturing to identify operational risks that could disrupt the supply chain. These risks included machine, equipment or facility failure, process design problems, lack of skilled labour, system or software failure, and green technology inadequacy. After generating time-based effects for each of these risks to help measure the potential delays and disruptions to the GSC, the team identified the consequences for each of these risks: delays in production (time), materials and labor (cost), brand impact, health and safety, and product quality.

Mangla and the team knew they would need to run Monte Carlo simulation to address the wide range of uncertainties associated with these risks and consequences. The team analysed the data using Palisade's @RISK software, using Triangular distributions to determine the minimum, maximum and 'most likely' potential disruptions to the GSC. "We also found @RISK's sensitivity analysis feature extremely useful, as it provided an easy and efficient way to determine the average potential disruption of materials and products for each of the identified risks," said Mangla.
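The study’s actual parameters are not given in the article; the sketch below shows the general shape of such a model, sampling a triangular disruption duration for each of the named risk categories and aggregating them, with all minimum, most-likely, and maximum values invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(21)
N = 10_000

# (min, most likely, max) disruption in days per occurrence -- placeholder values,
# attached to the risk categories named in the case study
risks = {
    "machine/equipment failure":   (0.5, 2.0, 10.0),
    "process design problems":     (1.0, 5.0, 20.0),
    "lack of skilled labour":      (0.5, 3.0, 15.0),
    "system/software failure":     (0.2, 1.0, 5.0),
    "green technology inadequacy": (1.0, 4.0, 30.0),
}

total_delay = np.zeros(N)
mean_delay_by_risk = {}
for name, (lo, mode, hi) in risks.items():
    delay = rng.triangular(lo, mode, hi, size=N)
    mean_delay_by_risk[name] = delay.mean()
    total_delay += delay

# A crude sensitivity view: which risk contributes most to the average delay
for name, mean_delay in sorted(mean_delay_by_risk.items(), key=lambda kv: -kv[1]):
    print(f"{name:28s} mean disruption {mean_delay:5.1f} days")
print(f"Mean total disruption across all risks: {total_delay.mean():.1f} days")
```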

The results generated a wide variety of possible risk scenarios, as well as associated probabilities. "@RISK enabled us to not only predict the type of risks that could happen, but also anticipate what risks were most likely to happen," said Mangla. "This enabled us to create a model that provides companies with visibility into the potential ecological-economic gains of a green supply chain, as well as recommendations to best manage the operational risks."

The world of financial markets and investments is rife with risk and reward. This is particularly true for hedge funds, sophisticated investment vehicles that are typically used only by experienced investors and firms. They are named for the fact that investors ‘hedge,’ or attempt to protect their funds against volatile swings in the markets. But how are these portfolios managed? And are managers protecting and optimizing these funds as well as they should be? Seth Berlin, Principal Strategist at Performance Thinking & Technologies, a company that helps asset managers with investment operations, recently tackled these questions when working with a hedge fund client. With the help of @RISK and RISKOptimizer software, Berlin was able to create a quantitative model that uses simulation to optimize a hedge fund’s portfolio.

Finding the Sweet Spot

When managing a hedge fund portfolio, Berlin says, there has to be a balance between maximizing profit and minimizing risk. Modern portfolio management is based on the capital asset pricing model (CAPM). In CAPM, “You theoretically can maximize return by maximizing risk.” While taking on high risk can lead to high returns over the long term, “over the short term you would have large swings in your portfolio,” says Berlin. “Everyone can maximize profits but in the end you’re dead,” meaning some level of short-term stability is necessary.

The key, Berlin says, is to find an optimal portfolio that “maximizes return with the lowest incremental change in risk,” a sweet spot that avoids exorbitant risk while still yielding satisfactory returns. In his model, Berlin was simulating changes in capital markets assumptions and adjusting a mix of hedges to identify an optimal portfolio.

The Long and the Short of It

Berlin’s client dealt with a hedge fund portfolio comprising both long and short positions. Long positions are assets that are owned by the fund, such as stocks, which investors hope will increase in value. Short positions are ‘borrowed assets’ sold short, which increase in value if the price of the underlying assets goes down. Hedge funds use combinations of long and short positions to protect against equity market movements, currency rate changes, interest rate risk, and credit worthiness. “I’ve worked with many hedge fund managers on how they manage hedges. Anytime you can move to more quantitative than qualitative management based on simulation and optimization, it’s a win for both managers and investors.”
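Berlin’s model itself is proprietary, but the basic mechanic of comparing long/short mixes by simulation can be sketched briefly: draw correlated returns for a long book and a short hedge book, then examine how the mean, volatility, and downside of the combined portfolio change as the hedge weight varies. All return and correlation parameters below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(17)
N = 50_000

# Assumed weekly returns: a long equity book and a short hedge book that is
# negatively correlated with it (placeholder parameters, not client data)
mean = np.array([0.0030, -0.0005])
cov = np.array([[0.02**2, -0.7 * 0.02 * 0.015],
                [-0.7 * 0.02 * 0.015, 0.015**2]])
returns = rng.multivariate_normal(mean, cov, size=N)   # columns: long book, short book

for short_weight in (0.0, 0.2, 0.4, 0.6):
    mix = (1 - short_weight) * returns[:, 0] + short_weight * returns[:, 1]
    print(f"short weight {short_weight:.0%}: mean {mix.mean():.3%}, "
          f"weekly volatility {mix.std():.3%}, 5% VaR {np.percentile(mix, 5):.3%}")
```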

"If I have fifty things I want to allocate money to, and I have choices on how to allocate it, you never know the right mix. RISKOptimizer and @RISK shuffle the deck a million times, and let you see how all those different combinations could play out."

Seth Berlin
Principal Strategist, Performance Thinking & Technologies

Turning an Art into Science

Berlin used @RISK and RISKOptimizer to prove to his clients that quantitative measures of simulation could help identify an optimal portfolio, rather than the typical qualitative approach. He used a step-wise process to develop his quantitative model:

Simulations and Smarts

After going through this five-step process, Berlin had results that indicated how the hedge fund managers could change their profitability by altering the mix of long and short positions in the portfolio. “RISKOptimizer gave me an optimal mix of my short positions for the next week. Since this is theoretical, you then compare your results with what really happened in the next week,” he says.

These results are helpful, but of course, they are not the only answer. “I’m not an alchemist. I’m not coming up with a secret sauce that is the model for managers to make money,” says Berlin. “What I was trying to do overall is to move from a qualitative to a quantitative way to judge a problem.” He explains that managing a hedge fund will always require qualitative decisions-- “that’s what experience is”--but that well done simulations provide a method to test one’s intuition and judgment.

An Underused Tool

The method Berlin used has broad applications and could be adopted by anyone hedging a portfolio. Yet the majority of investors don’t apply such rigorous tests. “I think everyone would say they’re doing some kind of modeling, but for the most part it doesn’t involve simulation and optimization.” Berlin believes turning to a tool like RISKOptimizer can give some certainty and structure to the process. “Based on your capital market inputs, RISKOptimizer helps you look at different mixes of assets to reach a better risk-return structure,” he says. “If I have fifty things I want to allocate money to, and I have choices on how to allocate it, you never know the right mix. RISKOptimizer and @RISK shuffle the deck a million times, and let you see how all those different combinations could play out.”
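The sketch below shows, in plain Python, what "shuffling the deck" amounts to: generate many random candidate allocations, evaluate each one against simulated market scenarios, and keep the one with the best risk-return structure. It is not RISKOptimizer's actual genetic algorithm, and all of the capital-market inputs are invented placeholders.

```python
import numpy as np

rng = np.random.default_rng(seed=7)

n_assets, n_scenarios, n_candidates = 10, 5_000, 2_000

# Illustrative capital-market assumptions: expected returns and volatilities.
exp_ret = rng.uniform(0.02, 0.12, n_assets)
vol     = rng.uniform(0.05, 0.30, n_assets)

# Simulated market scenarios for every asset.
scenarios = rng.normal(exp_ret, vol, size=(n_scenarios, n_assets))

best_score, best_w = -np.inf, None
for _ in range(n_candidates):                 # "shuffle the deck"
    w = rng.dirichlet(np.ones(n_assets))      # random long-only allocation summing to 1
    port = scenarios @ w
    # Risk-adjusted score: mean return plus half the 5th-percentile outcome,
    # which penalizes allocations with deep downside scenarios.
    score = port.mean() + 0.5 * np.percentile(port, 5)
    if score > best_score:
        best_score, best_w = score, w

print(np.round(best_w, 3))
```

A genetic optimizer such as RISKOptimizer improves on this brute-force search by breeding new candidates from the best ones found so far, rather than drawing them blindly.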

With Enterprise Resource Planning a nearly universal but continually evolving business practice, consultant Don Mettler fills a vital role for the large manufacturing companies that are his clients. He specializes in integrating software systems to help these companies make the most efficient use of their plant and material resources. For the past fifteen years he has worked with companies in businesses that range from pharmaceuticals to aerospace and automotive to high-tech hardware--and Palisade’s Evolver, @RISK, and the @RISK Developer Kit have been essential to much of this work.

To create the models that guide company efficiencies, Mettler needs to link the company’s backend database, business intelligence software, and, sometimes, legacy systems to Palisade’s tools. His goal in developing a custom system is to build an optimization model that he can eventually turn over to his client to manage. “That’s one reason I like to use the Palisade tools. They run in Excel, and clients can easily understand the results.”

A Typical Manufacturing Example

For one recent assignment, Mettler developed a model to optimize the production and minimize inventory costs of corrugated cardboard boxes for fresh produce. Demand for the product was highly volatile, while plant capacity and the number of manufacturing machines were fixed. Ideally, inventory levels would respond to ordering information, and his model would have to account for thousands of inputs from the company’s ERP and business intelligence software.

In response to these challenges, Mettler first built a prototype model using RISKOptimizer and later used @RISK and Evolver to build the final model installed on his client’s system. He used @RISK to forecast sales data and Evolver to optimize the model outputs for production. The optimization results were stored in the company’s database and could be re-accessed by his model during a simulation run.
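A rough Python sketch of that simulate-then-optimize pattern follows. Mettler's actual model used @RISK distributions and Evolver's genetic algorithm inside Excel, fed by the company's ERP data; every number below is a hypothetical placeholder. The sketch draws volatile weekly demand from a distribution, then searches the feasible production levels for the one that minimizes expected inventory-holding and shortage costs within a fixed capacity.

```python
import numpy as np

rng = np.random.default_rng(seed=3)

# Illustrative assumptions: lognormal weekly demand, fixed plant capacity,
# and per-unit costs for carrying unsold boxes vs. missing an order.
capacity      = 12_000
hold_cost     = 0.15    # cost per unsold box carried in inventory
shortage_cost = 0.60    # cost per box of unmet demand
demand = rng.lognormal(mean=np.log(9_000), sigma=0.35, size=20_000)

def expected_cost(production):
    leftover = np.maximum(production - demand, 0.0)
    shortage = np.maximum(demand - production, 0.0)
    return (hold_cost * leftover + shortage_cost * shortage).mean()

# Simple search over feasible production levels
# (Evolver would explore this space with a genetic algorithm instead).
candidates = np.arange(0, capacity + 1, 250)
costs = [expected_cost(q) for q in candidates]
best_q = candidates[int(np.argmin(costs))]
print(best_q, round(min(costs), 2))
```

The real model repeats this kind of evaluation across thousands of products and weeks, which is why the optimization results are written back to the company's database rather than recomputed by hand.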

"Every manufacturing company should use scheduling optimization. You have a finite set of resources, and you want to maximize gain while minimizing the cost of handling those resources. It’s a problem just made for genetic programming and Evolver."

Don Mettler
Consultant

Real-Time Optimization for Real-Time Decisions

Mettler’s model ran--and, under the supervision of plant managers, is still running--every night on a fifty-two-week schedule, continually adjusting box production and inventory to daily orders. The company now manufactures only the boxes it will ship that week and has reduced the costs associated with inventory.

“Every manufacturing company should use scheduling optimization,” says Mettler. “You have a finite set of resources, and you want to maximize gain while minimizing the cost of handling those resources. It’s a problem just made for genetic programming and Evolver.”

Metallurgical giant Met-Mex Peñoles uses the DecisionTools Suite for Six Sigma Design of Experiments. Because silver and gold are so expensive, simulation-based process optimization lets analysts test innovations without costly trial runs.

DecisionTools Suite in Six Sigma Design of Experiments

Because of the costliness of its raw materials, the metallurgical giant Met-Mex Peñoles, the world’s largest refiner of silver and Mexico’s largest refiner of gold, tries to avoid expensive pilot projects. To cut down on the number of trial runs, the company simulates the refining process by using the DecisionTools Suite in Six Sigma Design of Experiments. This allows the company to work on process optimization and sacrifice a minimum of gold and silver to its experiments.

According to Ignacio Quijas, technology manager for Peñoles, “When you are working with silver and gold, pilot projects to test innovations in the manufacturing process are very costly—and risky. Using @RISK to simulate changes in process design allows us to answer some difficult questions without actually running trials of the process.”

To offer some perspective on the exacting standards Peñoles must meet, Ignacio points out that, for instance, a 100-ounce silver bar must weigh at least 100 ounces—however, the price of the bar does not increase if the bar weighs slightly more than the specification. The additional silver is simply passed along to the customer free of charge and becomes a production cost.

Each step in the manufacturing processes for gold and silver value-added products creates room for additional error, and Peñoles optimizes its process by reducing the variability of those errors across the manufacturing steps. To build its Six Sigma simulation, Peñoles inputs the physical measurements of the errors and also feeds into the model the specifications and tolerances of its manufacturing equipment, the different physical operations, random processing errors, and pinpoint-precise cost analyses. “We are measuring the amount of gold and silver that turns up when we do every operation and can result in losses,” Ignacio reports.
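A minimal Monte Carlo sketch of that tolerance-stacking idea is shown below, in Python rather than the @RISK/Excel model Peñoles actually uses; the per-step error figures and fill targets are invented. Each manufacturing step contributes a small random error, and the simulation exposes the tradeoff between bars falling below the 100-ounce specification and silver given away for free.

```python
import numpy as np

rng = np.random.default_rng(seed=11)

spec = 100.0                        # minimum bar weight in ounces
step_sigmas = [0.05, 0.08, 0.03]    # hypothetical per-step processing errors (oz)
n_bars = 100_000

def simulate(fill_target):
    # Total error is the sum of independent errors across manufacturing steps.
    errors = sum(rng.normal(0.0, s, n_bars) for s in step_sigmas)
    weights = fill_target + errors
    below_spec = (weights < spec).mean()               # fraction of rejected bars
    giveaway = np.maximum(weights - spec, 0.0).mean()  # average free silver per bar
    return below_spec, giveaway

for target in (100.0, 100.2, 100.4):
    p, g = simulate(target)
    print(f"target {target:.1f} oz: {p:.2%} below spec, {g:.3f} oz average giveaway")
```

Reducing the variability of any single step narrows the total error distribution, which in turn lets the fill target move closer to the specification without increasing rejects, which is exactly the lever the Six Sigma simulation is designed to quantify.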

"When you are working with silver and gold, pilot projects to test innovations in the manufacturing process are very costly—and risky. Using @RISK to simulate changes in process design allows us to answer some difficult questions without actually running trials of the process."Ignacio Quijas
Technology Manager, Met-Mex Peñoles

His primary simulation tool is @RISK. “The functionality gives you so much more vision.” But because of the need for precision in his simulations, Ignacio also makes extensive use of @RISK’s distribution fitting and TopRank. He likes their capabilities for graphical representation, which he uses to explain the intricacies of his simulations to colleagues.

“The ability to communicate aspects of the simulation is important,” he says, “because these very detailed models serve the same function as early trial runs.”

Peñoles does still rely on pilot projects, he acknowledges, but only after DecisionTools has accounted for every speck of silver and gold that might end up as production loss or cost-increasing recycled material.
