To address complicated debt portfolio and capital investment problems, Dr. Emmanuel Donkor uses the Lumivero software @RISK and RISKOptimizer. His research has led to new and better methods for addressing financial statistical problems using @RISK.

When multiple sources of debt with different financing terms are available to the decision maker—for example, a ten-year loan at 5% interest and a five-year loan at 10% interest—the environment is fraught with uncertainty. “The challenge is to determine an appropriate or optimal mix of these different debt sources that simultaneously maximizes project value and reduces the risk of default,” explains Dr. Emmanuel Donkor, a quantitative analyst and faculty member of the School of Engineering and Applied Sciences (SEAS) at George Washington University. To address these complicated debt portfolio and capital investment problems, Donkor uses the Lumivero (formerly Palisade) software @RISK and RISKOptimizer, and his research has led to new and better methods for addressing financial statistical problems.

Dr. Donkor used the Lumivero software tools @RISK and RISKOptimizer to conduct two separate research projects published in The Engineering Economist. The first tackled the improvement of debt portfolios for financing capital investment plans, while the other empirically tested stochastic dominance relationships in risky capital investments.

Diversifying Debt

In his first project published in The Engineering Economist, Donkor, along with Associate Professor Michael Duffey, addressed the question of how project promoters who wish to use project finance as a procurement vehicle can choose an optimal debt portfolio. The aim for these promoters is to develop a portfolio that maximizes project value but minimizes default risk when project cash flows are uncertain and debt with different financing terms is available from multiple sources. For example, the promoter may face the situation mentioned above: a ten-year loan at 5% interest and a five-year loan at 10% interest. “It is a difficult problem because the environment is fraught with uncertainty,” says Dr. Donkor.

To address this decision problem, Dr. Donkor and Dr. Duffey used @RISK and RISKOptimizer to help analyze and then recommend an appropriate mix of different debt instruments for financing a capital investment project. “RISKOptimizer allows the analyst to incorporate what’s known as probabilistic or chance constraints—this ensures that the risk of defaulting on loan payments, in any given period of the loan tenure, is limited to, say, 5%,” says Dr. Donkor. They first developed a stochastic financial model in Excel and used RISKOptimizer’s simulation optimization capability to select an optimal mix of fixed-rate debt instruments such that default occurred no more than 5% of the time. They then used @RISK simulation to evaluate the performance of the debt policy prescribed by the optimization model.
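To see the shape of this approach outside of Excel, here is a minimal sketch of a chance-constrained debt mix search: Monte Carlo cash flows, a 5% limit on the probability of missing debt service in any year, and a brute-force search over the split between the two loans from the example above. The loan sizes, cash-flow process, and search grid are illustrative assumptions rather than figures from Donkor and Duffey's model, and the grid search merely stands in for RISKOptimizer's genetic-algorithm optimization.

```python
import numpy as np

rng = np.random.default_rng(0)

DEBT_NEEDED = 10_000_000   # hypothetical capital to be financed with debt
N_SIMS = 5_000
HORIZON = 10               # years

def annuity_payment(principal, rate, years):
    """Level annual payment that amortizes `principal` over `years` at `rate`."""
    return principal * rate / (1 - (1 + rate) ** -years)

def evaluate(share_long):
    """Mean total excess cash and P(default in any year) for a given debt mix.

    share_long is the fraction financed with the ten-year, 5% loan; the rest
    uses the five-year, 10% loan.  Operating cash flow is an assumed lognormal
    process -- all numbers are illustrative only."""
    pay_long = annuity_payment(share_long * DEBT_NEEDED, 0.05, 10)
    pay_short = annuity_payment((1 - share_long) * DEBT_NEEDED, 0.10, 5)
    service = np.array([pay_long + (pay_short if t < 5 else 0.0)
                        for t in range(HORIZON)])

    cash = rng.lognormal(mean=np.log(2.6e6), sigma=0.25, size=(N_SIMS, HORIZON))
    excess = cash - service
    p_default = (excess < 0).any(axis=1).mean()   # missed debt service in any year
    return excess.sum(axis=1).mean(), p_default

# Brute-force search over the mix, keeping only mixes that satisfy the chance
# constraint P(default) <= 5%; RISKOptimizer performs this kind of search with
# a genetic algorithm inside the spreadsheet model.
best = None
for share in np.linspace(0.0, 1.0, 21):
    value, p_default = evaluate(share)
    if p_default <= 0.05 and (best is None or value > best[1]):
        best = (share, value, p_default)

print("best share of ten-year debt (share, mean excess, P(default)):", best)
```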

This new approach improves on the traditional methods used in risk analysis in capital investment planning, in which analysts would “take the value of debt as given, and simulate it without taking into consideration the probability of default on debt service each year,” says Dr. Donkor. “Furthermore, analysts do not consider multiple sources of debt with different financing terms—you hardly ever see debt portfolios with different terms and interest rates—it’s usually homogeneous, with debt having one tenure and one interest rate.” Dr. Donkor and Dr. Duffey’s new approach, which shows how to model chance constraints in a spreadsheet environment and implement them with RISKOptimizer, has sparked interest in the field. The paper detailing their work has become one of the highest-ranked articles in The Engineering Economist.

Figure 1: The profile of excess earnings after debt service, indicating that the chances of defaulting on debt service are well contained.

Figure 2: Plot of excess earnings after interest coverage.

"RISKOptimizer allows the analyst to incorporate what’s known as probabilistic or chance constraints. This ensures that the risk of defaulting on loan payments, in any given period of the loan tenure is limited."Dr. Emmanuel Donkor
School of Engineering and Applied Sciences (SEAS), George Washington University

Ranking Opportunities

Dr. Donkor also addressed the problem that analysts face when they must choose between multiple, risky, mutually exclusive capital investments. He did this by creating a spreadsheet framework that uses @RISK to implement empirical tests of stochastic dominance—a term used in decision theory which describes the ranking of random prospects based on preferences regarding outcomes. As a result of Dr. Donkor’s work, analysts involved in comparing risky capital investments do not have to rely on qualitative and visually-based ‘best guesses’.

This solution benefits those who are faced with investment choices in which only one option can be selected. For example, a person owns a building, and has to decide whether to rent it out as residential apartments or as a factory. “You can’t do both at the same time,” says Dr. Donkor, “so you have to choose one option.”

Typically, these kinds of opportunities are compared using decision rules based on the mean-variance criterion (selecting portfolios based on the means and variances of their returns) or the safety-first criterion (setting a minimum required return for a given level of risk). However, at times the mean-variance criterion and its variants result in an efficiency frontier in which more than one investment option offers maximal expected return for some given level of risk, and minimal risk for some given level of expected return. This can make it difficult to select only one option.

“The problem becomes complicated when you have opportunity A, which gives you the highest value, but it has a high risk, and opportunity B, which will give you lower value but a lower risk,” says Donkor. “As a decision maker, you want high value and low risk, but these qualities are not always enshrined in the same opportunity.” For such problems, stochastic dominance rules, typically implemented by visually inspecting the cumulative distribution functions (CDFs) of the alternatives, are applied. However, in many practical applications it is common for the distributions to cross at the tails, creating what’s known as the ‘tail problem’. In these circumstances, analysts apply what’s known as ‘almost stochastic dominance’ (ASD), which allows decision makers to ignore the crossing at the tails so that dominance inferences can be made.
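For readers who want to see what such a comparison looks like mechanically, the sketch below checks the first- and second-order stochastic dominance conditions on two simulated NPV distributions using their empirical CDFs. The distributions are invented for illustration, and this simple condition check is not Dr. Donkor's published test procedure, which adds the statistical machinery needed to make the inference empirically defensible.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated NPV outcomes for two mutually exclusive projects.  These are
# invented illustrative distributions, not output from an actual @RISK model:
# A has the higher mean but also the higher risk.
npv_a = rng.normal(5.0, 2.0, 10_000)
npv_b = rng.normal(4.0, 1.0, 10_000)

def ecdf(sample, grid):
    """Empirical CDF of `sample` evaluated on `grid`."""
    return np.searchsorted(np.sort(sample), grid, side="right") / sample.size

grid = np.linspace(min(npv_a.min(), npv_b.min()),
                   max(npv_a.max(), npv_b.max()), 500)
F_a, F_b = ecdf(npv_a, grid), ecdf(npv_b, grid)

# First-order stochastic dominance: A dominates B if F_A(x) <= F_B(x) for all x.
fosd = {"A over B": bool(np.all(F_a <= F_b)), "B over A": bool(np.all(F_b <= F_a))}

# Second-order dominance compares running integrals of the CDFs, which is the
# criterion relevant to risk-averse decision makers.
dx = grid[1] - grid[0]
I_a, I_b = np.cumsum(F_a) * dx, np.cumsum(F_b) * dx
sosd = {"A over B": bool(np.all(I_a <= I_b)), "B over A": bool(np.all(I_b <= I_a))}

print("FOSD:", fosd)
print("SOSD:", sosd)   # with these inputs neither dominates -- the 'tail problem'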

These approaches are inexact and lack quantitative certainty; on top of these issues, Dr. Donkor says that most analysts do capital planning problems in Microsoft Excel, but are not able to make stochastic dominance inferences in that program. “A theory has been developed, but no one has turned that theory into a code in Excel where it can be used,” says Dr. Donkor. Thus, the majority of practitioners, researchers, and students who analyze alternative capital investment plans under uncertainty in Excel are limited to using either visual inspection or ASD without any empirical support.

Dr. Donkor has improved this process with his @RISK-enabled spreadsheet framework which empirically tests stochastic dominance. Now, instead of using a visual best guess, analysts can use an @RISK model to empirically test for the best option among many, allowing them to make empirically defensible decisions when comparing risky capital investments.

Originally published: June 16, 2021
Updated: June 7, 2024

Application

Modeling the frequency and magnitude of future debris flows to determine the optimum hazard mitigation strategy. Communicating risk to clients by displaying the probability of event paths for three decisions:

  1. Existing conditions
  2. Constructing a containment dam
  3. Relocating existing residences

Summary

Duncan Wyllie, a Principal of Wyllie & Norrish Rock Engineers, uses the Palisade software PrecisionTree for probabilistic modeling of debris flow protection measures.

When analyzing the optimum method of protecting an area at risk from debris flows, three decisions are compared – accepting existing conditions, constructing a containment dam with sufficient capacity to contain future flows, or relocating residences on the debris flow runout area. Creating probabilistic decision trees in PrecisionTree allows uncertainties in the frequency and magnitude of future debris flows to be analyzed, and for comparison of costs between constructing a dam and relocating the residences.

Background

Wyllie & Norrish Rock Engineers, with offices in Seattle and Vancouver, Canada, is a specialist engineering company working in the fields of landslides, tunnels, slopes, and foundations. Duncan Wyllie and Norman Norrish, the company principals, have a combined total of 80 years of experience in applied rock mechanics.

Since the 1990s, Wyllie and Norrish have been utilizing Palisade software to analyze natural hazards and select hazard mitigation procedures.

Using Palisade Products

When a potential debris flow hazard is located above a residential development, PrecisionTree can be used to create a probabilistic decision tree that maps out possible scenarios, the likelihood they will occur, and the estimated damage costs. Three decisions are compared – existing conditions, constructing a debris flow dam, or evacuating the debris flow runout area.


"If we use @RISK and PrecisionTree to present results, people can make rational decisions as to what structural protection to install."

Duncan Wyllie
Principal of Wyllie & Norrish Rock Engineers


Debris Flow Dam Decision Tree Example

The components of the analysis are laid out in the decision tree, which is included in the example model below.

For a closer look, download our free Debris Flow Containment Dam example model.

Analysis shows that the optimum decision is to construct a containment dam, because the total cost of mitigation plus the expected cost (EV) of damage is lower for dam construction (total EV = $200,150) than for existing conditions (total EV = $360,000) or for relocating the houses (total EV = $2,000,600).
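The arithmetic behind that comparison is simply mitigation cost plus probability-weighted damage for each branch. The sketch below reproduces the published totals using invented event probabilities and damage costs; the actual branch values are in the downloadable example model.

```python
# Hypothetical branch probabilities and damage costs, chosen only so the totals
# match the published figures; the real values are in Wyllie's example model.
alternatives = {
    "existing conditions": {"mitigation": 0,         "p_flow": 0.30, "damage": 1_200_000},
    "containment dam":     {"mitigation": 200_000,   "p_flow": 0.30, "damage": 500},
    "relocate residences": {"mitigation": 2_000_000, "p_flow": 0.30, "damage": 2_000},
}

def total_expected_cost(opt):
    """Mitigation cost plus expected value of debris-flow damage."""
    return opt["mitigation"] + opt["p_flow"] * opt["damage"]

for name, opt in alternatives.items():
    print(f"{name:>20}: ${total_expected_cost(opt):,.0f}")

best = min(alternatives, key=lambda k: total_expected_cost(alternatives[k]))
print("optimum decision:", best)
```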

Results

The use of PrecisionTree allows possible mitigation measures, along with the probability of event occurrence and cost, to be analyzed. The analysis unambiguously identifies the most cost-effective mitigation measure, and the decision process is clearly mapped out in the decision tree.

A Competitive Edge

The use of @RISK and PrecisionTree software to prepare decision trees modeling all potential outcomes enables Wyllie & Norrish Rock Engineers to quantitatively determine the optimum protection strategy and easily communicate the findings.

By using probabilistic analysis with Palisade’s products, Wyllie & Norrish Rock Engineers ensure that the best decision is reached for each at-risk area and, if necessary, effective debris flow dams are created to protect nearby structures.

Free Example Models

Download our free example model, Decision Trees in Geotechnical Engineering, to explore three decision tree examples from the geotechnical engineering field: debris flow containment dam, rock slope stabilization, and gravity dam reinforcement anchors.

Application

Modeling rock fall masses, trajectories, velocities, and energies to design rock fall protection structures. Communicating risk to clients by calculating impact energy probability so that the impact capacity of the protection structure can be matched to the consequence of an accident.

Summary

Duncan Wyllie, a Principal of Wyllie & Norrish Rock Engineers, uses @RISK and PrecisionTree for probabilistic modeling of rock fall hazards. The analyses incorporate uncertainty in values of the mass and velocity that are expressed as probability distributions, and Monte Carlo simulation in @RISK is used to calculate the probability distribution of the impact energy. The energy calculations are then used to design protection structures such as concrete sheds and wire rope fences with the appropriate impact energy capacity to suit the possible consequences of an accident. Decision analysis using PrecisionTree is then applied to determine the optimum mitigation strategy.

In the example, Wyllie explains how Palisade software is used to calculate potential rock fall impact energies and make recommendations for a fence to protect traffic at the base of a steep mountain slope.

Background

Wyllie & Norrish Rock Engineers, with offices in Seattle and Vancouver, Canada, is a specialist engineering company working in the fields of rock slopes, tunnels, blasting, foundations, landslides, and rock falls. Duncan Wyllie and Norman Norrish, the company principals, have a combined total of 80 years of experience in applied rock mechanics.

Since the 1990s, Wyllie and Norrish have been utilizing Palisade software to analyze rock fall risks and hazards, and design rock fall protection structures. They provide hazard mitigation services for rock falls in mountainous areas by identifying rock fall sources, modelling rock fall trajectories and energies, and designing customized protection structures. Projects have been undertaken for highways, railways, hydro-electric power plants, and residential developments.

Using Palisade Products

For most rock fall projects, very limited, or no, information is available from previous events. In these circumstances, uncertainty exists in the design parameters of rock fall frequency, mass, velocity and trajectory. These uncertainties can be quantified using @RISK to define probability distributions that account for the possible range of values, and the most likely values, based on judgement and experience.

Wyllie found that the BetaGeneral and Pert distributions incorporated in @RISK provide the optimum models for these conditions. Multiplying the mass distribution by the square of the velocity distribution (kinetic energy E = ½mv²) gives an impact energy that is also defined by a probability distribution. This information can be used to design protection structures that reduce the hazard to an acceptable level of societal risk.
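As a rough illustration of that calculation, the sketch below draws mass and velocity from PERT distributions (implemented as rescaled Beta distributions, the same family @RISK's Pert function is based on), computes the kinetic-energy distribution, and reads off a design percentile. The parameter values are invented for illustration and are not from a Wyllie & Norrish design.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100_000

def pert(minimum, mode, maximum, size):
    """Sample a PERT distribution (a rescaled Beta with lambda = 4)."""
    a = 1 + 4 * (mode - minimum) / (maximum - minimum)
    b = 1 + 4 * (maximum - mode) / (maximum - minimum)
    return minimum + (maximum - minimum) * rng.beta(a, b, size)

mass = pert(500, 2_000, 8_000, N)       # block mass, kg (illustrative)
velocity = pert(10, 20, 30, N)          # impact velocity, m/s (illustrative)

energy_kj = 0.5 * mass * velocity ** 2 / 1_000   # kinetic energy E = 1/2 m v^2, in kJ

# A fence rated at the 90th percentile would contain about 90% of simulated falls.
design_energy = np.percentile(energy_kj, 90)
print(f"mean impact energy = {energy_kj.mean():8.0f} kJ")
print(f"90th percentile    = {design_energy:8.0f} kJ")
```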

Another component of risk management for rock fall projects is to implement decision analysis in which alternative courses of action, such as construction of a high-strength containment fence or of a less expensive ditch, can be compared. This analysis can be carried out using PrecisionTree, in which the sum of construction costs and the expected value of an accident (i.e., the product of an accident cost and its probability) can be compared for each course of action. PrecisionTree allows rapid analysis of these alternatives and incorporates sensitivity analyses that show how uncertainty in the values of the costs and probabilities influences the selection of the optimum action.

A particular value of analyses using @RISK and PrecisionTree is that it is possible to evaluate low-probability but high-consequence events that have a low expected value, such as a large-scale landslide. Comparison of this event with more frequently occurring, but less costly, rock falls will show whether the optimum mitigation measure is to stabilize the landslide or contain the rock falls. The analyses often show that very rare events are acceptable.

"If we use @RISK and PrecisionTree to present results, people can make rational decisions as to what structural protection to install."Duncan Wyllie
Principal of Wyllie & Norrish Rock Engineer

Rock Fall Modeling Example

When a rock fall source on a steep mountain slope was identified above a road, Wyllie used @RISK to calculate the probability distribution of the potential rock fall impact energies that would be used to design a protection structure. Because the site had no history of rock falls that could be used for design, @RISK was used to analyze the geology and empirical equations for velocity to develop BetaGeneral distributions for the mass and velocity, from which the distribution for the impact energy was calculated. These plots are shown below where the maximum, minimum, and mean values for mass, velocity, and energy are indicated.

These results were discussed with the owner to determine an appropriate design energy capacity for a fence to protect the road from rock falls.

“Because access to the road was restricted and traffic was infrequent, it was decided that a rock fall event with high energy but low probability was acceptable,” said Wyllie. “The design energy of 1250 kJ was selected such that about 90% of falls would be contained, with the understanding that the low probability of an event with an energy exceeding 1250 kJ was acceptable.”

The image below shows the installed Attenuator fence.

In comparison, if the fence was installed above a busy Interstate highway where the consequence of a high energy rock fall event could be severe, it is likely that the design energy would be about 2500 kJ to 3000 kJ to ensure that almost all rock falls would be contained.

“If we use @RISK and PrecisionTree to present results, people can make rational decisions as to what structural protection to install,” said Wyllie.

Results

Thanks to the probabilistic analysis conducted by Wyllie, the road has a fence in place that can withstand an impact energy of 1250 kJ and will contain about 90% of future rock falls. Those traveling on the road can have peace of mind knowing the hazard mitigation structure was designed through quantitative analysis.

A Competitive Edge

The use of @RISK and PrecisionTree software to prepare designs where many or all of the design parameters are uncertain allows Wyllie to quantitatively determine the best mitigation strategy.

By using probabilistic analysis with Palisade’s products, Wyllie & Norrish Rock Engineers ensure that effective hazard mitigation structures are in place to protect people, facilities, and infrastructure.

Cynametrix, a financial analytics consulting firm based in Minnesota, uses the DecisionTools Suite to help it integrate life insurance products into general financial analysis. Traditionally it has been difficult to evaluate life insurance within a broad financial portfolio; however, Palisade’s DecisionTools Suite solves this problem via Monte Carlo simulation and optimization, incorporating life insurance products along with all the other financial variables present in investment analysis. Thus, Cynametrix is able to demonstrate to clients precisely what the financial advantages of one decision are over another.

Background

Traditional financial analysis has focused heavily on various aspects of asset allocation decisions and portfolio optimization. According to modern portfolio theory, adding an asset whose returns are not tightly correlated with the other assets in a portfolio can mitigate the risk of overall loss for that portfolio by reducing volatility. Thus, portfolio designers can optimize portfolios to yield either the highest expected return for a client’s given level of risk aversion, or the lowest level of risk for a client’s desired expected return.
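The diversification effect behind this reasoning can be seen in the standard two-asset volatility formula. The sketch below uses illustrative weights, volatilities, and correlations rather than figures from Cynametrix's analysis: the lower the correlation of the added asset, the lower the volatility of the combined portfolio.

```python
import numpy as np

def portfolio_vol(w1, sigma1, sigma2, rho):
    """Volatility of a two-asset portfolio with weights (w1, 1 - w1)."""
    w2 = 1 - w1
    var = (w1 * sigma1) ** 2 + (w2 * sigma2) ** 2 + 2 * w1 * w2 * rho * sigma1 * sigma2
    return np.sqrt(var)

# Illustrative numbers only: 70% in an equity portfolio with 18% volatility and
# 30% in a second asset with 10% volatility.
for rho in (0.9, 0.5, 0.0):
    print(f"correlation {rho:3.1f}: portfolio volatility = "
          f"{portfolio_vol(0.7, 0.18, 0.10, rho):.1%}")
```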

Life insurance, or the Insured Death Benefit (IDB), is a useful asset class for portfolio optimization because its rate of return is driven primarily by when death occurs rather than by fluctuations in the capital markets, which means that IDB returns have a low correlation with those of other asset classes that depend on the stock market’s performance.

This idea is important not just for individuals and the assets they leave their families, but also for multi-life situations such as high-net worth families who pool their financial management, large charities such as colleges that have an insurable interest in their major donors, and businesses who must fund executive benefits such as deferred compensation plans.

Until recently, most analysts and scholars overlooked IDB as a usable product for a traditional investment portfolio. This was because IDB is difficult to compare to other traditional assets like securities or stocks, as IDB can manifest in two separate ways: either as cash value (the investment element of a policy that is paid to the owner if the policy is surrendered prior to death) or as the death benefit (the amount paid to the policy’s beneficiary at the insured’s death). “The problem is, when you evaluate an IDB it’s hard to compare it to securities,” says Robert Danielsen, CEO of Cynametrix. “What are you going to evaluate? The cash build-up or death benefit? Historically, insurance people have talked about the two characteristics separately.”

Optimizing portfolios with life insurance also presents a problem. Danielsen explains that, for portfolios with traditional financial products, analysts use the annual expected mean return and variance of an asset class to optimize the return or minimize portfolio instability. However, with life insurance products “there’s only one point of return—that’s the death of the insured, and we don’t know when that is—so how do you integrate that?”

A New Approach to Life Insurance

Danielsen tackled these problems with IDB by creating a proprietary mathematical process known as MAV℠ (mortality-adjusted value). The process generates values and indexes for valuing the cash position of any life insurance product or strategy, incorporating mortality probability. Essentially, Danielsen conceived of the idea that, at any point in time, the future cash position of an IDB can only be one of three things:

  1. The death benefit, already received because the insured died in an earlier period
  2. The death benefit, received in that period because the insured dies then
  3. The policy’s cash value, because the insured is still alive

Using his proprietary MAV℠ technique, Danielsen uses Monte Carlo simulation to determine which of these three values the IDB will take in any given year in the future. “The process necessarily requires Monte Carlo simulation—I don’t believe what we do with MAV℠ can be done with a static formula—it’s all probability-based,” says Danielsen. “It’s a perfect example of what the Palisade products can do.”

Because there is no way to know for certain when the insured person will die, Danielsen incorporates mortality probability data into @RISK and runs the model several thousand times. The MAV℠ method also incorporates myriad other variables, including product issues, investment decisions, and tax attributes—taking all of these variables into account requires a workbook of 12 worksheets with over 250,000 active cells. “There’s no way the human brain can see the relationships among all these variables, all of which are probabilistic,” says Danielsen. “That’s why Monte Carlo simulation is so necessary for this work.”

For example, Figure 1 is a histogram of the results of a Monte Carlo simulation of the age at death for a 54-year-old male non-smoker, using standard actuarial tables with certain adjustments. Because the age of death determines the MAV℠ value at any point in the future, the probability of death at each age, as demonstrated in this simulation, can be combined with the other variables of the model to determine the statistical properties of the MAV℠ value at any point in the future. These simulation results show that by year 20, for example, the probability is approximately 12 percent that the insured has died and that the death benefit has been received. They also show that there is about a 1% chance that the insured dies in that year. Finally, the likelihood that he has not yet died (and that the life insurance still exists as its cash value) is 88%.

Figure 1: Mortality Probability by Age at Death (Male, age 54, nonsmoker): Graph shows probability of death in Year 20 is only ~1%, and the probability of death by Year 20 is ~12%
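A simplified version of that mortality simulation is sketched below: sample an age at death from a set of annual mortality rates and tabulate the three possible states at year 20. The mortality curve here is a crude illustrative one, not the adjusted actuarial table Danielsen uses, so the percentages will not match Figure 1 exactly.

```python
import numpy as np

rng = np.random.default_rng(3)
N_SIMS = 100_000
CURRENT_AGE = 54

# Illustrative annual mortality rates (a crude Gompertz-style curve), standing
# in for the adjusted actuarial table used in the actual MAV model.
ages = np.arange(CURRENT_AGE, 121)
q = np.clip(0.0005 * np.exp(0.085 * (ages - 30)), 0.0, 1.0)   # P(death in each year of age)

# Discrete distribution of age at death implied by the annual rates.
survival_to_start = np.concatenate(([1.0], np.cumprod(1 - q)[:-1]))
pmf = survival_to_start * q
pmf[-1] += 1.0 - pmf.sum()          # lump any residual survival into the last age

age_at_death = rng.choice(ages, size=N_SIMS, p=pmf)

YEAR = 20
died_earlier = (age_at_death < CURRENT_AGE + YEAR).mean()
dies_that_year = (age_at_death == CURRENT_AGE + YEAR).mean()
still_alive = 1.0 - died_earlier - dies_that_year

print(f"P(death benefit already received before year {YEAR}) = {died_earlier:.1%}")
print(f"P(insured dies in year {YEAR})                        = {dies_that_year:.1%}")
print(f"P(still alive; IDB held at its cash value)            = {still_alive:.1%}")
```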

Along with this IDB data, the final result of Danielsen’s MAV℠ model also includes a distribution of thousands of possibilities for the securities portfolio for each year of the strategy. Using this projection of probable values allows Danielsen to see the overall risk-adjusted returns of the portfolio.

With this information in-hand, Danielsen can then test the effect of different allocations of the portfolio to IDB to determine whether IDB is improving the probable risk-adjusted returns of the entire portfolio. Using RISKOptimizer, he also determines the optimal allocation of the portfolio to IDB, as well as the optimal allocation among a number of potential insureds in a multi-life case.

This approach to incorporating life insurance into financial portfolios is unprecedented, says Danielsen. “No one else is doing this, to the best of my knowledge,” he says. “When we show it to clients, they say they’ve never seen anything like this.” The alternative approach only compares a selected number of scenarios, based on static variable assumptions. “There is no way to evaluate the relative probability of the selected scenarios, much less combine the probabilities of the many variables in order to see the interdependent effects among the variables,” says Danielsen. “This made much of the planning a shot in the dark.”

Thanks to Palisade software, Danielsen has been able to make planning as precise as possible. “Palisade products have really enhanced these optimizing methods for me, and helped me create a much more sophisticated product.”

As the public becomes more aware of environmental issues and global warming, consumers are asking more questions about the products they are buying, with the result that a growing number of companies are considering the move to green manufacturing processes and supply chains. However, 'going green' can be a complex transition from an operational perspective, and not all companies are convinced that improved environmental performance will lead to financial gains. "Many companies are concerned that changing their established processes and implementing a green supply chain could result in lower quality products, delayed shipments, or even a loss of business," explained Dr. Sachin K. Mangla of the Indian Institute of Technology Roorkee and Graphic Era University - Dehradun.

A company's supply chain includes everything from purchasing, planning and managing the use of materials, to shipping and distributing the final product. Any delay or disruption can cause the supply chain to break down, which in turn can have a catastrophic impact on the business. This makes a company's decision to transition to a green supply chain (GSC) daunting, as it needs to identify, understand and evaluate all potential risks and consequences before 'going green' at an operational level. "Supply chain management is a key area of focus for businesses wanting to modernize and optimize their profits," said Mangla. "We wanted to build a risk analysis model that would help the increasing number of manufacturing companies that might be considering the move to a green supply chain."

With its headquarters in Sendai City in Miyagi Prefecture, Hitachi Solutions East Japan Ltd. is responsible for data and risk analysis of supply chain management, production, sales and inventory planning and management. They are also responsible for research and development of analytical methods and the creation of new products (including new packages and cloud-based services). For Hitachi, risk management means the analysis and repetition of potential risk factors, identification of risk, prioritization, and follow up action and monitoring.

There are a wide variety of risks in the manufacturing industry, including fluctuating variables such as demand, price, and foreign exchange. For demand fluctuation, for example, the amount of supply (production and procurement) is decided based on demand forecasts. If the forecast is wrong and there is a supply shortage, then the company will miss opportunities to increase sales. If there is too much supply, then there are inventory or disposal costs. Price and currency fluctuations are also important factors for risk management in the manufacturing sector, and the improvement of forecast accuracy through risk analysis is essential to mitigate these risks.

"@RISK enabled us to not only predict the type of risks that could happen, but also anticipate what risks were most likely to happen. This enabled us to create a model that provides companies with visibility into the potential ecological-economic gains of a green supply chain, as well as recommendations to best manage the operational risks."

Dr. Sachin K. Mangla
Indian Institute of Technology Roorkee and Graphic Era University

A Multi-Stage Process

To create the green supply chain (GSC) model, Mangla and the team interviewed several senior managers, IT managers and supply chain professionals within plastic manufacturing to identify operational risks that could disrupt the supply chain. These risks included machine, equipment or facility failure, process design problems, lack of skilled labour, system or software failure, and green technology inadequacy. After generating time-based effects for each of these risks to help measure the potential delays and disruptions to the GSC, the team identified the consequences for each of these risks: delays in production (time), materials and labor (cost), brand impact, health and safety, and product quality.

Mangla and the team knew they would need to run Monte Carlo simulation to address the wide range of uncertainties associated with these risks and consequences. The team analysed the data using Palisade's @RISK software, using Triangular distributions to determine the minimum, maximum and 'most likely' potential disruptions to the GSC. "We also found @RISK's sensitivity analysis feature extremely useful, as it provided an easy and efficient way to determine the average potential disruption of materials and products for each of the identified risks," said Mangla.
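A stripped-down version of that analysis might look like the sketch below: triangular draws for each operational risk, a total-disruption output, and rank-by-correlation as a crude stand-in for @RISK's sensitivity (tornado) analysis. The minimum, most likely, and maximum values are invented placeholders, not the estimates elicited from the plastic-manufacturing managers.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 50_000

# Hypothetical triangular estimates (min, most likely, max) of disruption, in
# days, for each operational risk identified in the study.
risks = {
    "machine/equipment failure":   (0.5, 2.0, 6.0),
    "process design problems":     (0.0, 1.0, 4.0),
    "lack of skilled labour":      (0.0, 1.5, 5.0),
    "system/software failure":     (0.0, 0.5, 3.0),
    "green technology inadequacy": (0.5, 2.5, 8.0),
}

samples = {name: rng.triangular(*params, size=N) for name, params in risks.items()}
total_disruption = sum(samples.values())

print(f"mean total disruption: {total_disruption.mean():.1f} days")
print(f"95th percentile:       {np.percentile(total_disruption, 95):.1f} days")

# Rank risks by their correlation with total disruption -- a rough analogue of
# @RISK's sensitivity analysis.
for name, s in sorted(samples.items(),
                      key=lambda kv: -np.corrcoef(kv[1], total_disruption)[0, 1]):
    r = np.corrcoef(s, total_disruption)[0, 1]
    print(f"{name:>28}: r = {r:.2f}")
```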

The results generated a wide variety of possible risk scenarios, as well as associated probabilities. "@RISK enabled us to not only predict the type of risks that could happen, but also anticipate what risks were most likely to happen," said Mangla. "This enabled us to create a model that provides companies with visibility into the potential ecological-economic gains of a green supply chain, as well as recommendations to best manage the operational risks."

The world of financial markets and investments is rife with risk and reward. This is particularly true for hedge funds, sophisticated investment vehicles that are typically used only by experienced investors and firms. They are named for the fact that investors ‘hedge,’ or attempt to protect their funds against volatile swings in the markets. But how are these portfolios managed? And are managers protecting and optimizing these funds as well as they should be? Seth Berlin, Principal Strategist at Performance Thinking & Technologies, a company that helps asset managers with investment operations, recently tackled these questions when working with a hedge fund client. With the help of Palisade’s @RISK and RISKOptimizer software, Berlin was able to create a quantitative model that uses simulation to optimize a hedge fund’s portfolio.

Finding the Sweet Spot

When managing a hedge fund portfolio, Berlin says, there has to be a balance between maximizing profit and minimizing risk. Modern portfolio management is based on the capital asset pricing model (CAPM). In CAPM, “You theoretically can maximize return by maximizing risk.” While taking on high risk can lead to high returns over the long term, “over the short term you would have large swings in your portfolio,” says Berlin. “Everyone can maximize profits but in the end you’re dead,” meaning some level of short-term stability is necessary.

The key, Berlin says, is to find an optimal portfolio that “maximizes return with the lowest incremental change in risk,” a sweet spot that avoids exorbitant risk while still yielding satisfactory returns. In his model, Berlin was simulating changes in capital markets assumptions and adjusting a mix of hedges to identify an optimal portfolio.
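The sketch below illustrates that trade-off in miniature: simulate joint returns for a long book and a negatively correlated hedge, sweep the hedge weight, and look for the mix with the best return per unit of risk. The return and covariance assumptions are invented for illustration and bear no relation to Berlin's client portfolio or his actual model.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 20_000

# Illustrative weekly return assumptions: a long equity book and a hedge that is
# negatively correlated with it but carries a small negative expected return.
mean = np.array([0.0030, -0.0005])
cov = np.array([[0.020 ** 2,           -0.6 * 0.020 * 0.012],
                [-0.6 * 0.020 * 0.012,  0.012 ** 2]])
returns = rng.multivariate_normal(mean, cov, size=N)

results = []
for hedge_weight in np.linspace(0.0, 1.0, 21):
    w = np.array([1.0, hedge_weight])      # hedge sized against the long book
    port = returns @ w
    results.append((hedge_weight, port.mean(), port.std()))

# The "sweet spot": the mix giving the most return per unit of risk.
best = max(results, key=lambda r: r[1] / r[2])
for hw, mu, sd in results:
    tag = "   <-- best return/risk" if (hw, mu, sd) == best else ""
    print(f"hedge weight {hw:4.2f}: mean {mu: .4f}, risk {sd:.4f}{tag}")
```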

The Long and the Short of It

Berlin’s client dealt with a hedge fund portfolio comprising both long and short positions. Long positions are assets that are owned by the fund, such as stocks, which investors hope will increase in value. Short positions are ‘borrowed assets’ sold short, which increase in value if the price of the underlying assets goes down. Hedge funds use combinations of long and short positions to protect against equity market movements, currency rate changes, interest rate risk, and creditworthiness. “I’ve worked with many hedge fund managers on how they manage hedges. Anytime you can move to more quantitative than qualitative management based on simulation and optimization, it’s a win for both managers and investors.”

"If I have fifty things I want to allocate money to, and I have choices on how to allocate it, you never know the right mix. RISKOptimizer and @RISK shuffle the deck a million times, and let you see how all those different combinations could play out."

Seth Berlin
Principal Strategist, Performance Thinking & Technologies

Turning an Art into Science

Berlin used Palisade’s risk analysis software, including @RISK and RISKOptimizer, to prove to his clients that quantitative simulation could help identify an optimal portfolio, rather than the typical qualitative approach. He developed his quantitative model through a five-step process.

Simulations and Smarts

After going through this five-step process, Berlin had results that indicated how the hedge fund managers could change their profitability by altering the mix of long and short positions in the portfolio. “RISKOptimizer gave me an optimal mix of my short positions for the next week. Since this is theoretical, you then compare your results with what really happened in the next week,” he says.

These results are helpful, but of course, they are not the only answer. “I’m not an alchemist. I’m not coming up with a secret sauce that is the model for managers to make money,” says Berlin. “What I was trying to do overall is to move from a qualitative to a quantitative way to judge a problem.” He explains that managing a hedge fund will always require qualitative decisions-- “that’s what experience is”--but that well done simulations provide a method to test one’s intuition and judgment.

An Underused Tool

This method Berlin used has broad applications, and could be used by anyone hedging a portfolio. Yet the majority of investors don’t apply such rigorous tests. “I think everyone would say they’re doing some kind of modeling, but for the most part it doesn’t involve simulation and optimization.” Berlin believes turning to a tool like RISKOptimizer can give some certainty and structure to the process. “Based on your capital market inputs, RISKOptimizer helps you look at different mixes of assets to reach a better risk-return structure,” he says. “If I have fifty things I want to allocate money to, and I have choices on how to allocate it, you never know the right mix. RISKOptimizer and @RISK shuffle the deck a million times, and let you see how all those different combinations could play out.”

With Enterprise Resource Planning a nearly universal but continually evolving business model, consultant Don Mettler fulfills a vital role for the large manufacturing companies that are his clients. He specializes in integrating software systems to help these companies make the most efficient use of their plant and material resources. For the past fifteen years he has worked with companies in businesses that range from pharmaceuticals to aerospace and automotive to high-tech hardware--and Palisade’s Evolver, @RISK, and the @RISK Developer Kit have been essential to much of this work.

To create the models that guide company efficiencies, Mettler needs to link the company’s backend database, business intelligence software, and, sometimes, legacy systems to Palisade’s tools. His goal in developing a custom system is to build an optimization model that he can eventually turn over to his client to manage. “That’s one reason I like to use the Palisade tools. They run in Excel, and clients can easily understand the results.”

A Typical Manufacturing Example

For one recent assignment, Mettler developed a model to optimize the production and minimize inventory costs of corrugated cardboard boxes for fresh produce. Demand for the product was highly volatile, while plant capacity and the number of manufacturing machines were fixed. Ideally, inventory levels would respond to ordering information, and his model would have to account for thousands of inputs from the company’s ERP and business intelligence software.

In response to these challenges, Mettler first built a prototype model using RISKOptimizer and later used @RISK and Evolver to build the final model installed on his client’s system. He used @RISK to forecast sales data and Evolver to optimize the model outputs for production. The optimization results were stored in the company’s database and could be re-accessed by his model during a simulation run.
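A toy version of that kind of model is sketched below: simulated weekly demand, a fixed weekly production level as the single decision variable, and a grid search standing in for Evolver's genetic-algorithm optimization. The demand parameters and cost rates are illustrative assumptions, not values from Mettler's client system.

```python
import numpy as np

rng = np.random.default_rng(6)
N_SIMS, WEEKS = 2_000, 52

HOLDING_COST = 0.10    # per box left in inventory per week (illustrative)
SHORTAGE_COST = 1.50   # per box of unmet demand (illustrative)

# Volatile weekly demand for one box type -- an assumed lognormal process
# standing in for forecasts built from the company's ERP data.
demand = rng.lognormal(mean=np.log(10_000), sigma=0.35, size=(N_SIMS, WEEKS))

def expected_cost(weekly_production):
    """Average holding + shortage cost over all simulated demand paths."""
    inventory = np.zeros(N_SIMS)
    cost = np.zeros(N_SIMS)
    for t in range(WEEKS):
        inventory += weekly_production - demand[:, t]
        shortage = np.clip(-inventory, 0, None)
        cost += HOLDING_COST * np.clip(inventory, 0, None) + SHORTAGE_COST * shortage
        inventory = np.clip(inventory, 0, None)   # lost sales, no backorders
    return cost.mean()

# Grid search stands in for Evolver's genetic-algorithm optimization.
levels = np.arange(9_000, 13_001, 250)
best = int(min(levels, key=expected_cost))
print(f"best fixed weekly production: {best:,} boxes "
      f"(expected annual cost ${expected_cost(best):,.0f})")
```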

"Every manufacturing company should use scheduling optimization. You have a finite set of resources, and you want to maximize gain while minimizing the cost of handling those resources. It’s a problem just made for genetic programming and Evolver."

Don Mettler
Consultant

Real-Time Optimization for Real-Time Decisions

Mettler’s model ran--and under the supervision of plant managers is still running--every night on a fifty-two week schedule, continually adjusting box production and inventory to daily orders. The company is manufacturing only boxes it will ship that week and has reduced the costs associated with inventory.

“Every manufacturing company should use scheduling optimization,” says Mettler. “You have a finite set of resources, and you want to maximize gain while minimizing the cost of handling those resources. It’s a problem just made for genetic programming and Evolver.”

Metallurgical giant Met-Mex Peñoles uses the DecisionTools Suite for Six Sigma Design of Experiments. Because silver and gold are so expensive, process optimization allows analysts to test innovations, avoiding costly trial runs.

DecisionTools Suite in Six Sigma Design of Experiments

Because of the costliness of its raw materials, the metallurgical giant Met-Mex Peñoles, the world’s largest refiner of silver and Mexico’s largest refiner of gold, tries to avoid expensive pilot projects. To cut down on the number of trial runs, the company simulates the refining process by using the DecisionTools Suite in Six Sigma Design of Experiments. This allows the company to work on process optimization and sacrifice a minimum of gold and silver to its experiments.

According to Ignacio Quijas, technology manager for Peñoles, “When you are working with silver and gold, pilot projects to test innovations in the manufacturing process are very costly—and risky. Using @RISK to simulate changes in process design allows us to answer some difficult questions without actually running trials of the process.”

To offer some perspective on the exacting standards Peñoles must meet, Ignacio points out that, for instance, a 100-ounce silver bar must weigh at least 100 ounces—however, the price of the bar does not increase if the bar weighs slightly more than the specification. The additional silver is simply passed along free to the customer and is a production cost.

Each step in the manufacturing processes for gold and silver value-added products creates room for additional error, and the way Peñoles optimizes its process is to reduce the variability of the errors across the manufacturing steps. To build its Six Sigma simulation, Peñoles inputs the physical measurements of the errors and also feeds into the model the specifications and tolerances of its manufacturing equipment, different physical operations, random processing errors, and cost analyses that are pinpoint precise. “We are measuring the amount of gold and silver that turns up when we do every operation and can result in losses,” Ignacio reports.
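The sketch below illustrates the underlying idea with invented numbers: stack up independent normal errors from a few manufacturing steps, check the chance a bar falls below the 100-ounce specification, and see how reducing step variability lets the process aim closer to spec and give away less silver. None of the tolerances are Peñoles figures.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 200_000
SPEC_OZ = 100.0   # a 100-ounce bar must weigh at least 100 oz

def giveaway(step_sds, target_offset):
    """Mean silver given away per bar (oz) and P(underweight) when aiming
    `target_offset` oz above spec, with independent normal errors at each
    manufacturing step.  All standard deviations are illustrative."""
    errors = sum(rng.normal(0.0, sd, N) for sd in step_sds)
    weight = SPEC_OZ + target_offset + errors
    return weight.mean() - SPEC_OZ, (weight < SPEC_OZ).mean()

# Current process: three steps with relatively large variability.
extra, p_under = giveaway([0.08, 0.05, 0.04], target_offset=0.40)
print(f"current process : {extra:.2f} oz given away per bar, "
      f"P(underweight) = {p_under:.3%}")

# A lower-variability process can aim closer to spec for a similar underweight
# risk, cutting the free silver passed along to the customer.
extra, p_under = giveaway([0.04, 0.03, 0.02], target_offset=0.20)
print(f"improved process: {extra:.2f} oz given away per bar, "
      f"P(underweight) = {p_under:.3%}")
```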

"When you are working with silver and gold, pilot projects to test innovations in the manufacturing process are very costly—and risky. Using @RISK to simulate changes in process design allows us to answer some difficult questions without actually running trials of the process."Ignacio Quijas
Technology Manager, Met-Mex Peñoles

His primary simulation tool is @RISK. “The functionality gives you so much more vision.” But because of the need for precision in his simulation, Ignacio also makes extensive use of @RISK distribution fitting and TopRank. He likes their capacities for graphic representation, which he uses to explain the intricacies of his simulations to colleagues.

“The ability to communicate aspects of the simulation is important,” he says, “because these very detailed models serve the same function as early trial runs.”

Peñoles does still rely on pilot projects, he says, but only after DecisionTools has accounted for every speck of silver and gold that might end up as the production losses or recycled materials that increase cost.

Veteran oil and gas industry analyst and decision-making expert Steve Begg garnered some global media attention prior to the FIFA World Cup when he used his expertise with @RISK to build a model to simulate outcomes for the popular, month-long international tournament.

Background

Steve Begg has spent most of his career dealing with uncertainty in the oil and gas business, initially with BP and Halliburton and lately on the teaching side of the industry in his adopted home of Australia. But the native of Northern Ireland recently took a temporary, and arguably more fun, detour when he combined his lifelong passion for soccer and his expertise with Monte Carlo simulation to build an @RISK-powered model to generate outcomes for the 2018 FIFA World Cup.

A professor and former Head of the University of Adelaide’s School of Petroleum whose research and teaching focus on decision-making under uncertainty and the psychological and judgmental factors that influence it, Begg created a probability model to estimate the chances of particular outcomes occurring during the tournament. Though the difference between the uncertainty in a team’s playing ability and their chance of winning is perhaps a subtle one to many, it is one of the things that make Begg’s exercise unique.

“The difficulty is in knowing how to propagate uncertainty in something we can assess, the teams’ playing ability, through to an assessment of their chance of advancing to various stages of the tournament, ultimately to the final. This is what Monte Carlo simulation enables us to do.”

In the end, the model calculated the highest probabilities of winning for world number two-ranked Brazil, with 15.4%, and not number one-ranked and defending champion Germany (13.32%). Of the top 10 teams ranked by Begg’s model, only Germany (part of Group F, the consensus “Group of Death”) and tenth-ranked Poland failed to advance to the Round of 16.

An “Uncertain” Approach

“The outcomes of many decisions we make are uncertain because of lack of information and things outside of our control,” Begg says. “Uncertainty is crucial in predicting the chance of an oil or gas field being economic. In the World Cup, it determines the many ways the whole tournament might play out – there are nearly 430 million possible outcomes of the Group Stage alone. What makes it so hard to predict is not just uncertainty in how a team will perform in general, but random factors that can occur in each match.”

Begg’s approach was to model enough possibilities to estimate the chance of any particular team progressing. FIFA world rankings, essentially determined through a relatively simple system in which points are earned through victories on the pitch, are but one part of predicting a team’s success in a tournament, and a rather simplistic one at that. Due to the complex World Cup tournament format, which places the qualifying teams in eight groups of varying difficulty, with prescribed rules as to how the winners progress, Begg constructed a sophisticated model that incorporated both the known (tournament structure) and the unknown or uncertain (team performances). The latter included what he called “tournament form” (how well a team will play, on average, over the course of the finals) and “match form” (the extent to which the team plays better or worse than its tournament form in any given match).

"From an experienced-user perspective, I really liked being able to use an @RISK function, just like any other Excel functions, without having to go through a series of input screens or boxes."Dr. Steve Begg
School of Petroleum, University of Adelaide

PERT Distribution

For each of his 100,000 simulations, Begg used the PERT probability distribution function (PDF) to describe uncertainty in tournament form. “The PERT distribution is easy to use because it just requires three numbers, a minimum, maximum and most likely,” Begg says.

The “most likely” value was derived from FIFA rankings over the past four years, supplemented by Begg’s own knowledge of international soccer to account for factors like recent “friendlies” played. The biggest change he made was to give Russia a higher score than its FIFA ranking suggested, due to its home advantage. (With a victory over heavily favored Spain in a penalty shootout on July 1, Russia moved on to the quarterfinals, which the model gave it a 10.9% probability of doing.)

The PERT minimum and maximum values were assigned based on the most likely values. For lower-ranking teams, Begg skewed the distribution upwards based on the theory that they have a greater chance of playing better than their rankings suggest (as it turned out for Russia and Japan) than playing worse. The higher-ranking teams had the reverse – a greater chance of playing worse than their ranking (as it turned out for Germany and Argentina) than outperforming them. Middle ranked teams had a more symmetrical distribution.

For each match, each team’s “match form” was drawn from a truncated normal PDF whose mean was that simulation’s tournament form, with a standard deviation of one-tenth of the mean.

Begg then assigned the total number of goals scored in a match from a discrete PDF, derived from the number of goals scored in all of the matches played in the last three World Cups. The total goals were then divided between the two teams based on their relative match form. In the 100,000 simulations of the event’s first match, which saw Russia defeat Saudi Arabia by the unusually high score of 5-0, the model picked that exact score 91 times.

For the Group Stage, the order in the table (including goal difference and goals scored) was computed and the top two teams moved on to the next round according to the competition rules. The same process was used for all subsequent rounds. If there was a draw (tie) in a later round, then the winner of the penalty shootout was drawn from a Bernoulli PDF (featuring discrete, random variables and having only two outcomes – success/failure) with a mean of the teams’ relative form.
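Putting those pieces together, a single match in such a model can be simulated roughly as sketched below: PERT tournament form, truncated normal match form with a standard deviation of one-tenth of the mean, a discrete total-goals draw split binomially by relative form, and a Bernoulli shootout for knockout draws. The form ranges and goals distribution here are illustrative guesses, not Begg's calibrated inputs.

```python
import numpy as np

rng = np.random.default_rng(8)

def pert(minimum, mode, maximum):
    """One draw from a PERT distribution (rescaled Beta with lambda = 4)."""
    a = 1 + 4 * (mode - minimum) / (maximum - minimum)
    b = 1 + 4 * (maximum - mode) / (maximum - minimum)
    return minimum + (maximum - minimum) * rng.beta(a, b)

def truncated_normal(mean, sd, low=0.0):
    """Redraw until the sample is above `low` (a simple truncation)."""
    x = rng.normal(mean, sd)
    while x <= low:
        x = rng.normal(mean, sd)
    return x

# Assumed total-goals distribution (goals per match and their probabilities);
# Begg derived his from the previous three World Cups.
GOALS = np.arange(0, 8)
GOAL_P = np.array([0.08, 0.22, 0.26, 0.20, 0.12, 0.07, 0.03, 0.02])

def play_match(form_a, form_b, knockout=False):
    """Simulate one match given the two teams' tournament form values."""
    match_a = truncated_normal(form_a, form_a / 10)
    match_b = truncated_normal(form_b, form_b / 10)
    total_goals = rng.choice(GOALS, p=GOAL_P)
    p_a = match_a / (match_a + match_b)          # share of goals going to team A
    goals_a = rng.binomial(total_goals, p_a)
    goals_b = total_goals - goals_a
    if knockout and goals_a == goals_b:          # penalty shootout as a Bernoulli draw
        return ("A", goals_a, goals_b) if rng.random() < p_a else ("B", goals_a, goals_b)
    if goals_a == goals_b:
        return ("draw", goals_a, goals_b)
    return ("A" if goals_a > goals_b else "B", goals_a, goals_b)

# e.g. a strong team vs. a weak one, each with its own PERT "tournament form"
wins = sum(play_match(pert(60, 85, 95), pert(30, 50, 70), knockout=True)[0] == "A"
           for _ in range(10_000))
print(f"strong team wins {wins / 10_000:.1%} of simulated knockout matches")
```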


Degrees of Belief

Begg stored all of the winners after each round in order to calculate the probabilities of a team progressing based on 100,000 simulations (one million simulations produced no significant differences) – which he says took only five minutes on his laptop. He also calculated the probability of the World Cup Final being between any two teams.

“It’s important to realize that probability is subjective. It depends on what information you have. There’s this tendency for people who do this kind of work to obsess on data,” Begg says. “You might argue that these simulations are the most useful when you have no data at all. But you do need to understand your uncertain quantities well enough to assign a probability distribution that reflects your degree of belief in what the outcomes might be. What’s crucial is that neither the information nor your reasoning is biased.”

Although he’ll continue updating his model until the World Cup ends on July 15, Begg is already back at his paying job, where he’s been using @RISK since the mid-1990s for technical and business uncertainty assessments to support decision-making.

“At one point the nature of my work changed to things that could be tackled in spreadsheets, like economic evaluations and simple production models, so I adopted @RISK to model their uncertainty. From an experienced-user perspective, I really liked being able to use an @RISK function, just like any other Excel function, without having to go through a series of input screens or boxes,” Begg says.

“When I teach Monte Carlo simulation I do it native in Excel, so that my students in industry and at the University can see how easy it is and that there is nothing mysterious about the process – but it is cumbersome. They are then delighted to find out how much quicker and simpler it is to do it with @RISK.”
