Hedging Iceberg Lettuce Production: Palisade Software Determines Best Method for Managing Margins in Agriculture

Many California produce farm operations use a rule-of-thumb to determine a hedge ratio for their seasonal production. They often aim to contract 80% of their crop in advance to buyers at set prices, leaving the remaining 20% to be sold at spot prices in the open market. The rationale, based on many years of experience, is that costs and a reasonable margin can be covered with 80% of production hedged by forward contracts. The hope is that the remaining 20% of production will attract high prices in favorable spot markets, leading to substantial profits on sales. Of course, it is understood that spot prices might not be favorable, in which case any losses can be absorbed by the forward sales.

Since the 2008 recession, agricultural lenders and government regulators have recognized that many farm operators need to manage the risks to their margins and free cash flows, rather than simply focusing on revenue risks. A more quantitative analysis is needed to assess risk in the agricultural industry.

Agribusiness experts from Cal Poly conducted a risk management analysis using @RISK and found that the 80% hedge ratio rule-of-thumb is not as effective as assumed: growers do not profit from spot market sales over the long run. The analysis shows growers are better off in the long term selling as much of their product as possible through forward contracts.

Background

Agriculture in California is big business. In 2013, nearly 80,000 farms and ranches produced over 400 commodities – the most valuable being dairy, almonds, grapes, cattle, and strawberries – worth $46.4 billion. Almost half of this value came from exports. The state grows nearly half of the fruits, nuts, and vegetables consumed in the United States. Yet agriculture is traditionally one of the highest-risk economic activities.

Steven Slezak, a Lecturer in the Agribusiness Department at Cal Poly, and Dr. Jay Noel, the former Agribusiness Department Chair, conducted a case study on an iceberg lettuce producer that uses the rule-of-thumb approach to manage production and financial risks. The idea was to evaluate the traditional rule-of-thumb method and compare it to a more conservative hedging strategy.

Hedging Bets on Iceberg Lettuce Sales

The grower uses what is known as a ‘hedge’ to lock in a sales price per unit for a large portion of its annual production. The hedge consists of a series of forward contracts between the grower and private buyers which set in advance a fixed price per unit. Generally, the grower tries to contract up to 80% of production each year, which stabilizes the grower’s revenue stream and covers production costs, with a small margin built in.

The remaining 20% is sold upon harvest in the ‘spot market’ – the open market where prices fluctuate every day, and iceberg lettuce can sell at any price. The grower holds some production back for spot market sales, which are seen as an opportunity to make large profits. “The thinking is, when spot market prices are high, the grower can more than make up for any losses that might occur in years when spot prices are low,” says Slezak. “We wanted to see if this is a reasonable assumption. We wanted to know if the 80% hedge actually covers costs over the long-term and if there are really profits in the spot market sales. We wanted to know if the return on the speculation was worth the risk. We found the answer is ‘No’.”

This is important because growers often rely on short-term borrowing to cover operational costs each year. If free cash flows dry up because of operational losses, growers become credit risks, some cannot service their debt, agricultural lending portfolios suffer losses, and costs rise for everybody in the industry. Is it a sound strategy to swing for the fences in the expectation of gaining profits every now and then, or is it better to give up some of the upside to stabilize profits over time and to reduce the probability of default resulting from deficient cash flows?

Combining Costs and Revenues in @RISK

Slezak and Noel turned to @RISK to determine an appropriate hedge ratio for the grower.

For inputs, they collected data on cultural and harvest costs. Cultural costs are the fixed costs “necessary to grow product on an acre of land,” such as seeds, fertilizer, herbicides, water, and fuel, and tend to be more predictable. The researchers relied on the grower’s historical records and information from county ag commissioners for this data.

Harvest costs are much more variable and are driven by each season’s yield. These costs include expenses for cooling, palletizing, and selling the produce. To gather data on harvest costs for the @RISK model, Slezak and Noel took the lettuce grower’s average costs over a period of years, along with those of other producers in the area, and arrived at an average harvest cost per carton of iceberg lettuce. These costs were combined with overhead, rent, and interest costs to calculate the total cost per acre. Cost variability is dampened because fixed costs make up a significant proportion of total costs on a per-acre basis.

The next input was revenue, defined as yield per acre multiplied by the price of the commodity. Since cash prices vary, the grower’s maximum and minimum prices from previous years were used to determine an average price per carton. Variance data were used to construct a distribution based on actual prices, not on a theoretical curve.

To model yield, the grower’s minimum and maximum yields over the same period were used to determine an average. Again, variance data were used to construct a distribution based on actual yields.

Palisade StatTools was used to estimate these distribution parameters, and @RISK was used to create a revenue distribution and the inputs for the model. With the cost and revenue simulations completed, the study could turn to the hedge analysis.
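As a rough sketch of this step outside of Excel, the snippet below builds PERT-style price and yield distributions from three-point summaries and multiplies them into a revenue distribution. All figures are invented for illustration; the study’s actual parameters came from the grower’s records via StatTools and @RISK.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 10_000  # simulation iterations

def pert(low, mode, high, size, lam=4.0):
    """Sample a PERT distribution via its Beta representation."""
    a = 1 + lam * (mode - low) / (high - low)
    b = 1 + lam * (high - mode) / (high - low)
    return low + (high - low) * rng.beta(a, b, size)

# Hypothetical three-point summaries standing in for the grower's records.
price = pert(6.0, 9.0, 16.0, N)      # $ per carton
yield_ = pert(700, 950, 1_150, N)    # cartons per acre

revenue = price * yield_             # $ per acre
print(f"mean revenue/acre: ${revenue.mean():,.0f}")
print(f"5th-95th percentile: ${np.percentile(revenue, 5):,.0f}"
      f" to ${np.percentile(revenue, 95):,.0f}")
```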

"A finance professor brought the software in one day and said, ‘if you learn this stuff you’re going to make a lot of money,’ so I tried it out and found it to be a very useful tool."

Steven Slezak
Agribusiness Department, Cal Poly

To Hedge, or Not to Hedge?

Since the question in the study is how best to manage margin risk – the probability that costs will exceed revenues to the point where cash flows are insufficient to service debt – it was necessary to compare various hedge ratios at different levels of debt to determine their long-term impact on margins. @RISK was used to simulate combinations of all cost and revenue inputs at hedge ratios ranging from zero to 100%. By comparing the results of these simulations in terms of their effect on margins, it was possible to determine the effectiveness of the 80% hedging rule of thumb and the value added by holding back 20% of production for spot market sales.
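A minimal sketch of such a hedge-ratio sweep, with invented cost, yield, and price assumptions in place of the study’s actual inputs, shows how the probability of a margin shortfall can be compared across ratios:

```python
import numpy as np

rng = np.random.default_rng(7)
N = 10_000

# Invented per-acre assumptions standing in for the study's actual inputs.
total_cost = rng.normal(7_500, 400, N)            # $/acre, largely fixed
yield_ = rng.normal(950, 90, N).clip(min=0)       # cartons/acre
forward_price = 8.50                              # $/carton, locked by contract
spot_price = rng.lognormal(np.log(8.0), 0.45, N)  # $/carton, volatile
debt_service = 600                                # $/acre of cash flow required

for h in (0.0, 0.5, 0.8, 1.0):                    # hedge ratios to compare
    revenue = yield_ * (h * forward_price + (1 - h) * spot_price)
    margin = revenue - total_cost
    print(f"hedge {h:4.0%}: mean margin ${margin.mean():7,.0f}, "
          f"P(loss) {(margin < 0).mean():5.1%}, "
          f"P(margin < debt service) {(margin < debt_service).mean():5.1%}")
```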

Unsurprisingly, with no hedge in place and all iceberg lettuce sold on the spot market, the simulation showed that costs often exceeded revenues. When the simulation hedged all production, avoiding spot sales completely, costs rarely exceeded revenues. Under the 80% hedge scenario, revenues exceeded costs in most instances, but the probability of losses significant enough to leave cash flows insufficient to service debt was uncomfortably high.

It was also discovered that the 20% of production held back for the purpose of capturing high profits in strong markets generally resulted in reduced margins. Only in about 1% of the simulations did the spot sales cover costs, and even then the resulting profits were less than $50 per acre. Losses due to this speculation could be as large as $850 per acre. A hedging strategy designed to yield home runs instead resulted in a loss-to-gain ratio of 17:1 on the unhedged portion of production.

Slezak and his colleagues have been reaching out to the agribusiness industry in California and throughout the Pacific Northwest to educate growers on the importance of margin management in an increasingly volatile agricultural environment. “We’re trying to show the industry it’s better to manage both revenues and costs, rather than emphasizing maximizing revenue,” he says. “While growers have to give up some of the upside, it turns out the downside is much larger, and there is much more of a chance they’ll be able to stay in business.”

In other words, the cost-benefit analysis does not support the use of the 80% hedged rule-of-thumb. It’s not a bad rule, but it’s not an optimal hedge ratio.

Early @RISK Adopter
Professor Slezak is a long-time user of Palisade products, having discovered them in graduate school. In 1996, “a finance professor brought the software in one day and said, ‘if you learn this stuff you’re going to make a lot of money,’ so I tried it out and found it to be a very useful tool,” he says. Professor Slezak has used @RISK to perform economic and financial analysis on a wide range of problems in industries as diverse as agribusiness, energy, investment management, banking, interest rate forecasting, education, and health care.


Identifying Optimum Mitigation Strategies for Debris Flow Hazards with PrecisionTree

Application

Modeling the frequency and magnitude of future debris flows to determine the optimum hazard mitigation strategy. Communicating risk to clients by displaying the probability of event paths for three decisions:

  1. Existing conditions
  2. Constructing a containment dam
  3. Relocating existing residences

Summary

Duncan Wyllie, a Principal of Wyllie & Norrish Rock Engineers, uses the Palisade software PrecisionTree for probabilistic modeling of debris flow protection measures.

When analyzing the optimum method of protecting an area at risk from debris flows, three decisions are compared – accepting existing conditions, constructing a containment dam with sufficient capacity to contain future flows, or relocating residences on the debris flow runout area. Creating probabilistic decision trees in PrecisionTree allows uncertainties in the frequency and magnitude of future debris flows to be analyzed, and the cost of constructing a dam to be compared with the cost of relocating the residences.

Background

Wyllie & Norrish Rock Engineers, with offices in Seattle and Vancouver, Canada, is a specialist engineering company working in the fields of landslides, tunnels, slopes, and foundations. Duncan Wyllie and Norman Norrish, the company principals, have a combined total of 80 years of experience in applied rock mechanics.

Since the 1990s, Wyllie and Norrish have been utilizing Palisade software to analyze natural hazards and select hazard mitigation procedures.

Using Palisade Products

When a potential debris flow hazard is located above a residential development, PrecisionTree can be used to create a probabilistic decision tree that maps out possible scenarios, the likelihood they will occur, and the estimated damage costs. Three decisions are compared – existing conditions, constructing a debris flow dam, or evacuating the debris flow runout area.

"If we use @RISK and PrecisionTree to present results, people can make rational decisions as to what structural protection to install."Duncan Wyllie
Principal of Wyllie & Norrish Rock Engineer

Debris Flow Dam Decision Tree Example

The components of the analysis are laid out in the decision tree shown below.

For a closer look, download our free Debris Flow Containment Dam example model.

Analysis shows that the optimum decision is to construct a containment dam, because the total cost of mitigation plus the expected value (EV) of damage is lower for dam construction (EVΣdam = $200,150) than for existing conditions (EVΣexisting = $360,000) or for relocating the houses (EVΣhouses = $2,000,600).
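The arithmetic behind that comparison is straightforward to reproduce. In the sketch below, the event probabilities and damage costs are hypothetical stand-ins chosen so the totals match the three figures quoted above; the structure of the actual tree may differ.

```python
# Hypothetical event paths; only the three printed totals come from the case.
def expected_total(mitigation_cost, outcomes):
    """Mitigation cost plus probability-weighted damage over all event paths."""
    return mitigation_cost + sum(p * damage for p, damage in outcomes)

options = {
    "existing conditions": expected_total(0, [(0.60, 600_000)]),
    "containment dam":     expected_total(195_000, [(0.01, 515_000)]),
    "relocate residences": expected_total(2_000_000, [(0.001, 600_000)]),
}
for name, ev in sorted(options.items(), key=lambda kv: kv[1]):
    print(f"{name:20s} total EV = ${ev:,.0f}")
# -> containment dam ($200,150) < existing ($360,000) < relocate ($2,000,600)
```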

Results

The use of PrecisionTree allows possible mitigation measures, along with the probability of event occurrence and cost, to be analyzed. The analysis unambiguously identifies the most cost-effective mitigation measure, and the decision process is clearly mapped out in the decision tree.

A Competitive Edge

The use of @RISK and PrecisionTree software to prepare decision trees modeling all potential outcomes enables Wyllie & Norrish Rock Engineers to quantitatively determine the optimum protection strategy and easily communicate the findings.

By using probabilistic analysis, Wyllie & Norrish Rock Engineers ensure that the best decision is reached for each at-risk area and, if necessary, effective debris flow dams are created to protect nearby structures.

Free Example Models

Download our free example model, Decision Trees in Geotechnical Engineering, to explore three decision tree examples from the geotechnical engineering field: debris flow containment dam, rock slope stabilization, and gravity dam reinforcement anchors.

Leveraging Probabilistic Analysis to Improve Rock Fall Protection Structure Designs with @RISK and PrecisionTree

Application

Modeling rock fall masses, trajectories, velocities, and energies to design rock fall protection structures. Communicating risk to clients by calculating impact energy probability so that the impact capacity of the protection structure can be matched to the consequence of an accident.

Summary

Duncan Wyllie, a Principal of Wyllie & Norrish Rock Engineers, uses @RISK and PrecisionTree for probabilistic modeling of rock fall hazards. The analyses incorporate uncertainty in the values of mass and velocity, which are expressed as probability distributions, and Monte Carlo simulation in @RISK is used to calculate the probability distribution of the impact energy. The energy calculations are then used to design protection structures such as concrete sheds and wire rope fences with the appropriate impact energy capacity to suit the possible consequences of an accident. Decision analysis using PrecisionTree is then applied to determine the optimum mitigation strategy.

In the example, Wyllie explains how Palisade software is used to calculate potential rock fall impact energies and make recommendations for a fence to protect traffic at the base of a steep mountain slope.

Background

Wyllie & Norrish Rock Engineers, with offices in Seattle and Vancouver, Canada, is a specialist engineering company working in the fields of rock slopes, tunnels, blasting, foundations, landslides, and rock falls. Duncan Wyllie and Norman Norrish, the company principals, have a combined total of 80 years of experience in applied rock mechanics.

Since the 1990s, Wyllie and Norrish have been utilizing Palisade software to analyze rock fall risks and hazards, and design rock fall protection structures. They provide hazard mitigation services for rock falls in mountainous areas by identifying rock fall sources, modelling rock fall trajectories and energies, and designing customized protection structures. Projects have been undertaken for highways, railways, hydro-electric power plants, and residential developments.

Using Palisade Products

For most rock fall projects, very limited, or no, information is available from previous events. In these circumstances, uncertainty exists in the design parameters of rock fall frequency, mass, velocity and trajectory. These uncertainties can be quantified using @RISK to define probability distributions that account for the possible range of values, and the most likely values, based on judgement and experience.

Wyllie found that the BetaGeneral and Pert distributions incorporated in @RISK provide the optimum models for these conditions. Multiplying the mass distribution by the square of the velocity distribution (the kinetic energy E = ½mv²) gives an impact energy that is also defined by a probability distribution. This information can be used to design protection structures that reduce the hazard to an acceptable level of societal risk.
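A minimal sketch of this calculation, using a PERT sampler as a stand-in for @RISK’s BetaGeneral/Pert distributions and entirely hypothetical mass and velocity parameters:

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000

def pert(low, mode, high, size, lam=4.0):
    """PERT sampler (rescaled Beta), standing in for @RISK's Pert/BetaGeneral."""
    a = 1 + lam * (mode - low) / (high - low)
    b = 1 + lam * (high - mode) / (high - low)
    return low + (high - low) * rng.beta(a, b, size)

# Hypothetical parameters for a single rock fall source.
mass = pert(200, 3_000, 10_000, N)     # kg
velocity = pert(10, 20, 35, N)         # m/s

energy_kj = 0.5 * mass * velocity**2 / 1_000   # kinetic energy in kJ
print(f"mean impact energy: {np.mean(energy_kj):,.0f} kJ")
print(f"90th percentile: {np.percentile(energy_kj, 90):,.0f} kJ")
# A fence sized near the 90th percentile would contain roughly 90% of falls,
# mirroring the design approach described in the example below.
```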

Another component of risk management for rock fall projects is to implement decision analysis in which alternative courses of action, such as construction of a high-strength containment fence or of a less expensive ditch, can be compared. This analysis can be carried out using PrecisionTree, in which the sum of construction costs and the expected value of an accident (i.e., the product of an accident's cost and its probability) can be compared for each course of action. PrecisionTree allows rapid analysis of these alternatives and incorporates sensitivity analyses that show how uncertainty in the costs and probabilities influences the selection of the optimum action.

A particular value of analyses using @RISK and PrecisionTree is that it is possible to identify low-probability but high-consequence events that have a low expected value, such as a large-scale landslide. Comparison of this event with more frequently occurring, but less costly, rock falls will show whether the optimum mitigation measure is to stabilize the landslide or contain the rock falls. The analyses often show that the risk of very rare events is acceptable.

"If we use @RISK and PrecisionTree to present results, people can make rational decisions as to what structural protection to install."

Duncan Wyllie
Principal of Wyllie & Norrish Rock Engineers

Rock Fall Modeling Example

When a rock fall source on a steep mountain slope was identified above a road, Wyllie used @RISK to calculate the probability distribution of the potential rock fall impact energies that would be used to design a protection structure. Because the site had no history of rock falls that could be used for design, @RISK was used to analyze the geology and empirical equations for velocity to develop BetaGeneral distributions for the mass and velocity, from which the distribution for the impact energy was calculated. These plots are shown below where the maximum, minimum, and mean values for mass, velocity, and energy are indicated.

These results were discussed with the owner to determine an appropriate design energy capacity for a fence to protect the road from rock falls.

“Because access to the road was restricted and traffic was infrequent, it was decided that a rock fall event with high energy but low probability was acceptable,” said Wyllie. “The design energy of 1250 kJ was selected such that about 90% of falls would be contained, with the understanding that the low probability of an event with an energy exceeding 1250 kJ was acceptable.”

The image below shows the installed Attenuator fence.

In comparison, if the fence were installed above a busy Interstate highway where the consequence of a high energy rock fall event could be severe, it is likely that the design energy would be about 2500 kJ to 3000 kJ to ensure that almost all rock falls would be contained.

“If we use @RISK and PrecisionTree to present results, people can make rational decisions as to what structural protection to install,” said Wyllie.

Results

Thanks to the probabilistic analysis conducted by Wyllie, the road has a fence in place that can withstand an impact energy of 1250 kJ and will contain about 90% of future rock falls. Those traveling on the road can have peace of mind knowing the hazard mitigation structure was designed through quantitative analysis.

A Competitive Edge

The use of @RISK and PrecisionTree software to prepare designs where many or all of the design parameters are uncertain allows Wyllie to quantitatively determine the best mitigation strategy.

By using probabilistic analysis, Wyllie & Norrish Rock Engineers ensure that effective hazard mitigation structures are in place to protect people, facilities, and infrastructure.

DecisionTools Optimizes Precious Metal Refining

Metallurgical giant Met-Mex Peñoles uses the DecisionTools Suite for Six Sigma Design of Experiments. Because silver and gold are so expensive, process optimization allows analysts to test innovations, avoiding costly trial runs.

DecisionTools Suite in Six Sigma Design of Experiments

Because of the costliness of its raw materials, the metallurgical giant Met-Mex Peñoles, the world’s largest refiner of silver and Mexico’s largest refiner of gold, tries to avoid expensive pilot projects. To cut down on the number of trial runs, the company simulates the refining process by using the DecisionTools Suite in Six Sigma Design of Experiments. This allows the company to work on process optimization and sacrifice a minimum of gold and silver to its experiments.

According to Ignacio Quijas, technology manager for Peñoles, “When you are working with silver and gold, pilot projects to test innovations in the manufacturing process are very costly—and risky. Using @RISK to simulate changes in process design allows us to answer some difficult questions without actually running trials of the process.”

To offer some perspective on the exacting standards Peñoles must meet, Ignacio points out that, for instance, a 100-ounce silver bar must weigh at least 100 ounces—however, the price of the bar does not increase if the bar weighs slightly more than the specification. The additional silver is simply passed along free to the customer and is a production cost.

Each step in the manufacturing processes for gold and silver value-added products creates room for additional error, and the way Peñoles optimizes its process is to reduce the variability of the errors across the manufacturing steps. To build its Six Sigma simulation, Peñoles inputs the physical measurements of the errors and also feeds into the model the specifications and tolerances of its manufacturing equipment, different physical operations, random processing errors, and cost analyses that are pinpoint precise. “We are measuring the amount of gold and silver that turns up when we do every operation and can result in losses,” Ignacio reports.
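The following sketch illustrates the idea with invented numbers; it is not Peñoles’ actual model. The poured weight of a bar is modeled as a target overpour plus independent errors from several process steps, and the cost lever is the silver given away to stay above specification.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000

# Hypothetical model of a "100-ounce" bar: target plus per-step errors.
target = 100.0                            # troy oz, contractual minimum
step_sd = np.array([0.05, 0.08, 0.03])    # per-step error std. devs. (invented)
errors = rng.normal(0.0, step_sd, size=(N, 3)).sum(axis=1)

overpour = 0.30                           # deliberate margin to stay in spec
weight = target + overpour + errors

give_away = np.maximum(weight - target, 0.0)  # silver shipped free of charge
print(f"P(bar below 100 oz spec): {(weight < target).mean():.2%}")
print(f"mean give-away per bar: {give_away.mean():.3f} oz")
# Reducing per-step variability lets the overpour margin shrink, which is
# the cost lever the Six Sigma optimization targets.
```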

"When you are working with silver and gold, pilot projects to test innovations in the manufacturing process are very costly—and risky. Using @RISK to simulate changes in process design allows us to answer some difficult questions without actually running trials of the process."Ignacio Quijas
Technology Manager, Met-Mex Peñoles

His primary simulation tool is @RISK. “The functionality gives you so much more vision.” But because of the need for precision in his simulation, Ignacio also makes extensive use of @RISK distribution fitting and TopRank. He likes their capacities for graphic representation, which he uses to explain the intricacies of his simulations to colleagues.

“The ability to communicate aspects of the simulation is important,” he says, “because these very detailed models serve the same function as early trial runs.”

Peñoles does still rely on pilot projects, he says, but only after DecisionTools has accounted for every speck of silver and gold that might end up as production losses or recycled material that increases cost.

Probability Outcomes in the 2018 FIFA World Cup

Veteran oil and gas industry analyst and decision-making expert Steve Begg garnered some global media attention prior to the FIFA World Cup when he used his expertise with @RISK to build a model to simulate outcomes for the popular, month-long international tournament.

Background

Steve Begg has spent most of his career dealing with uncertainty in the oil and gas business, initially with BP and Halliburton and lately on the teaching side of the industry in his adopted home of Australia. But the native of Northern Ireland recently took a temporary, and arguably more fun, detour when he combined his lifelong passion for soccer and his expertise with Monte Carlo simulation to build an @RISK-powered model to generate outcomes for the 2018 FIFA World Cup.

A professor and former Head of the University of Adelaide’s School of Petroleum whose research and teaching focus on decision-making under uncertainty and the psychological and judgmental factors that influence it, Begg created a probability model to estimate the chances of particular outcomes occurring during the tournament. Though the difference between the uncertainty in a team’s playing ability and its chance of winning is perhaps a subtle one to many, it is one of the things that makes Begg’s exercise unique.

“The difficulty is in knowing how to propagate uncertainty in something we can assess, the teams’ playing ability, through to an assessment of their chance of advancing to various stages of the tournament, ultimately to the final. This is what Monte Carlo simulation enables us to do.”

In the end, the model calculated the highest probabilities of winning for world number two-ranked Brazil, with 15.4%, and not number one-ranked and defending champion Germany (13.32%). Of the top 10 teams ranked by Begg’s model, only Germany (part of Group F, the consensus “Group of Death”) and tenth-ranked Poland failed to advance to the Round of 16.

An “Uncertain” Approach

“The outcomes of many decisions we make are uncertain because of lack of information and things outside of our control,” Begg says. “Uncertainty is crucial in predicting the chance of an oil or gas field being economic. In the World Cup, it determines the many ways the whole tournament might play out – there are nearly 430 million possible outcomes of the Group Stage alone. What makes it so hard to predict is not just uncertainty in how a team will perform in general, but random factors that can occur in each match.”

Begg’s approach was to model enough possibilities to estimate the chance of any particular team progressing. FIFA world rankings, essentially determined through a relatively simple system in which points are earned through victories on the pitch, are but one part of predicting a team’s success in a tournament, and a rather simplistic one at that. Due to the complex World Cup tournament format, which places the qualifying teams in eight groups of varying difficulty, with prescribed rules as to how the winners progress, Begg constructed a sophisticated model that incorporated both the known (tournament structure) and the unknown or uncertain (team performances). The latter included what he called “tournament form” (how well a team will play, on average, over the course of the finals) and “match form” (the extent to which the team plays better or worse than its tournament form in any given match).

"From an experienced-user perspective, I really liked being able to use an @RISK function, just like any other Excel functions, without having to go through a series of input screens or boxes."Dr. Steve Begg
School of Petroleum, University of Adelaide

PERT Distribution

For each of his 100,000 simulations, Begg used the PERT probability distribution function (PDF) to describe uncertainty in tournament form. “The PERT distribution is easy to use because it just requires three numbers: a minimum, maximum and most likely,” Begg says.
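The PERT distribution can be sampled outside @RISK as a rescaled Beta distribution. The sketch below is a hypothetical illustration of Begg’s three-number input, with an invented form scale and the upward skew he describes for lower-ranked teams:

```python
import numpy as np

rng = np.random.default_rng(7)

def pert(low, mode, high, size, lam=4.0):
    """PERT distribution via its Beta representation (RiskPert-style)."""
    a = 1 + lam * (mode - low) / (high - low)
    b = 1 + lam * (high - mode) / (high - low)
    return low + (high - low) * rng.beta(a, b, size)

# Hypothetical lower-ranked team: the long right tail gives it more room
# to outperform its ranking than to underperform it.
tournament_form = pert(low=60, mode=66, high=90, size=100_000)
print(f"mean tournament form: {tournament_form.mean():.1f}")
```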

The “most likely” value was derived from FIFA rankings over the past four years, supplemented by Begg’s own knowledge of international soccer to account for factors like recent “friendlies” played. The biggest change he made was to give Russia a higher score than its FIFA ranking suggested, due to its home advantage. (With a victory over heavily favored Spain in a penalty shootout on July 1, Russia moved on to the quarterfinals, which the model gave it a 10.9% probability of doing.)

The PERT minimum and maximum values were assigned based on the most likely values. For lower-ranking teams, Begg skewed the distribution upwards, on the theory that they have a greater chance of playing better than their rankings suggest (as it turned out for Russia and Japan) than playing worse. The higher-ranking teams had the reverse – a greater chance of playing worse than their ranking (as it turned out for Germany and Argentina) than outperforming it. Middle-ranked teams had a more symmetrical distribution.

For each match, each team’s “match form” was drawn from a truncated normal PDF whose mean was that simulation’s tournament form, with a standard deviation of one-tenth the mean.

Begg then assigned the total number of goals scored in a match from a discrete PDF, derived from the number of goals scored in all of the matches played in the last three World Cups. The total goals were then divided between the two teams based on their relative match form. In the 100,000 simulations of the event’s first match, which saw Russia defeat Saudi Arabia by the unusually high score of 5-0, the model picked that exact score 91 times.

For the Group Stage, the order in the table (including goal difference and goals scored) was computed, and the top two teams moved on to the next round according to the competition rules. The same process was used for all subsequent rounds. If there was a draw (tie) in a later round, the winner of the penalty shootout was drawn from a Bernoulli PDF (a discrete distribution with only two outcomes – success/failure) with a mean equal to the teams’ relative form.
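A compressed sketch of one simulated match, using the ingredients described above with invented inputs (the binomial split of total goals by relative form is one plausible reading of Begg’s description, not his published code):

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000

# Hypothetical tournament-form scores for the two teams in one knockout match
# (in the full model these come from each iteration's PERT draws).
form_a, form_b = 85.0, 62.0

# Match form: normal around tournament form, sd = mean/10, truncated at zero.
fa = np.clip(rng.normal(form_a, form_a / 10, N), 0, None)
fb = np.clip(rng.normal(form_b, form_b / 10, N), 0, None)

# Total goals per match from a discrete PDF (frequencies invented, standing
# in for the goal data from the last three World Cups).
totals = rng.choice(7, size=N, p=[0.08, 0.22, 0.27, 0.22, 0.12, 0.06, 0.03])

# Divide the total goals between teams by relative match form.
p_a = fa / (fa + fb)
goals_a = rng.binomial(totals, p_a)
goals_b = totals - goals_a

# Knockout draws go to a shootout drawn from a Bernoulli PDF whose mean is
# the teams' relative form.
drawn = goals_a == goals_b
a_advances = (goals_a > goals_b) | (drawn & (rng.random(N) < p_a))
print(f"P(A wins in 90 min): {(goals_a > goals_b).mean():.1%}")
print(f"P(draw): {drawn.mean():.1%}")
print(f"P(A advances): {a_advances.mean():.1%}")
```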

Degrees of Belief

Begg stored all of the winners after each round in order to calculate the probabilities of a team progressing based on 100,000 simulations (one million simulations produced no significant differences) – which he says took only five minutes on his laptop. He also calculated the probability of the World Cup Final being between any two teams.

“It’s important to realize that probability is subjective. It depends on what information you have. There’s this tendency for people who do this kind of work to obsess on data,” Begg says. “You might argue that these simulations are the most useful when you have no data at all. But you do need to understand your uncertain quantities well enough to assign a probability distribution that reflects your degree of belief in what the outcomes might be. What’s crucial is that neither the information nor your reasoning is biased.”

Although he’ll continue updating his model until the World Cup ends on July 15, Begg is already back at his paying job, where he’s been using @RISK since the mid-1990s for technical and business uncertainty assessments to support decision-making.

“At one point the nature of my work changed to things that could be tackled in spreadsheets, like economic evaluations and simple production models, so I adopted @RISK to model their uncertainty. From an experienced-user perspective, I really liked being able to use an @RISK function, just like any other Excel function, without having to go through a series of input screens or boxes,” Begg says.

“When I teach Monte Carlo simulation I do it natively in Excel, so that my students in industry and at the University can see how easy it is and that there is nothing mysterious about the process – but it is cumbersome. They are then delighted to find out how much quicker and simpler it is to do it with @RISK.”

George Washington University Researchers Use @RISK and RISKOptimizer for Informed Decisions Around Debt Portfolios and Capital Investments

To address complicated debt portfolio and capital investment problems, Dr. Emmanuel Donkor uses the Palisade software tools @RISK and RISKOptimizer. His research has led to new and better methods, built with @RISK, for addressing statistical problems in finance.

When multiple sources of debt with different financing terms are available to the decision maker – for example, a ten-year loan at 5% interest and a five-year loan at 10% interest – the environment is fraught with uncertainty. “The challenge is to determine an appropriate or optimal mix of these different debt sources that simultaneously maximizes project value and reduces the risk of default,” explains Dr. Emmanuel Donkor, a quantitative analyst and faculty member of the School of Engineering and Applied Sciences (SEAS) at George Washington University. To address complicated debt portfolio and capital investment problems, Donkor uses @RISK and RISKOptimizer. His research has led to new and better methods, built with @RISK, for addressing statistical problems in finance.

Dr. Donkor used @RISK and RISKOptimizer to conduct two separate research projects published in The Engineering Economist. The first tackled the improvement of debt portfolios for financing capital investment plans, while the other empirically tested stochastic dominance relationships in risky capital investments.

Diversifying Debt

In his first project published in The Engineering Economist, Donkor, along with Associate Professor Michael Duffey, addressed the question of how project promoters who wish to use project finance as a procurement vehicle can choose an optimal debt portfolio. The aim is to develop a portfolio that maximizes project value while minimizing default risk when project cash flows are uncertain and debt with different financing terms is available from multiple lenders. For example, the promoter may face the previously mentioned choice between a ten-year loan at 5% interest and a five-year loan at 10% interest. “It is a difficult problem because the environment is fraught with uncertainty,” says Dr. Donkor.

To address this decision problem, Dr. Donkor and Dr. Duffey used @RISK and RISKOptimizer to help analyze and then recommend an appropriate mix of different debt instruments for financing a capital investment project. “Palisade’s RISKOptimizer allows the analyst to incorporate what’s known as probabilistic or chance constraints—this ensures that the risk of defaulting on loan payments, in any given period of the loan tenure, is limited to, say, 5%,” says Dr. Donkor. They first developed a stochastic financial model in Excel, then used RISKOptimizer’s simulation optimization capability to select an optimal mix of fixed-rate debt instruments such that default occurred no more than 5% of the time. They then used @RISK simulation to evaluate the performance of the debt policy prescribed by the optimization model.
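The chance-constraint mechanic can be sketched outside Excel as well. The toy model below uses a simple grid search in place of RISKOptimizer’s optimization engine, with invented cash flow figures; it is an illustration of the technique, not the authors’ model.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 10_000

# Invented setting: $10m of debt from two fixed-rate loans, a 10-year loan
# at 5% and a 5-year loan at 10%, mirroring the example terms above.
DEBT, T = 10_000_000, 10

def annuity(principal, rate, years):
    """Level annual payment for a fully amortizing fixed-rate loan."""
    return principal * rate / (1 - (1 + rate) ** -years)

# Uncertain annual cash flow available for debt service (hypothetical).
cash = rng.normal(2_500_000, 400_000, size=(N, T))

best = None
for w in np.linspace(0, 1, 21):          # share of debt from the 10-year loan
    payments = np.zeros(T)
    payments[:10] += annuity(w * DEBT, 0.05, 10)
    payments[:5] += annuity((1 - w) * DEBT, 0.10, 5)
    default = (cash < payments).any(axis=1)      # short in any year => default
    if default.mean() <= 0.05:                   # chance constraint: P <= 5%
        value = (cash - payments).sum(axis=1).mean()  # crude value proxy
        if best is None or value > best[1]:
            best = (w, value, default.mean())

if best:
    w, value, p_default = best
    print(f"best feasible mix: {w:.0%} ten-year loan, "
          f"P(default) {p_default:.1%}, mean excess cash ${value:,.0f}")
else:
    print("no mix satisfies the 5% chance constraint")
```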

This new approach improves on the traditional methods used in risk analysis for capital investment planning, in which analysts would “take the value of debt as given, and simulate it without taking into consideration the probability of default on debt service each year,” says Dr. Donkor. “Furthermore, analysts do not consider multiple sources of debt with different financing terms—you hardly ever see debt portfolios with different terms and interest rates—it’s usually homogeneous, with debt having one tenure and one interest rate.” Dr. Donkor and Dr. Duffey’s new approach, which shows how to model chance constraints in a spreadsheet environment and implement them with Palisade’s RISKOptimizer, has sparked interest in the field. The paper detailing their work has become one of the highest-ranked articles in The Engineering Economist.

Figure 1: The profile of excess earnings after debt service, indicating that the chances of defaulting on debt service are well contained.

Figure 2: Plot of excess earnings after interest coverage.

"Palisade’s RISKOptimizer allows the analyst to incorporate what’s known as probabilistic or chance constraints. This ensures that the risk of defaulting on loan payments, in any given period of the loan tenure is limited."Dr. Emmanuel Donkor
School of Engineering and Applied Sciences (SEAS), George Washington University

Ranking Opportunities

Dr. Donkor also addressed the problem that analysts face when they must choose between multiple, risky, mutually exclusive capital investments. He did this by creating a spreadsheet framework that uses Palisade’s @RISK to implement empirical tests of stochastic dominance—a term used in decision theory which describes the ranking of random prospects based on preferences regarding outcomes. As a result of Dr. Donkor’s work, analysts involved in comparing risky capital investments do not have to rely on qualitative and visually-based ‘best guesses’.

This solution benefits those who are faced with investment choices in which only one option can be selected. For example, a person owns a building, and has to decide whether to rent it out as residential apartments or as a factory. “You can’t do both at the same time,” says Dr. Donkor, “so you have to choose one option.”

Typically these kinds of opportunities are compared using decision rules based on the mean-variance criterion (selecting portfolios based on the means and variances of their returns) or the safety-first criterion (setting a minimum required return for a given level of risk). However, at times the mean-variance criterion and its variants produce an efficiency frontier in which more than one investment option offers maximal expected return for a given level of risk, and minimal risk for a given level of expected return. This can make it difficult to select only one option.

“The problem becomes complicated when you have opportunity A, which gives you the highest value, but it has a high risk, and opportunity B, which will give you lower value but a lower risk,” says Donkor. “As a decision maker, you want high value and low risk, but these qualities are not always enshrined in the same opportunity.” For such problems, stochastic dominance rules, typically implemented by visually inspecting the cumulative distribution functions (CDF) of the alternatives, are applied. However, for many practical applications, it is common for the distributions to cross tails, creating what’s known as the ‘tail problem’. In these circumstances, analysts apply what’s known as ‘almost stochastic dominance’ (ASD), which allows decision makers to ignore the crossing at the tails so that dominance inferences can be made.

These approaches are inexact and lack quantitative rigor; on top of these issues, Dr. Donkor notes that most analysts do capital planning problems in Microsoft Excel but are not able to make stochastic dominance inferences in that program. “A theory has been developed, but no one has turned that theory into code in Excel where it can be used,” says Dr. Donkor. Thus, the majority of practitioners, researchers, and students who analyze alternative capital investment plans under uncertainty in Excel are limited to using either visual inspection or ASD without any empirical support.

Dr. Donkor has improved this process with his @RISK-enabled spreadsheet framework which empirically tests stochastic dominance. Now, instead of using a visual best guess, analysts can use an @RISK model to empirically test for the best option among many, allowing them to make empirically defensible decisions when comparing risky capital investments.
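As a rough illustration of what such an empirical test involves (and not Dr. Donkor’s actual framework), the sketch below compares two simulated NPV distributions for first- and second-order stochastic dominance using their empirical CDFs:

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000

# Two hypothetical NPV simulations for mutually exclusive investments:
# A has the higher mean, B the lower risk.
npv_a = rng.normal(1_200_000, 600_000, N)
npv_b = rng.normal(1_000_000, 250_000, N)

# Empirical CDFs evaluated on a common grid.
grid = np.linspace(min(npv_a.min(), npv_b.min()),
                   max(npv_a.max(), npv_b.max()), 500)
cdf_a = np.searchsorted(np.sort(npv_a), grid) / N
cdf_b = np.searchsorted(np.sort(npv_b), grid) / N

# First-order dominance: A's CDF never above B's, i.e. A never puts more
# probability on low outcomes than B does.
print("A FSD-dominates B:", bool(np.all(cdf_a <= cdf_b)))

# Second-order dominance compares integrated CDFs; it is the relevant test
# for risk-averse decision makers when the CDFs cross.
print("A SSD-dominates B:", bool(np.all(np.cumsum(cdf_a) <= np.cumsum(cdf_b))))
# When neither test passes, the tails are where the trouble is, and
# almost stochastic dominance (ASD) rules come into play.
```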