Many California produce farm operations use a rule-of-thumb to determine a hedge ratio for their seasonal production. They often aim to contract 80% of their crop in advance to buyers at set prices, leaving the remaining 20% to be sold at spot prices in the open market. The rationale is based on many years of experience indicating that costs and a reasonable margin can be covered with 80% of production hedged by forward contracts. The hope is that the remaining 20% of production will attract high prices in favorable spot markets, leading to substantial profits on sales. Of course, it is understood that spot prices might not be favorable, in which case any losses could be absorbed by the forward sales.
Since the Recession of 2008, agricultural lenders and government regulators have recognized that many farm operators need to manage the risks to their margins and free cash flows, rather than simply focusing on revenue risks. A more quantitative analysis is needed to determine risks in the agricultural industry.
Agribusiness experts from Cal Poly conducted a risk management analysis using @RISK, and found the 80% hedge ratio rule-of-thumb is not as effective as assumed. Growers do not profit from spot market sales over the long run. The analysis shows growers are better off in the long-term selling as much of their product as possible using forward contracts.
Agriculture in California is big business. In 2013, nearly 80,000 farms and ranches produced over 400 commodities – the most valuable being dairy, almonds, grapes, cattle, and strawberries – worth $46.4 billion. Almost half of this value came from exports. The state grows nearly half of the fruits, nuts, and vegetables consumed in the United States. Yet agriculture is traditionally one of the highest risk economic activities.
Steven Slezak, a Lecturer in the Agribusiness Department at Cal Poly, and Dr. Jay Noel, the former Agribusiness Department Chair, conducted a case study on an iceberg lettuce producer that uses the rule-of-thumb approach to manage production and financial risks. The idea was to evaluate the traditional rule-of-thumb method and compare it to a more conservative hedging strategy.
The grower uses what is known as a ‘hedge’ to lock in a sales price per unit for a large portion of its annual production. The hedge consists of a series of forward contracts between the grower and private buyers which set in advance a fixed price per unit. Generally, the grower tries to contract up to 80% of production each year, which stabilizes the grower’s revenue stream and covers production costs, with a small margin built in.
The remaining 20% is sold upon harvest in the ‘spot market’ – the open market where prices fluctuate every day, and iceberg lettuce can sell at any price. The grower holds some production back for spot market sales, which are seen as an opportunity to make large profits. “The thinking is, when spot market prices are high, the grower can more than make up for any losses that might occur in years when spot prices are low,” says Slezak. “We wanted to see if this is a reasonable assumption. We wanted to know if the 80% hedge actually covers costs over the long-term and if there are really profits in the spot market sales. We wanted to know if the return on the speculation was worth the risk. We found the answer is ‘No’.”
This is important because growers often rely on short-term borrowing to cover operational costs each year. If free cash flows dry up because of operational losses, growers become credit risks, some cannot service their debt, agricultural lending portfolios suffer losses, and costs rise for everybody in the industry. Is it a sound strategy to swing for the fences in the expectation of gaining profits every now and then, or is it better to give up some of the upside to stabilize profits over time and to reduce the probability of default resulting from deficient cash flows?
Slezak and Noel turned to @RISK to determine an appropriate hedge ratio for the grower.
For inputs, they collected data on cultural and harvest costs. Cultural costs are the fixed costs “necessary to grow product on an acre of land,” such as seeds, fertilizer, herbicides, water, fuel etc., and tend to be more predictable. The researchers relied on the grower’s historical records and information from county ag commissioners for this data.
Harvest costs are much more variable, and are driven by each season’s yield. These costs include expenses for cooling, palletizing, and selling the produce. To gather data on harvest costs for the @RISK model, Slezak and Noel took the lettuce grower’s average costs over a period of years along with those of other producers in the area, and arrived at an average harvest cost per carton of iceberg lettuce. These costs were combined with overhead, rent, and interest costs to calculate the total cost per acre. Cost variability is dampened due to the fact that fixed costs are a significant proportion of total costs, on a per acre basis.
The next input was revenue, defined as yield per acre multiplied by the price of the commodity. Since cash prices vary, the grower’s maximum and minimum prices during the previous years were used to determine an average price per carton. Variance data were used to construct a distribution based on actual prices, not on a theoretical curve.
To model yield, the grower’s minimum and maximum yields over the same period were used to determine an average. Again, variance data were used to construct a distribution based on actual yields.
Palisade StatTools was used to create these distribution parameters. @RISK was used to create a revenue distribution and inputs for the model. With cost and revenue simulation completed, the study could turn next to the hedge analysis.
Since the question in the study is about how best to manage margin risk – the probability that costs will exceed revenues – to the point where cash flows would be insufficient to service debt, it was necessary to compare various hedge ratios at different levels of debt to determine their long-term impact on margins. @RISK was used to simulate combinations of all cost and revenue inputs using different hedge ratios between 100% hedging and zero hedging. By comparing the results of these simulations in terms of their effect on margins, it was possible to determine the effectiveness of the 80% hedging rule-of-thumb and the value added by holding back 20% of production for spot market sales.
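The shape of such a comparison can be sketched in a few lines of Python instead of @RISK. The triangular distributions and all dollar figures below are purely illustrative placeholders, not the study’s actual cost and revenue data:

```python
import random

def simulate_margin(hedge_ratio, n=10_000, seed=1):
    """Monte Carlo sketch of per-acre margin under a given hedge ratio.
    All prices, yields, and costs are illustrative assumptions."""
    random.seed(seed)
    margins = []
    for _ in range(n):
        yield_cartons = random.triangular(700, 1100, 900)    # cartons/acre
        spot_price = random.triangular(4.0, 14.0, 7.0)       # $/carton, volatile
        forward_price = 8.0                                  # $/carton, locked in
        cost_per_acre = random.triangular(5500, 7500, 6500)  # cultural + harvest
        revenue = yield_cartons * (hedge_ratio * forward_price
                                   + (1 - hedge_ratio) * spot_price)
        margins.append(revenue - cost_per_acre)
    return margins

for h in (0.0, 0.8, 1.0):
    m = simulate_margin(h)
    loss_prob = sum(x < 0 for x in m) / len(m)
    print(f"hedge {h:.0%}: mean margin ${sum(m)/len(m):,.0f}, P(loss) = {loss_prob:.1%}")
```

Because the same random draws are reused for every hedge ratio, the comparison is paired: only the ratio changes, which is what isolates its effect on the margin distribution.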
Unsurprisingly, with no hedge involved and all iceberg lettuce being sold on the spot market, the simulation showed that costs often exceeded revenues. When the simulation hedged all production, avoiding spot sales completely, costs rarely exceeded revenues. Under the 80% hedge scenario, revenues exceeded costs in most instances, but the probability of losses significant enough to leave cash flows insufficient to service debt was uncomfortably high.
It was also discovered that the 20% of production held back for the purpose of capturing high profits in strong markets generally resulted in reduced margins. Only in about 1% of the simulations did the spot sales cover costs, and even then the resulting profits were less than $50 per acre. Losses due to this speculation could be as large as $850 per acre. A hedging strategy designed to yield home runs instead resulted in a loss-to-gain ratio of 17:1 on the unhedged portion of production.
Slezak and his colleagues reach out to the agribusiness industry in California and throughout the Pacific Northwest to educate them on the importance of margin management in an increasingly volatile agricultural environment. “We’re trying to show the industry it’s better to manage both revenues and costs, rather than emphasizing maximizing revenue,” he says. “While growers have to give up some of the upside, it turns out the downside is much larger, and there is much more of a chance they’ll be able to stay in business.”
In other words, the cost-benefit analysis does not support the use of the 80% hedged rule-of-thumb. It’s not a bad rule, but it’s not an optimal hedge ratio.
Early @RISK Adopter
Professor Slezak is a long-time user of Palisade products, having discovered them in graduate school. In 1996, “a finance professor brought the software in one day and said, ‘if you learn this stuff you’re going to make a lot of money,’ so I tried it out and found it to be a very useful tool,” he says. Professor Slezak has used @RISK to perform economic and financial analysis on a wide range of problems in industries as diverse as agribusiness, energy, investment management, banking, interest rate forecasting, education, and in health care.
Infectious disease is an important cause of lost production and profits to beef cow-calf producers each year. Beef producers commonly import new animals into their herds, but often do not properly apply biosecurity tools to economically decrease risk of disease introduction. Dr. Michael Sanderson, a professor of Beef Production and Epidemiology at Kansas State University’s (KSU) College of Veterinary Medicine, wanted to address this issue by developing a risk management tool for veterinarians and beef cow-calf producers to assist in identifying biologically and economically valuable biosecurity practices, using @RISK.
The college was established in 1905, and has granted more than 5,000 Doctor of Veterinary Medicine degrees. Departments within the College of Veterinary Medicine include anatomy and physiology, clinical sciences, diagnostic medicine, and pathobiology. The college's nationally recognized instructional and research programs provide the highest standards of professional education. A rich, varied, and extensive livestock industry in the region, a city with many pets and a zoo, and referrals from surrounding states provide a wealth of clinical material for professional education in veterinary medicine.
Reproductive disease is an important cause of lost production and economic return to beef cow-calf producers, causing estimated losses of $400 to $500 million dollars per year. Because of the complex nature of the production system, the biologically and economically optimal interventions to control disease risk are not always clear. Dr. Sanderson and his team (including Drs. Rebecca Smith and Rodney Jones) utilized @RISK to model the probability and economic costs of disease introduction and the cost and effectiveness of management strategies to decrease that risk.
“For this project, @RISK was essential to model variability and uncertainty in risk for disease introduction and impact following introduction, as well as variability and uncertainty in effectiveness of mitigation strategies,” said Dr. Sanderson. “Further, @RISK was crucial for sensitivity analysis of the most influential inputs to refine the model and to identify the most important management practices to control risk. It was also valuable to aggregate results into probability distributions for risk and economic cost over one-year and ten-year planning periods.”
The project modelled the risk of introduction of the infectious disease Bovine Viral Diarrhea (BVD) into the herd, the impact of disease on the herd (morbidity, mortality, abortion, culling, lost weight) and economic control costs. These risks were aggregated over ten years to identify the optimal management strategy to minimize cost from BVD accounting for both production costs and control costs.
Probability distributions included:
Target probabilities were utilized to produce the probability of exceeding a certain cost over one and ten years, provide this data as a single number for each management option and generate descending cumulative probability distributions for exceeding any particular cost value.
As a result of the risk identification insight gained from the research, Dr. Sanderson and his team were able to improve disease management and controls by identifying:
“Our utilization of @RISK gave us the ability to account for complex aggregation of inputs and their variability and uncertainty to produce full-outcome probability distributions for more informed decision making,” said Dr. Sanderson. “Further, the ability to use research data from multiple parts of the beef production system and combine those results into a model that accounts for the complexity of the production systems allows recognition of emergent phenomena and decision making based on the full system, rather than only one part. The flexibility to customize outputs provided the most valuable information for decision making.”
Palisade’s risk analysis software @RISK is being used by aquatic veterinary surgeons to demonstrate the practice of biosecurity to aquatic farmers. The method helps to reduce the potential for disease in animals without incurring the significant costs of extensive testing. Only a small number of data inputs are required for @RISK, with thousands of simulations then presenting accurate results that inform decision-making.
It is estimated that the human population will be nine billion by 2030. The Food and Agriculture Organization (FAO) believes that aquaculture, which currently provides around half of the fish and shellfish eaten around the world, is the only agricultural industry with the potential to meet the protein requirements of this population. However, one of the biggest constraints to achieving this is the depletion of stock levels through disease. In 1997, the World Bank estimated that annual losses amounted to $3 billion, and current figures suggest that 40 percent of insured losses are due to disease.
Biosecurity measures, which aim to prevent, control and ideally eradicate disease are regarded as essential. However, encouraging the adoption of these practices is often difficult due to the farmers’ levels of education, training, responsibility and perceived economic benefits. In addition, global estimates of disease losses may appear remote and irrelevant to farmers and producers faced with making a rational choice from scarce data and, often scarcer, resources.
Dr Chris Walster is a qualified veterinary surgeon with a long-standing interest in aquatic veterinary medicine, and is the secretary to the World Aquatic Veterinary Medical Association (WAVMA). Having seen Palisade’s risk analysis tool, @RISK, demonstrated, he started using it to calculate the realistic risk of aquatic disease to farms, with a focus on cases where data inputs were limited.
@RISK’s capacity to present the calculations in graphs that are easy to understand also makes it straightforward for vets to show farmers disease risk probabilities. With this information readily available, the cost/benefit of disease prevention can be calculated, and farmers can make informed choices about whether to put controls in place.
For example, a farmer might plan to import 1000 fish to their farm. The cost to accurately determine the disease status of these fish may be uneconomic, but testing a small sample will not give sufficient evidence on which to base an informed purchase decision.
However, testing 30 of the fish and running further simulations using @RISK will give the probability of how many fish might be diseased if more were tested. In other words it provides the farmer with a more accurate picture of the risk connected to purchasing the stock.
If there is no information as to whether the fish carry a disease of interest, testing 30 of them would be expected to return results showing that 15 are diseased and 15 are not (a disease prevalence of 0.5 must be assumed, giving a 50/50 probability). However, because tests are rarely 100% accurate, when interpreting a test result its validity, or how well the test performs, must also be accounted for. This requires knowing the test characteristics: sensitivity (the probability that a truly diseased fish tests positive) and specificity (the probability that a truly disease-free fish tests negative), along with the disease prevalence (or likelihood).
Introducing a sensitivity of 80%, for example, reduces the number of fish testing positive to twelve (15 x 0.8). In this case, using a specificity of 98%, the simulation is run 10,000 times to produce enough ‘values’, and these are used to produce a graph showing the likely minimum, maximum and mean prevalence of the disease.
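The logic of this example can be sketched in Python. The sensitivity, specificity, prevalence, and sample size follow the figures in the text; the sampling scheme itself is only an illustration of what @RISK automates across its iterations:

```python
import random

def apparent_positives(n_tested=30, prevalence=0.5,
                       sensitivity=0.80, specificity=0.98, seed=7):
    """Simulate one round of testing: how many of n_tested fish test
    positive, given an imperfect test. Parameters follow the text's
    example; the scheme is an illustrative sketch."""
    random.seed(seed)
    positives = 0
    for _ in range(n_tested):
        diseased = random.random() < prevalence
        if diseased:
            positives += random.random() < sensitivity   # true positive
        else:
            positives += random.random() > specificity   # false positive
    return positives

# Repeat the experiment many times to approximate the distribution
# of test-positive counts, as @RISK does with its iterations.
counts = [apparent_positives(seed=s) for s in range(10_000)]
print(sum(counts) / len(counts))  # expectation: 30*(0.5*0.80 + 0.5*0.02) = 12.3
```

The spread of `counts`, not just its mean, is the point: it shows the range of positive counts a farmer could plausibly see from a sample of 30.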
This simple example helps to generate understanding amongst farmers that they do not need to undertake extensive testing programs to obtain accurate results about disease levels in fish.
Further evidence can be gathered by running more tests that supplement the @RISK distribution graphs with prior knowledge – facts that are already known and accepted. For example, international regulations make it illegal to transport sick animals. Therefore, if a particular disease shows obvious symptoms, it seems reasonable to assume (using human expertise) that the prevalence of the disease is no higher than 10%, or the seller would have noticed that the fish were sick and could not have sold them. Once again, only 30 fish are tested, but this time @RISK is used with a PERT distribution, with expert opinion introducing a minimum of 1%, a most likely value of 5% and a maximum of 10%. Running the @RISK simulation 10,000 times again can change the results significantly.
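A PERT distribution is a rescaled Beta distribution, so the expert-opinion prior described above can be sketched without @RISK. The min/most-likely/max values are taken from the example; everything else is an illustrative assumption:

```python
import random

def pert_sample(minimum, mode, maximum, lamb=4.0):
    """Draw one value from a PERT distribution, expressed as a
    rescaled Beta with the standard shape parameters."""
    alpha = 1 + lamb * (mode - minimum) / (maximum - minimum)
    beta = 1 + lamb * (maximum - mode) / (maximum - minimum)
    return minimum + random.betavariate(alpha, beta) * (maximum - minimum)

random.seed(42)
# Expert-opinion prior from the text: min 1%, most likely 5%, max 10%.
draws = [pert_sample(0.01, 0.05, 0.10) for _ in range(10_000)]
mean = sum(draws) / len(draws)
print(f"mean prevalence = {mean:.3f}")  # PERT mean = (min + 4*mode + max)/6 ≈ 0.052
```

Every draw stays inside the expert's 1%–10% bounds, which is exactly how the prior constrains the simulated prevalence.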
With this knowledge, the farmer can now decide on the next course of action. They may decide they are happy with the potential risk and buy the fish. Equally they may want more certainty and therefore test more fish or use additional tests. Finally they may feel that the risk is too great and research other sources.
“@RISK enables farmers to reduce the risk of disease spreading amongst their animals whilst minimizing additional costs,” Walster explains: “For aquatic vets, the key is the graphs which allow us to demonstrate a complex probability problem quickly and simply in a way that is easy to understand and trust. These inform decision-making, thereby helping to boost the world’s aquatic stock whilst safeguarding farmers’ livelihoods.”
“This technique also potentially offers an economical method of assisting in the control of many diseases. Farmers undertake their own tests, with each of these providing incremental inputs so that the macro picture can be developed and acted upon,” concludes Walster.
The bacterium Listeria monocytogenes (L. monocytogenes) is a major cause of food poisoning, potentially leading to premature births, miscarriages, and even meningitis and other serious health problems. It is reported to occur mainly in unpasteurized natural cheese, meats, vegetables and fruits. In France, reported cases of L. monocytogenes in pasteurized liquid eggs have made it essential to better understand just what the risk is. Kewpie, one of Japan’s largest food ingredient manufacturers and a major egg product producer, teamed up with the University of Tokyo’s Graduate School of Agricultural and Life Sciences Research Center for Food Safety to develop a growth model of bacteria in liquid eggs in order to increase their understanding of the risks.
From its inception, Kewpie has followed the spirit of “Good products are only made from good ingredients,” holding that food safety must always be pursued to the best of its abilities, as explained by Miho Okochi of Kewpie Corporation’s R&D Center Food Safety Division Microbial Laboratory. “In our laboratory we focus on technology and research of microorganisms necessary in product development, and microorganisms that may contaminate products,” he explained.
Since contamination growth research for chicken eggs did not previously exist, the lab first conducted research on the L. monocytogenes contamination rate in unpasteurized liquid eggs. The results showed a lower rate of contamination and lower bacteria counts compared to other livestock meat products. This led them to believe that L. monocytogenes could be sterilized enough in the normal pasteurization process of liquid eggs. However, the French reports showed the L. monocytogenes present in liquid eggs even after the sterilization process, demonstrating there is definitely a chance of L. monocytogenes surviving and contaminating eggs, even after pasteurization.
“Because of this we decided to research the growth rate of L. monocytogenes in liquid eggs. We built a growth model based on L. monocytogenes counts in varying storage temperatures and times in liquid eggs so that we could understand under what conditions the risk of L. monocytogenes growth would increase,” explained Okochi.
“I had always wanted to create a growth model of food bacteria, but this was a bit much for one researcher to tackle. Growth modeling in the risk analysis of food microbiology is an important tool – and the research results are important for the entire industry,” he said. For help with the task, Okochi turned to Tokyo University’s Research Center for Food Safety, one of the few universities working in this field, to establish a joint research project.
It is normal for the speed and rate of bacterial growth to vary, even when applying the same bacteria to the same food sample. The growth parameters must be well understood if researchers are to make use of the data in actual food safety. The challenge is that it is difficult to produce enough bacteria to accurately find average values, standard deviations and other statistically significant values, but by using Palisade’s @RISK software, researchers can use a Monte Carlo simulation of the growth model coefficients to estimate the growth parameters. @RISK, an add-in to Microsoft Excel, uses Monte Carlo simulation to examine thousands of different possible scenarios in any quantitative model.
Okochi explained how they learned about @RISK: “We heard about the software from a university professor who was researching food safety. It is used in the risk assessment of food microbiology and becoming more widely used in Japan.” @RISK has a number of benefits, he said. It has been adopted in many industries, it uses industry standard Microsoft Excel, and the spreadsheet files can even be opened by computers without @RISK installed. In the food safety industry there are a number of users, and the availability of training in Japanese from Palisade is a great benefit. The @RISK software is also available in Japanese.
For this research, the experimental kinetic data were fitted to the Baranyi model, and growth parameters, such as maximum specific growth rate (μmax), maximum population density (Nmax), and lag time (λ), were estimated using a Monte Carlo simulation with @RISK. As a result of estimating these parameters, the researchers found that L. monocytogenes can grow without spoilage below 12.2°C, so they then focused on storage temperatures below 12.2°C. Predictive simulations under both constant and fluctuating temperature conditions demonstrated a high accuracy, and with this joint research model they were able to predict growth of L. monocytogenes in pasteurized liquid eggs under refrigeration. The results of this research provide crucial information to the food safety industry.
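The Baranyi model itself can be sketched directly. The function below is the standard Baranyi–Roberts growth curve in natural-log form; the parameter values are illustrative placeholders, not the study's fitted estimates:

```python
import math

def baranyi_log_count(t, y0, y_max, mu_max, lag):
    """Baranyi growth model: natural-log cell count at time t.
    y0 / y_max are ln(initial) / ln(maximum) population densities,
    mu_max the maximum specific growth rate, lag the lag time (λ)."""
    a_t = t + (1.0 / mu_max) * math.log(
        math.exp(-mu_max * t) + math.exp(-mu_max * lag)
        - math.exp(-mu_max * (t + lag)))
    return y0 + mu_max * a_t - math.log(
        1 + (math.exp(mu_max * a_t) - 1) / math.exp(y_max - y0))

# Illustrative parameters (not the study's fitted values): hours and per-hour rate.
y0, y_max, mu_max, lag = math.log(1e2), math.log(1e7), 0.05, 24.0
for t in (0, 24, 96, 400):
    print(f"t = {t:4d} h: ln N = {baranyi_log_count(t, y0, y_max, mu_max, lag):.2f}")
```

In the study's setup, @RISK's role was to place distributions over parameters like μmax and λ and simulate the resulting spread of growth curves, rather than computing a single deterministic curve as above.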
According to the researchers, the advantages of @RISK include the capability to run simulations even when the case is mathematically difficult. It is also easy to understand the simulation results, since researchers can visually verify the results using charts and graphs, making it extremely powerful and useful.
Professor Katsuaki Sugiura, professor and researcher at the Laboratory of Global Animal Resource Science at the Graduate School of Agriculture and Life Sciences, part of the University of Tokyo, praised the joint research as well as @RISK.
“Through Mr. Okochi’s research, we have been able to attain valuable results in a research field with little precedence, based on the development of a growth model of the L. monocytogenes bacteria in liquid eggs. I believe that @RISK will play a valuable role in the field of microbiology forecasting. As @RISK is used in fields besides food safety, from 2013 I have used it as one of our information processing tools with my students for risk analysis in other fields. The real benefit of @RISK is that it is an add-in for Microsoft Excel, so you do not need any complex programming language. I look forward to seeing more and more research results in various fields.”
With the advent of corporate farming, large agricultural companies have begun to apply the same kinds of risk analysis techniques to the powerful uncertainties of the natural world and market prices that their counterparts in the manufacturing and finance sectors use. And Prof. Ariadna Berger, professor of Farm Management at the University of Buenos Aires, has gone so far as to introduce a portfolio approach to assessing and balancing the risks and opportunities of large cash cropping operations in Argentina. She began using @RISK when she was a graduate student at Cornell University and reports that it is the perfect tool to show her clients how to manage the risks in their large-scale farming operations.
In her consulting for farming corporations, Prof. Berger creates portfolios analogous to the portfolios of stocks and bonds used by financial analysts. Just like investment portfolios, the idea is to spread risk via diversification, except that in agricultural portfolios she is creating a mix of climate regions, soils, crops, and cultivation practices. By planting different crops, both market and yield risks are reduced; by planting in different regions and with different cultivation practices, yield risk is reduced even more. Like so many of her counterparts in finance, Prof. Berger uses @RISK to simulate risks and rewards throughout her clients’ potential agricultural “holdings” and to help her clients compare those possible outcomes with potential results from other possible portfolios with different diversification schemes.
Farming operations are conditioned by extreme fluctuations of weather and market price. Fortunately, Argentina has a number of climate zones, and farmers can choose from a number of crops to plant: mainly wheat, soybeans, corn or sunflowers. This situation allows farmers to spread the risk strategically.
According to Prof. Berger, the big challenge in creating her simulations is that, due to continually changing agricultural practices, it has been difficult to get data based on similar practices and technology over a long enough period of time to generate distributions. The way she compensates for the lack of historical data is by using agronomic simulation models to generate yield data to enter as @RISK distributions. These models are based on approximately 30 years of data on soil, water, nutrients, plant variety, and planting methods, and generate simulated yields that can be used to create a distribution. However, agronomic simulation models predict yields based only on restrictions for water and nutrients, while in the field there are other factors reducing yields, such as hail, frost, pests or diseases. Prof. Berger accommodates this limitation by complementing the yield data generated by the agronomic simulation models with other distributions for weather and weather-related random variables, which are based on historical data. In this way, yields simulated in each @RISK iteration are a combination of the distribution generated by the agronomic simulation models and of other random variables that may affect yield.
Finally, she rolls land and crop costs into this complex mix, and @RISK suggests answers to such questions as: How much land should be assigned to each crop? How much land should be leased? How many climate zones should be included in the portfolio? This information is crucial to her clients because farming has always been a chancy business, and large-scale farming involves large-scale downturns and upswings with large-scale investments at stake. Investment in crop costs and land rent depends on crops and regions, but on average it can be estimated at around US $125 per hectare. Some portfolios can be as large as 100,000 hectares or more.
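The diversification effect at the heart of this approach can be sketched with a toy portfolio. All crops, distributions, weights, and costs below are invented for illustration; they are not Prof. Berger's data or her agronomic models:

```python
import random
import statistics

def simulate_returns(weights, n=10_000, seed=3):
    """Per-hectare margin for a crop portfolio, with independent
    illustrative triangular distributions for yield and price."""
    random.seed(seed)
    # crop: ((min, mode, max) yield t/ha, (min, mode, max) price $/t, cost $/ha)
    crops = {
        "wheat":     ((2.0, 3.5, 5.0),  (150, 200, 260), 450),
        "soybeans":  ((1.5, 2.8, 4.0),  (250, 330, 420), 400),
        "corn":      ((5.0, 8.0, 11.0), (110, 150, 200), 600),
        "sunflower": ((1.2, 2.0, 3.0),  (280, 350, 450), 380),
    }
    results = []
    for _ in range(n):
        margin = 0.0
        for crop, w in weights.items():
            (ylo, ymd, yhi), (plo, pmd, phi), cost = crops[crop]
            y = random.triangular(ylo, yhi, ymd)
            p = random.triangular(plo, phi, pmd)
            margin += w * (y * p - cost)
        results.append(margin)
    return results

single = simulate_returns({"corn": 1.0})
mixed = simulate_returns({"wheat": 0.25, "soybeans": 0.25,
                          "corn": 0.25, "sunflower": 0.25})
print(f"corn only:   sd = {statistics.stdev(single):,.0f} $/ha")
print(f"diversified: sd = {statistics.stdev(mixed):,.0f} $/ha")
```

Because the four crop outcomes are drawn independently here, the mixed portfolio's margin varies far less than any single crop's, which is the same logic that drives diversification across regions and cultivation practices.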
Because of the unpredictable nature of agriculture, serious planning can only happen if yield and market risks are considered. “Decisions made exclusively on the grounds of expected returns can be misleading and generate an unbearable risk exposure,” says Prof. Berger. “By accounting for a nearly limitless variety of uncertainties in our work, @RISK adds tremendous value. It easily integrates in the Excel programs with which we evaluate portfolios and helps us to make far better decisions on how to structure our agricultural portfolios.”
The US Government spent $24 billion on farm programs in 2000. The benefits were distributed through a variety of income-enhancing and risk-reducing policies. For instance, some programs provided price protection, others a direct subsidy, and others subsidized the purchase of crop insurance.
How do these programs individually and collectively reduce agricultural risk? Are these programs having the desired effects? Is the money helping producers reduce risk and thereby providing a disincentive to purchase crop insurance? And are there better ways to address agricultural risk?
Researchers at Cornell University and Purdue University recently completed a study that assesses the impacts of US farm programs on farm returns and answers some of the above questions. The researchers used Palisade’s @RISK to create a simulation model that illustrates the effects of government policy tools on farm incomes.
To assess the impacts of government policy tools on farmland risk and return the researchers developed an economic model for farm income and expenses based on a representative parcel of land and crop mix. @RISK allowed the researchers to model the uncertainties associated with crop yield and price. After running base-line simulations, the researchers added the individual farm programs into the model to determine their impacts. Finally they combined all the programs and crop insurance into the model. They compared the simulation outcomes to determine the impact of the various payment/subsidy mechanisms.
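The structure of such a model can be sketched in a few lines. The stylized "programs" below, a direct payment plus a price floor, and all figures are illustrative assumptions, not the researchers' actual policy parameters:

```python
import random
import statistics

def farm_income(programs=False, n=10_000, seed=11):
    """Per-acre return with and without stylized farm programs:
    a direct payment plus a price floor (loan-rate-style support).
    All figures are illustrative, not the study's parameters."""
    random.seed(seed)
    incomes = []
    for _ in range(n):
        yield_bu = random.triangular(100, 200, 150)  # bushels/acre
        price = random.triangular(1.5, 4.0, 2.3)     # $/bushel
        cost = 330.0                                 # $/acre
        if programs:
            price = max(price, 2.6)                  # price floor
            income = yield_bu * price - cost + 25.0  # plus direct payment
        else:
            income = yield_bu * price - cost
        incomes.append(income)
    return incomes

base = farm_income(False)
supported = farm_income(True)
print(f"mean income: {statistics.mean(base):,.0f} -> {statistics.mean(supported):,.0f}")
print(f"income sd:   {statistics.stdev(base):,.0f} -> {statistics.stdev(supported):,.0f}")
```

Even in this toy version, the programs raise the mean return while truncating the downside of the price distribution, which shrinks the spread of outcomes: the same qualitative pattern the study quantifies.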
The @RISK simulation model demonstrated that a combination of all government programs would raise average farm incomes by almost 45%. Additionally, the programs would reduce the economic risks associated with farming by half. Most importantly, the model allowed researchers to examine how the programs interact with one another to alter the return distribution.
Producers must make decisions regarding cash rental bids, crop mixes, and even farmland purchases based upon expected returns. These decisions are based on the expectations for market prices, yields and costs – all uncertain elements.
Assistant Professor Brent Gloy of Cornell University’s Applied Economics and Management Department was one of the researchers. “@RISK was vital to the simulation model. It allowed us to incorporate uncertainties and run random simulations on the various scenarios.” He adds, “@RISK’s ability to correlate distributions of random variables was essential to the model. Additionally, we used @RISK’s output statistics to compare the various model scenarios.”
The study quantifies how government programs impact each other and subsidized crop insurance. According to Dr. Gloy, “Our results indicate that the risk reduction provided by the standard programs significantly reduce the value that risk averse producers derive from crop insurance programs.” He adds, “@RISK was instrumental to the simulation model. It allowed us to incorporate uncertainties and correlations, and to systematically evaluate each of the farm policy tools.”
The study was recently published in the Review of Agricultural Economics. For more information about the study, contact Brent Gloy at 607-255-9822 or firstname.lastname@example.org.
Integrated Multi-Trophic Aquaculture (IMTA) refers to managed, multi-species systems which recycle the byproducts of one aquatic species as nutritional inputs for another. These systems aim to improve environmental management, and increase harvest value through product diversification and recycling of nutrients. However, for these benefits to be realized, all the interacting components must be understood and carefully optimized. Dr. Gregor Reid, a senior research scientist with the University of New Brunswick, has used @RISK to determine the ideal ratios of kelps to absorb fish waste matter from a salmon IMTA system, as a means to better help aquaculturists improve the efficiency of seafood farming operations. A paper detailing this work appeared in the journal Aquaculture in May of 2013.
Aquaculture has become a booming industry in many regions of the world. According to the World Bank, nearly two-thirds of the seafood we consume will be farm-raised in 2030. However, as with all forms of intensive food production, commercial aquaculture is not immune to the potential for environmental impacts. Modern intensive fish farms produce large amounts of nutrient waste, both as solid-organic (feces and waste feed), and dissolved (carbon dioxide, ammonia and phosphate). If a farm is poorly located with limited flushing, or nutrient loading exceeds environmental assimilative capacity, algal blooms (i.e. eutrophication) and low oxygen may result, impacting the localized environment. Since these nutrient forms are the same as natural inputs for shellfish that filter organic particulates, and for kelps that absorb soluble inorganic nutrients, this provides an ideal opportunity to mitigate fish waste, while augmenting nutritional inputs for other marketable culture species.
While multiple aquatic species have been grown in close contact informally for thousands of years, “We’re now on a much bigger scale,” says Reid. “We’re now trying to apply the historical benefits of polyculture [the simultaneous cultivation of different species] to the typical large monocultures seen in aquaculture today, by connecting adjacent culture groups by nutrient transfer through water.”
The system Reid works on, located in the Quoddy region of the Bay of Fundy, Canada, involves farmed Atlantic salmon and two different species of kelp: winged kelp (Alaria esculenta) and sugar kelp (Saccharina latissima). Salmon are a valuable seafood species, while kelps are used in a broad array of consumer products. If grown in close proximity and in the right ratios, these two species can benefit each other, along with the environment. The extra nutrients released by the salmon can increase kelp growth, resulting in a more profitable harvest. In turn, with enough kelp, oxygen levels can be boosted and nutrient absorption can reduce the impact of fish waste on the surrounding aquatic environment.
However, finding the appropriate ratios of salmon to kelp needed to achieve these nutrient transfer benefits is tricky, making the benefits difficult to realize at the scale of large commercial IMTA operations. While researchers have previously attempted to determine appropriate ratios, “past recommendations have been in algae per square meter, which doesn’t really mean much to someone without a lot of experience in this field,” says Reid. His goal was to create a straightforward measure that could be easily understood and applied to an IMTA operation. To do this, he created a ratio model that reports the weight of harvested seaweed required to sequester the soluble nutrient load released per unit growth of fish. More specifically, for every x kg of salmon grown, y kg of kelp would need to be harvested to remove the same amount of nutrients produced during that x kg of salmon growth.
Reid used a semi-stochastic approach for data that has high variability, such as nutrient loading estimates, in order to quantify uncertainty in the model.
Reid gathered inputs from three diverse sources: the commercial industry, academic literature, and field research. Rather than use single static values for the model inputs, Reid used stochastic distributions for inputs with ranges of uncertainty, such as the digestibility of salmon feed components (e.g. protein, fats, carbohydrates) and the nutrient content of IMTA kelps. Known or easily determined values, such as feed composition (listed on all feed bags), were left as static inputs. Model inputs with sufficient empirical sample data were fit with @RISK’s distribution fitting functions, which tested 15 different theoretical distribution types and used maximum likelihood estimators to rank the best fits.
Reid had to be careful when considering input data. “Often, you’re trying to fit a theoretical data distribution, but there are certain properties in a theoretical data distribution that won’t exist in real life,” he explains. “There are distributions that assume the tails go on forever, but that’s never going to happen. So you have to apply data filters on those to get more realistic results.”
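The fitting-and-filtering workflow described above can be sketched in a few lines of Python with SciPy in place of @RISK. This is a simplified illustration, not Reid’s actual procedure: the candidate list is a small subset of the 15 distribution types mentioned, and the sample data and 0–100% clipping range (standing in for the “data filters” on unrealistic tails) are hypothetical.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical empirical samples, e.g. protein digestibility of feed (%)
samples = rng.normal(loc=85.0, scale=3.0, size=200)

# Candidate theoretical distributions (a subset of the 15 types mentioned)
candidates = {
    "normal": stats.norm,
    "lognormal": stats.lognorm,
    "gamma": stats.gamma,
    "weibull": stats.weibull_min,
}

fits = {}
for name, dist in candidates.items():
    params = dist.fit(samples)  # maximum likelihood estimation
    loglik = np.sum(dist.logpdf(samples, *params))
    fits[name] = (loglik if np.isfinite(loglik) else -np.inf, params)

# Rank candidates by log-likelihood (higher = better fit)
ranked = sorted(fits.items(), key=lambda kv: kv[1][0], reverse=True)
best_name, (best_loglik, best_params) = ranked[0]
print("best fit:", best_name)

# Draw from the best fit, then filter unrealistic tail values:
# a digestibility percentage cannot leave the 0-100% range
draws = candidates[best_name].rvs(*best_params, size=10_000, random_state=rng)
draws = np.clip(draws, 0.0, 100.0)
```

Clipping is the crudest possible filter; truncating and renormalizing the fitted distribution would be more rigorous, but the point is the same: theoretical tails that “go on forever” must be reined in before the draws feed a physical model.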
Reid used @RISK to run a simulation of 10,000 iterations using these inputs. The results gave a weight ratio model, with ratio estimations for each key nutrient, as well as oxygen supply potential for both species of kelp (see graphs).
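A toy version of such a weight-ratio simulation, using NumPy rather than @RISK, might look like the following. The distributions and every parameter value are illustrative assumptions, not figures from Reid’s model; only the structure (uncertain nutrient load divided by uncertain kelp nutrient content, over 10,000 iterations) follows the description above.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 10_000  # iterations, matching the simulation size reported above

# Soluble nitrogen released per kg of salmon growth (kg N per kg fish);
# a triangular distribution stands in for the uncertain loading estimate
n_load = rng.triangular(0.025, 0.035, 0.050, size=N)

# Nitrogen content of harvested kelp (kg N per kg wet weight),
# clipped so the content stays physically positive
kelp_n = np.clip(rng.normal(0.003, 0.0005, size=N), 1e-4, None)

# Weight ratio: kg of kelp harvest needed per kg of salmon grown
ratio = n_load / kelp_n

print(f"mean ratio: {ratio.mean():.1f} kg kelp per kg salmon")
print(f"90% interval: {np.percentile(ratio, 5):.1f} to {np.percentile(ratio, 95):.1f}")
```

Reporting the result as “kg kelp per kg salmon” is exactly the plain-language measure Reid was after, and the percentile interval conveys the uncertainty a single point estimate would hide.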
In order for a commercial-scale salmon farm to have all its fish waste fully absorbed by kelp, the number of kelp rafts would have to significantly outnumber the salmon pens, which is impractical for most North American operations, where the space available for coastal fish farming is highly limited and regulated. “It’s not going to be possible to do full dissolved nutrient recovery with seaweeds such as kelps in North America, unless the number of fish being farmed on site is reduced to make room for other species,” says Reid. “It would be tough for commercial salmon-culture operators to do that, because presently there is a larger return for fish per unit space than other cultured species.”
However, Reid notes that “one-hundred percent nutrient sequestration doesn’t need to be the only successful endpoint in such systems. Removal of any nutrient portion has merit and this is presently occurring with kelp rafts deployed at the edges of some salmon farm lease areas.” He also says that smaller-scale farms may be in a better position to alter the balance of species groups, by raising a premium priced fish species, thereby enabling a financial return from smaller fish culture numbers.
Reid says that @RISK was instrumental in his study: “@RISK has been a great way to communicate uncertainty in complex systems, and its ease of use with Excel spreadsheets makes it a highly intuitive and powerful tool.”
Professor Katsuaki Sugiura of the University of Tokyo uses Monte Carlo simulation in @RISK to improve the bovine spongiform encephalopathy (BSE) surveillance program in Japan, and with it the safety of the food supply.
Professor Katsuaki Sugiura at the Laboratory of Global Animal Resource Science at the Graduate School of Agriculture and Life Sciences, the University of Tokyo, has used Palisade’s @RISK software in his research since 1995. He has used the software to assess risks in the import of animals and livestock products, and in food safety risk assessment. Many researchers in the specialized field of veterinary epidemiology also use @RISK, making it easy to utilize in joint research activities.
His current research is on bovine spongiform encephalopathy (BSE) – a progressive and fatal nervous disease found mainly in adult dairy cattle. The cause of BSE is oral exposure to an abnormal form of prion protein. BSE is particularly worrisome because it is transmitted through meat-and-bone meal (MBM), which is derived from unwanted animal slaughter products and fallen stock that are cooked to drive off water as steam. The disease is characterized by long incubation periods (2–8 years, with a 5-year average).
The first case of BSE in Japan was confirmed in September 2001, and a number of measures were taken to protect animal health as well as public health. One of these measures was testing all cattle slaughtered for human consumption for BSE, beginning in October 2001. From April 2004, all fallen stock (cattle that died on farms or during transport) older than 24 months were also tested. As a result, through the end of 2012, 36 cows were diagnosed with BSE out of a total of 14 million head of cattle slaughtered for human consumption and 910,000 fallen-stock carcasses tested. The currently available diagnostic tests work by detecting the abnormal prion protein, which exists alongside normal prion protein in the brainstems of BSE-infected cattle. Detection relies on proteinase K, an enzyme that digests normal prions but leaves abnormal prions intact. But this diagnostic has limits. Abnormal prion proteins accumulate in the brain only toward the end of the incubation period; in other words, an infected cow cannot be detected until shortly before the onset of the disease. The test cannot detect infected cattle that are slaughtered, or that die from other causes, before the end of the incubation period. And since the incubation period is long and varies between 2 and 8 years, the age of clinical onset is not fixed, while the age at which cattle die or are slaughtered also varies.
In Japan, all cattle slaughtered for human consumption are tested for BSE, as are all cattle that died at over 24 months of age. However, because of the variability in the age at slaughter or death, the variable duration of the incubation period, and the limited detection capability of the diagnostic test, Professor Sugiura uses Monte Carlo simulation in @RISK to improve the surveillance program. He builds stochastic models that predict how changing the minimum testing age will affect the number of cattle tested and the number of BSE-infected cattle detected.
@RISK uses Monte Carlo simulation in Microsoft Excel to perform risk analysis, applying mathematical techniques that help users better understand risks through quantitative analysis and improve their decision-making. The software calculates the probability and impact of the various possible outcomes, allowing users to grasp the risk and likelihood associated with each scenario.
The thinking behind the BSE testing model was as follows:
Four surveillance strategies were explored for cattle slaughtered for human consumption, with the minimum age at testing set at 0, 21, 31, or 41 months. Three surveillance strategies were explored for fallen stock, with the minimum age at testing set at 24, 31, or 41 months. Increasing the minimum age of testing from 0 to 21 months for both dairy cattle and Wagyu beef cattle had very little impact on the probability that a BSE-infected animal slaughtered for human consumption would be detected. Although increasing the minimum age at testing from 21 to 31 or 41 months would lead to fewer slaughtered animals being tested, the impact on the probability of detecting infected animals would be insignificant. The probability of infected Wagyu-Holstein crosses and Holstein steers being detected at slaughter or as fallen stock would be very low under all surveillance strategies.
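The logic behind comparing minimum testing ages can be illustrated with a toy simulation in Python. This is not Professor Sugiura’s actual model: the distributions for infection age, incubation period, and slaughter age, and the assumption that an infected animal is detectable only in the final six months before clinical onset, are all illustrative stand-ins for the study’s inputs.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000  # simulated BSE-infected animals

infection_age = rng.uniform(0, 12, size=N)           # age at infection (months)
incubation = rng.uniform(24, 96, size=N)             # 2-8 year incubation (months)
onset_age = infection_age + incubation               # age at clinical onset
slaughter_age = rng.normal(60, 20, size=N).clip(1)   # age at slaughter (months)
detect_window = 6.0  # assumed: detectable only in the last 6 months before onset

results = {}
for min_test_age in (0, 21, 31, 41):
    tested = slaughter_age >= min_test_age                   # meets testing threshold
    detectable = slaughter_age >= onset_age - detect_window  # prions accumulated
    results[min_test_age] = float(np.mean(tested & detectable))
    print(f"minimum testing age {min_test_age:>2} months: "
          f"P(detection) = {results[min_test_age]:.3f}")
```

Because detectable animals are necessarily near the end of a multi-year incubation period, they are already well past 21 months old, so raising the minimum testing age from 0 to 21 months excludes almost no detectable animals – the qualitative pattern the study reports.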
Professor Sugiura said about @RISK, “The key point is that without having to learn any new programming language we are able to construct models right in Microsoft Excel and process them visually.” The insights provided by @RISK in Professor Sugiura’s work enable researchers to set aside testing age as an important factor so they can focus on other, more effective measures.
The Shanghai FDA has been using @RISK for monitoring contaminants as well as food safety risk assessment, focusing on food service and the circulation of foodstuffs within the city. As a part of the civil administration, analyzing and regulating potential risks, especially in the food supply, are top concerns that could potentially affect over 20 million residents.
The Shanghai Food and Drug Administration is the department responsible for monitoring the safety of the production, circulation and consumption of food, drugs, health supplements and cosmetics for the City of Shanghai.
The Shanghai FDA has been using Palisade’s @RISK software for risk and decision analysis since 2007 for monitoring contaminants as well as food safety risk assessment, focusing on food service and the circulation of foodstuffs within the city. Tian Mingsheng, Inspector and Chief Physician of the Shanghai FDA, explains how @RISK has benefited Shanghai’s food supply.
When the Shanghai FDA needs to carry out testing and analyses of poisonous or otherwise dangerous biological and chemical contaminants, exposure assessments must be carried out for both chemical and biological contamination. The large amounts of complex data generated in the process must undergo quantitative scientific analyses and comparisons using specialized software.
The Administration also uses dietary surveys to gather data on residents’ expenditure on various foods, which requires powerful, specialized analysis software – such as @RISK. Since its official adoption in 2007, @RISK has played an indispensable role in the efficient development of food safety monitoring at the Administration, helping guarantee the safety of the food Shanghai residents consume.
Before @RISK, the main challenge for the Shanghai FDA was its inability to carry out probability simulations in the monitoring and evaluation of food safety risks. The Administration relied on mean averages to calculate average risk exposure, a method that does not allow accurate, quantitative exposure assessments for sensitive groups or highly exposed populations, and leaves no way to describe how a risk forms and develops over time. As a result, formulating effective countermeasures and strategies was difficult: highly exposed groups and people with high sensitivity to particular substances or pollutants were left out of the analysis, the Administration could not gain accurate information about how the health of these populations was affected, and effective supervision measures could not be formulated.
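A small numerical illustration of this limitation (with purely illustrative figures, not the Shanghai FDA’s data): when exposure across a population is right-skewed, the mean alone says nothing about the highly exposed minority, whereas a simulated distribution exposes the upper tail directly.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated daily contaminant intake across a population (mg/day):
# right-skewed, so a minority ingests far more than the average person
exposure = rng.lognormal(mean=0.0, sigma=0.8, size=100_000)

mean_exposure = exposure.mean()
p95 = np.percentile(exposure, 95)
p99 = np.percentile(exposure, 99)

print(f"mean intake:     {mean_exposure:.2f} mg/day")
print(f"95th percentile: {p95:.2f} mg/day")
print(f"99th percentile: {p99:.2f} mg/day")
# The upper percentiles sit well above the mean, so a mean-only
# assessment understates the risk to highly exposed residents.
```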
The Shanghai FDA took steps to counter the issue, but was unable to obtain precise, broad data and conclusions. There was therefore an urgent need for effective risk evaluation and assessment software that could carry out precise, broad probability assessments, produce accurate exposure assessment results and create complete late-stage model collections for use in scientific decision analysis.
The Monte Carlo method and probability assessments have been in use since the 1990s, but in 2005, use of the method in China was still in its early stages. When the Shanghai FDA began risk monitoring that year, the first problem it faced was how to handle the large amounts of data being generated, match contaminants with real food contamination data and carry out probability assessments. In 2007, the WHO and FAO used @RISK software to construct risk models in a number of risk assessment reports.
Consequently, in 2007, the Shanghai FDA took the lead in becoming the first organization in Mainland China to purchase @RISK 5.0. It obtained excellent results with a nitrite contamination risk assessment for cooked meats, cadmium contamination exposure assessments, a vomitoxin risk assessment for wheat products and a risk assessment for Bacillus cereus contamination in rice, completely overcoming its previous setbacks.
After many achievements with @RISK, the Shanghai FDA upgraded to the DecisionTools Suite 5.7 after its release by Palisade in 2012, gaining more powerful functionality to continue its risk and decision analyses of microbial contamination. The Administration carried out analyses of Vibrio parahaemolyticus contamination in seafood and Listeria contamination in refrigerated ready-to-eat meals, with excellent results that supported effective, much-needed risk management recommendations.
In 2008, the Shanghai Food and Drug Administration used @RISK to carry out an exposure analysis of nitrite in cooked meat, a classic example of how @RISK can be used in risk and decision analysis. All eyes were on Shanghai as it prepared to host the 41st World Expo in May 2010, and the government was very concerned about maintaining food safety during the event. In 2008, the Shanghai FDA began data monitoring and quantitative analysis of nitrite contamination in cooked meat, making random checks on 370 meat products and finding 16 that exceeded standards, a rate of approximately 4%. Researchers carried out curve fitting on the data and found that the contamination followed an exponential distribution.
They then conducted a dietary survey to determine how much cooked meat Shanghai residents ate per day, finding a left-truncated normal distribution. On the basis of this initial data, they used a beta distribution to represent the probability of consuming nitrite in excess of standards under normal consumption habits. The researchers then used @RISK to simulate the sample 10,000 times, multiplying the three variables to approximate real-life situations, and thus obtained the dose of nitrite a typical Shanghai resident could ingest by eating cooked meats.
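The core of this calculation can be sketched in NumPy. This is a simplified illustration, not the Shanghai FDA’s fitted model: the exponential scale, the consumption parameters, and the ADI figure are all illustrative assumptions, and the beta-distributed exceedance factor is omitted to keep the example minimal.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 10_000  # iterations, matching the simulation size in the article

# Nitrite concentration in cooked meat (mg/kg): exponential fit
concentration = rng.exponential(scale=12.0, size=N)

# Daily cooked-meat consumption (kg/day): normal, left-truncated at zero
# (simple rejection-and-resample stands in for a true truncated normal)
raw = rng.normal(0.10, 0.05, size=N)
consumption = rng.choice(raw[raw > 0], size=N)

intake = concentration * consumption  # mg nitrite per person per day

adi_mg = 4.2  # illustrative ADI for a 60 kg adult at 0.07 mg/kg bw/day
p_exceed = float(np.mean(intake > adi_mg))
print(f"P(daily intake exceeds the ADI) = {p_exceed:.3f}")
```

Multiplying the random draws iteration by iteration, rather than multiplying the means, is what lets the simulation capture the unlucky combinations – a heavily contaminated product eaten by a heavy consumer – that drive the exceedance probability.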
The Shanghai FDA discovered that there was a possibility of exceeding the 0.3 g/dose threshold for acute nitrite poisoning, as well as a possibility of exceeding the allowable daily intake (ADI). With the results of these exposure assessments, the Shanghai FDA was able to propose appropriate management countermeasures: for example, forbidding businesses in the food service industry from using nitrite at all, so that improper use or other factors could no longer push quantities in cooked meats above standards, thereby eliminating the possibility of nitrite poisoning at its root.
The Shanghai Food and Drug Administration is very satisfied with Palisade’s @RISK software for risk and decision analysis, finding its powerful, scientific risk analysis functionality, and especially the DecisionTools Suite, extremely valuable. Where risk assessment and problem-solving once relied solely on the senses and subjective experience, @RISK has allowed the Shanghai FDA to leave behind its old-fashioned assessment and evaluation methods and to find patterns even at the micro level – for example, patterns and risks in chemical and biological contamination – with important implications for maintaining food safety for Shanghai residents.
The article highlights the fact that the U.S. Food and Drug Administration has launched an interactive web-based tool called iRISK to combat the seemingly endless risks in the "farm-to-table" pathway. Like @RISK, this tool uses Monte Carlo simulation to analyze potential food contamination risk based on a number of factors: the food(s), the hazard(s), the population of concern (for instance, the elderly or immune-compromised), the production or processing system used for the food, consumption patterns, the dose response (what level of exposure will have a health impact), and how the health effects are to be calculated.