The Federal Home Loan Bank of Indianapolis (FHLBI, a U.S. government-sponsored bank) is a regional wholesale bank that has served approximately 400 Indiana and Michigan financial institutions since 1932. Its core mission is to provide a reliable source of liquidity to member banks, credit unions, community development financial institutions, and insurance companies to support housing finance, asset-liability management, and community lending.

### Problem

In serving other financial institutions, the FHLBI must quantify credit risk in loan portfolios where collateral is held to mitigate potential losses, and quantify this risk relative to market/interest rate risk. "In general, default risk can increase quickly and with few warning signals," says Brendan McGrath, Director of Credit Risk Analysis in the Enterprise Risk Management department at FHLBI. "Default risks of similar entities, such as banks and other financial institutions, can also be highly correlated as they are exposed to similar assets and macroeconomic risks." To add to the complexity, many of these borrowers are also lenders themselves and post similar types of collateral. Thus, FHLBI faced a considerable challenge in modeling the expected losses from such portfolios.

### Modeling using Monte Carlo Simulation, along with Correlations, Sensitivity Analysis, and Optimization

To deal with this dilemma, McGrath turned to @RISK to run a Monte Carlo simulation that would not only provide a quantified loss distribution, but also give insights into the drivers of those risks.

The basic credit model is Probability of Default x Loss Given Default x Dollar Exposure at Default (PD x LGD x EAD), or “PD model” for short. “What @RISK does is allow you to put that framework into a Monte Carlo simulation and account for the effects of default and collateral value correlation,” explains McGrath. Monte Carlo simulation is essential for portfolio models because it can take correlations into account and quantify the actual impact of a loss, not just calculate a theoretical ‘average.’ “For example, if I lend $100 to someone that has a 5% chance of defaulting, my ‘expected’ loss is $5, but in reality (assuming no recovery) I either lose $100 or nothing. So a distribution of projected losses tells you much more than an average or expected loss number,” says McGrath.
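McGrath's $100 loan example can be sketched in a few lines of Python. This is an illustrative simulation only (not FHLBI's model): the individual outcomes are always $0 or $100, while the average of many trials converges toward the $5 "expected" loss.

```python
import random

# Hypothetical single-loan example: a $100 exposure with a 5% chance of
# default and no recovery, as in the quote above.
EXPOSURE = 100.0
PD = 0.05
N_TRIALS = 100_000

random.seed(42)
losses = [EXPOSURE if random.random() < PD else 0.0 for _ in range(N_TRIALS)]

expected_loss = sum(losses) / N_TRIALS   # converges toward PD * EXPOSURE = $5
print(f"Average simulated loss: ${expected_loss:.2f}")
# But every individual outcome is either $0 or $100 -- the 'average' never occurs.
print(sorted(set(losses)))
```

The point of the distribution view is exactly this gap between the average and the outcomes that can actually happen.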

“Correlation allows you to account for when there are two defaults at the same time which are not necessarily random,” says McGrath. “@RISK also allows you to model relationships among collateral types. For instance, Treasuries tend to go up and down together, while the values of interest-only and principal-only (IO and PO) bonds generally move opposite of each other. Without correlations, these relationships are ignored.”

In his modeling, McGrath used Bernoulli trials where the probability of a 1 equals the probability of a default within a one-year time horizon. The Bernoulli trials are then correlated using default or asset correlations (depending on the default model) to simulate the probability of a joint default between two counterparties within a portfolio.
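One common way to correlate Bernoulli default trials, which the sketch below uses for illustration, is a Gaussian copula: draw correlated standard normals and record a default whenever a draw falls below the threshold implied by the PD. The PDs and correlation here are invented, and this is not FHLBI's actual implementation.

```python
import math
import random
from statistics import NormalDist

PD = 0.05    # assumed one-year probability of default for each borrower
RHO = 0.4    # assumed asset correlation between the two borrowers
N = 200_000

threshold = NormalDist().inv_cdf(PD)   # normal quantile matching the PD
random.seed(1)

defaults_a = defaults_b = joint = 0
for _ in range(N):
    x1, x2 = random.gauss(0, 1), random.gauss(0, 1)
    z1 = x1
    z2 = RHO * x1 + math.sqrt(1 - RHO**2) * x2   # normal correlated with z1
    d1, d2 = z1 < threshold, z2 < threshold       # correlated Bernoulli trials
    defaults_a += d1
    defaults_b += d2
    joint += d1 and d2

print(f"marginal default rates: {defaults_a/N:.3f}, {defaults_b/N:.3f}")
print(f"joint default rate: {joint/N:.4f} vs {PD*PD:.4f} if independent")
```

The marginal default rates stay at the specified PD, but positive correlation makes joint defaults substantially more frequent than the independent case, which is the effect McGrath describes.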

In his models, McGrath set gross dollar loss to be equivalent to dollar exposure at default (EAD). Gross losses in the portfolio are netted against the value of collateral held, which is itself simulated over a given liquidation timeline.

“Since we accept a variety of collateral types, we fit the return distributions and correlation of collateral within @RISK using historical data,” McGrath explains. “Where data doesn’t exist, it is pegged against a proxy index or some other similar measure.”

The basic structure of the model is: if you have N borrowers, you will have N individual default distributions (Bernoulli) linked to an NxN correlation matrix. Then if you take Y different collateral types, you will have Y distributions of returns, nearly all fitted using historical return data, also correlated via a YxY correlation matrix (also fitted using @RISK). The defaults and the liquidation value of any collateral posted are then simulated via Monte Carlo simulation, which generates the loss distribution, VaR metrics, and other related data.
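The structure above can be sketched end to end for a toy portfolio. All parameters (PDs, exposures, collateral values, correlation, return volatility) are invented, a single-factor model stands in for the full NxN matrix, and the YxY collateral-return correlations are omitted for brevity, so this is an illustration of the shape of the model rather than FHLBI's implementation.

```python
import math
import random
from statistics import NormalDist

random.seed(7)
inv_cdf = NormalDist().inv_cdf

# Toy 3-borrower portfolio (all figures hypothetical)
pds    = [0.02, 0.05, 0.03]       # one-year PDs per borrower
ead    = [100.0, 80.0, 120.0]     # dollar exposure at default
collat = [110.0, 70.0, 130.0]     # market value of posted collateral
RHO = 0.3                          # single-factor default correlation
RET_MU, RET_SD = 0.0, 0.10         # collateral return over the liquidation window

def simulate_once():
    f = random.gauss(0, 1)                       # common systematic factor
    net_loss = 0.0
    for pd, e, c in zip(pds, ead, collat):
        z = math.sqrt(RHO) * f + math.sqrt(1 - RHO) * random.gauss(0, 1)
        if z < inv_cdf(pd):                      # correlated Bernoulli default
            # Gross loss = EAD, netted against simulated liquidation value
            value = c * (1 + random.gauss(RET_MU, RET_SD))
            net_loss += max(e - value, 0.0)
    return net_loss

losses = sorted(simulate_once() for _ in range(50_000))
var_99 = losses[int(0.99 * len(losses))]         # 99% Value-at-Risk
print(f"mean net loss: {sum(losses)/len(losses):.2f}, 99% VaR: {var_99:.2f}")
```

Sorting the simulated losses gives the loss distribution directly, and reading off a high percentile yields the VaR metric mentioned above.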

One limitation in McGrath’s model is that while defaults are explicitly correlated and collateral values are explicitly correlated, the correlation between a default and a change in the market value of a defaulted entity’s collateral is not explicitly modeled. “This is a common limitation and is usually approximated via stress-testing or stress-scenarios within the model itself,” McGrath explains. “The upside is that the @RISK framework gives you robust tools to do sensitivity analysis and optimization, so our model gives us the ability to do a variety of stress testing and what-if analysis on the portfolio.”

### Several Key Outputs Realized through Modeling with @RISK

After McGrath used @RISK to examine the potential for defaults, joint defaults, change in collateral value, and collateral value correlations, the powerful Monte Carlo simulation software was able to give him several key outputs, including:

- Loss distributions, which in turn allow the calculation of a Value-at-Risk (VaR) number
- Individual and group drivers of default risk
- Collateral value distributions
- Gross and net (of recoveries) loss distributions
- Tornado graphs, which enable sensitivity analysis

For McGrath, the key benefit of @RISK is "the ability to model correlations as well as fitting distributions and correlations. We can build a framework for a credit portfolio model, and @RISK allows us to do this simply and effectively."