Extreme Outliers - The Black Swan Problem

Explore the hidden risks in finance and learn how even elegant theories can lead to ruin.

Elegant finance theories, like the Black-Scholes-Merton model or Modern Portfolio Theory, may win you prestigious awards, but their application could well ruin you. The belief underlying these theories was at the heart of Long Term Capital Management's failure in 1998. You'd think we'd know better by now, but the same underlying belief played a key role in the 2008 Global Financial Crisis as well as the failure of Silicon Valley Bank in March 2023. What happened, and why haven't we learnt from it? Read on to find out.

Flawed Assumption

We are judged not on our awareness of the limits of our knowledge, but on the eloquence with which we explain - and attempt to predict - events. No one ever won a Nobel Prize for admitting to not knowing something. People like to make sense of the world, and uncertainties are seen as inconveniences blocking the path to understanding. This temptation to construct stories to explain things - or "rationales", to be eloquent - is known as the narrative fallacy. It often leads to overly simplified ideas that are potentially catastrophic for two reasons: 1) implicit assumptions about how the world works get overlooked as these stories spread, and 2) the stories create an illusion of understanding, causing the gap between what 'we think we know' and what 'we actually know' to become dangerously wide.

One such story was what the founders of Long Term Capital Management (LTCM; a hedge fund) told themselves, their investors, and anyone who's ever studied finance. Their story, written by Nobel Laureates Myron Scholes and Robert Merton (yes, the namesakes of the model mentioned earlier), was better than anything I could ever write.

LTCM assembled what was arguably the 'smartest' group of people available; if this were a Marvel movie, their team would be the 'Avengers'. All signs pointed towards success, and initially that's exactly what happened: LTCM's returns were the envy of Wall Street, at ~40% annualized in 1995 and 1996. Riding this wave of success, LTCM's founders and employees reinvested all their bonuses into the fund. Life was good, but then it wasn't. The models LTCM based their investments on were fundamentally flawed (they still are, yet they continue to be used; go figure!).

The flaw, the same underlying belief I mentioned at the start of this essay, was an attempt to explain a complex phenomenon - in this case, asset returns - by conveniently painting a picture of Gaussian ('normal') probability distributions.

Music Stops

Contrary to its name, a "normal" distribution is not always the norm in the real world; it mostly exists in theory. A Gaussian distribution implies that most observations cluster around the mean and that the odds of a deviation decline exponentially as you move away from it, making outliers so unlikely they can be safely ignored. Height, weight, and mortality rates are examples that exhibit a Gaussian distribution, but not everything does. Consider this: if you know that two people have a combined height of 14ft, you can safely predict both are around 7ft. But if I told you two people have a combined annual income of $2 million, it's quite likely one makes $100k and the other $1.9 million. Like income and wealth, financial markets have distributions with 'fat tails', where extreme events are far more likely than a Gaussian distribution would suggest. This is exactly where LTCM went wrong.
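
To put numbers on that difference, here's a minimal sketch in Python (using SciPy; the Student's t distribution with 3 degrees of freedom is just a convenient stand-in for a fat-tailed distribution, not a calibrated model of market returns):

```python
# How likely is a 4-standard-deviation drop? Gaussian vs. fat-tailed.
from scipy.stats import norm, t

gaussian_tail = norm.cdf(-4)   # P(X <= -4) under a standard normal
fat_tail = t.cdf(-4, df=3)     # P(X <= -4) under a fat-tailed Student's t

print(f"Gaussian:   {gaussian_tail:.1e}")  # ~3.2e-05, roughly 1 in 31,000
print(f"Fat-tailed: {fat_tail:.1e}")       # ~1.4e-02, roughly 1 in 70
```

Under the Gaussian assumption, the extreme move is rare enough to ignore; under fat tails, it's an everyday hazard.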

The bright minds at LTCM based their entire investment philosophy on the idea that asset prices largely hover around an expected mean (a metric they naturally calculated from historical data), and that any small deviations would always, eventually, be corrected. LTCM would bet on these deviations - spreads - narrowing until equilibrium was restored, and since the deviations were quite small and not very profitable on their own, they employed leverage (borrowed money; as high as 25x) to amplify returns. Said differently, they were picking up pennies in front of a steamroller.

LTCM concluded that a large deviation from the mean was not even remotely possible (given a Gaussian distribution), and assumed that in the unlikely event one did occur, it would mean markets were being temporarily inefficient (contrary to how they supposedly behave), so a correction would be quick and merely a matter of time. Underlying these steadfast beliefs were models built on careful analysis of historical data, but the Nobel laureates misinterpreted 'no evidence of anomalies' as 'evidence of no anomalies'. We all make mistakes, but not all of us are 25x levered when we do so. The practical implication of 25x leverage is that a mere 4% loss on assets will wipe out one hundred percent of your equity. Let that sink in...
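
Here's the arithmetic as back-of-the-envelope Python (the dollar figures are hypothetical, purely for illustration):

```python
# What 25x leverage does to a small loss (hypothetical numbers).
equity = 1_000_000           # your own capital: $1M
leverage = 25
assets = equity * leverage   # positions controlled: $25M

asset_drop = 0.04            # a mere 4% decline in asset values...
loss = assets * asset_drop   # ...is a $1M loss...
print(loss / equity)         # ...prints 1.0: 100% of your equity, gone
```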

In August 1998, the music stopped. In a shocking turn of events, Russia defaulted on its government bonds, sending shockwaves through global financial markets. Volatility spiked to unprecedented levels and liquidity dried up, making it difficult for LTCM to unwind positions at reasonable prices. LTCM was left exposed and collapsed; investors went from being 'up' 200% to facing catastrophic losses within weeks.

This was an extreme event that the award-winning, Gaussian-based models said wasn't supposed to happen, but it did. This was, to use the term coined by Nassim Nicholas Taleb, a 'Black Swan'.

Wrong Lesson

LTCM served as a cautionary tale, but very little was said or understood at the time about the nature of extreme events / Black Swans. While people came to see the limitations of financial models, they remained blind to other sources of uncertainty. In 2008, the widespread belief that housing prices would continue to rise and default rates would remain low fueled the mortgage-backed securities bubble that ultimately led to a global financial crisis. These beliefs were, again, based on careful analysis of historical data, and 'experts' misinterpreted 'no evidence of anomalies' as 'evidence of no anomalies'. Once again, people came to see the risk associated with the housing market, but remained blind to other sources of uncertainty.

In 2023, Silicon Valley Bank (SVB) faced a bank run and ultimately failed. The crisis was triggered by rising interest rates that left it facing catastrophic losses on its US Treasury bonds (existing bonds lose value when rates rise, because newly issued bonds become relatively more attractive). SVB did not effectively manage its interest rate risk because it overlooked the fact that inflation-rate distributions exhibit 'fat tails'. So when inflation started rising in 2021 and the Fed subsequently raised interest rates in response (I'll cover monetary policy in a separate essay), the tide went out on SVB. Yet again, people now see the risk associated with interest rates, but remain blind to other sources of uncertainty.
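
A quick sketch of the mechanics (the bond and the yields below are hypothetical, chosen for illustration rather than taken from SVB's actual holdings):

```python
# Price of a 10-year zero-coupon bond as yields rise (hypothetical numbers).
def zero_coupon_price(face: float, annual_yield: float, years: int) -> float:
    """Present value of a single payment of `face` due in `years` years."""
    return face / (1 + annual_yield) ** years

p_low = zero_coupon_price(100, 0.015, 10)   # bought when yields were ~1.5%
p_high = zero_coupon_price(100, 0.045, 10)  # marked after yields hit ~4.5%

print(f"{p_low:.1f} -> {p_high:.1f}")                     # ~86.2 -> ~64.4
print(f"~{1 - p_high / p_low:.0%} of market value gone")  # ~25%
```

The same fixed cash flow is simply worth less when discounted at a higher rate, and the longer the bond, the bigger the hit.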

Survival

In the aftermath of any crisis, we become more informed about the specific factors that were overlooked. But this new knowledge has its limits: it mostly protects us against a repeat, and we're unlikely to be fooled twice by the same source of uncertainty anyway. Once known, a Black Swan ceases to be a Black Swan. The greater lesson is learning to deal with uncertainties beyond those we know to plan for.

Black Swans cannot be predicted and attempting to calculate their precise probabilities is a futile effort. Instead, base your decisions on the consequences you are exposed to under any scenario. Mitigate the consequences that can ruin you. Always, without exception.

Don't cross a river if it's, on average, four feet deep.


This essay was inspired by Nassim Nicholas Taleb's 'Incerto' book series.

Disclaimer: All content is for informational purposes only and should not be construed as legal, tax, investment, or financial advice. All opinions expressed are strictly my own and do not represent those of any company or third party.
