“The Money Formula: Dodgy Finance, Pseudo Science, and How Mathematicians Took over the Markets”, by Paul Wilmott and David Orrell (2017)

Introduction

Wilmott and Orrell’s aim in this book is to explain 

  • how quantitative analysis works,
  • where it is fit for purpose,
  • but more importantly where it is not.

The first five chapters cover the history of quantitative finance, explaining key principles such as risk analysis, bond pricing, and portfolio insurance. The second part is about the quantitative finance industry today, and how it is evolving.

The overarching theme of the book is to show how the traditional quantitative finance techniques that the financial system has increasingly come to rely on

  • failed during the crisis;
  • have yet to be properly reinvented; and
  • continue to put the financial system at risk.

According to Wilmott and Orrell, a big part of the problem is that quants

  • treat the economy as if it obeys mechanistic Newtonian laws, and
  • have no feel for the chaos, irrationality, and violent disequilibrium to which markets often seem prone.

There are plenty of critiques of modelling and quantitative finance by outsiders throwing rocks, but this book is distinguished by the fact that Wilmott is a quant and brings an insider’s technical knowledge to the question of what it can and can’t do. Consequently, the book offers a more nuanced perspective on the strengths and limitations of quantitative finance, as opposed to the “let’s scrap the whole thing” school of thought.

The notes below are my attempt to summarise the book. Hopefully other readers may find the summary useful but, as with all book reviews on this website, its primary purpose is to help me understand what the book has to say. I have added emphasis in parts of the summary in bold text and my own comments or observations in italics. I am not a quant or financial modeller so my summary and comments should be read with that in mind.

The core idea I took away is that you should not treat the economy as if it were governed by mechanical laws. In particular, be wary of models that assume the system being modelled tends to equilibrium and ignore the role of money as an active part of the system.

Chapter 1 – Early Models

The first chapter

  • traces the development of economics, noting the extent to which it has been heavily influenced by physics; and
  • looks at the basic assumptions, such as equilibrium and rationality, that have shaped both economics and finance.

The authors use John Law and Sir Isaac Newton to represent two aspects of the relationship between mathematics and finance that have been useful when used within limits but proved to be problematic when taken too far

  • Newtonian physics inspired a school of mathematical finance to build objective, rational models to simulate markets and make predictions about their future.
  • Law represents the innovative spirits attracted to practical finance because they think they can use a system to beat the market, or to create entirely new markets and new forms of credit, which boost the money supply, at least for a while.

One of the themes of the book is the tension between these two aspects of mathematical finance: on the one hand driving its inventiveness and creativity, on the other its tendency toward self-destruction.

The Systems of Nature and Rational Mechanics

Newton was obviously not a financial modeller (though he did serve for a time as Warden of the Mint in England) but Wilmott and Orrell argue that the ideas he developed in physics “… probably did more to shape the world of mathematical finance than any other scientist”.

Economists found inspiration in physics for the problems they were grappling with.  Wilmott and Orrell cite Adam Smith as one influential economist whose ideas were influenced by Newton, perhaps most famously in the idea of the “invisible hand” under which each individual is “led by an invisible hand to promote an end which was no part of his intention.”

Paul Samuelson is credited with popularising the idea in his best-selling textbook (Economics), but they also argue that Smith’s work was influential on the USA at the time of its formation and remains so today. Economist George Akerlof is quoted as describing the “central ideology” of the United States as conforming to “the fundamental view of Adam Smith,” which even today “drives huge amounts of policy”.

“According to this picture, the market is made up of firms and individuals acting to further their self-interest by buying and selling…The invisible hand is the market version of gravity… This view of society as a collection of atomistic individuals, each pursuing their economic self-interest, was modeled directly after Newton’s view of nature as a mechanistic, law-bound system.”

“Any model is a simplification of reality, and the neoclassical economists had to make some rather sweeping assumptions in order to make progress. The most basic of these was that people act to optimize their own utility….Thus was born the notion of homo economicus, or rational economic man. While these assumptions had obvious flaws … they did allow economists to construct elegant mathematical models of the economy.”

Comment: The idea of an “invisible hand” offers a powerful and elegant justification for anyone who wants to believe that markets produce optimal economic outcomes. Note also the role of ideas and narratives not only in explaining what is happening but also in shaping how the economy evolves.

Finding Equilibrium

The physics analogy provided a model for economists but did require some tweaking

  • physical quantities could be measured in well-defined units, while the economic concept of “utility” was rather vague and no one knew what its units were
  • while the atoms of physics are believed to have the same properties everywhere in the universe, people (the atoms of the economy) show a high degree of variability.

The first problem was addressed by assuming that utility could be inferred from market prices while the variability of individual behaviour was argued to average out over large populations such that a “representative” agent could capture the behaviours that mattered. The computational challenges were further simplified by assuming that prices were at equilibrium, which made sense if you assumed that people tended on average to interact with markets in a rational way.

Intrinsic Value

Concepts such as rationality and equilibrium remained at the heart of economic theory as it developed over the 20th century but did face a series of challenges. These models were elegant and computationally convenient but proved to be no better than random guessing if the objective was to predict economic outcomes. This, in Wilmott and Orrell’s narrative, is where Eugene Fama’s idea of market efficiency enters the picture to explain why economists were doing such a poor job of predicting the future.

“His efficient market hypothesis portrayed the market as a swarm of “rational profit maximizers” who drive the price of any security to its “intrinsic value.” It was therefore impossible to beat or out-predict the market, because any information would already be priced in. The invisible hand of the market was the epitome of rationality.”

Wilmott and Orrell’s other key point is that the assumptions of neoclassical economists had a dual nature.

“On the one hand, they were designed to make the economy mathematically tractable.”

“On the other hand, they shaped the way that we see and model the economy – as a beautifully rational, stable, and efficient system – which [in turn] shaped the economy itself.”

The next chapter of the book explores how the “elegant but unrealistic assumptions and formulas” were reconciled with markets that appeared to be driven more often by “chaos than by reason”.

Comment: The idea of market efficiency was an important part of the conceptual foundation of quantitative finance as it offered an explanation for why markets were unpredictable. The traditional explanation is that randomness in prices proves the market is efficient, but Wilmott and Orrell offer a different explanation. They are not saying the market efficiency explanation is completely wrong, just that you need to understand its limitations. Perhaps more importantly, they argue that these concepts (invisible hand and efficient markets) did not just explain the economy, they started to shape the economy itself.

Chapter 2 – Going Random

“Quantitative finance is about using mathematics to understand the evolution of markets. One approach to prediction is to build deterministic Newtonian models of the system. Alternatively, one can make probabilistic models based on statistics. In practice, scientists usually use a combination of these approaches.”

Chapter 2 looks at how probability theory is applied in financial modelling. Expected value (the payout multiplied by the probability) is one of the key concepts used to model financial outcomes.

Computing expected value can be complex, but the mathematician Abraham de Moivre showed that, as the number of iterations becomes very large, the results converge on the so-called normal distribution, or bell curve. This is computationally useful in modelling random processes because the distribution can be specified by two numbers: the mean (or average) and the standard deviation. The caveat is that a number of conditions must be met. In particular, the separate processes have to be independent of one another and identically distributed.
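
Comment: A minimal sketch in Python (my own illustration, not anything from the book) of de Moivre’s observation: averages of many independent, identically distributed outcomes settle into a bell curve described by just a mean and a standard deviation. The number of flips and samples are arbitrary choices.

    import random
    import statistics

    def mean_payout(n_flips: int) -> float:
        """Average payout of n independent fair coin flips paying 0 or 1."""
        return sum(random.randint(0, 1) for _ in range(n_flips)) / n_flips

    # Each sample is itself an average over 500 independent, identical bets.
    samples = [mean_payout(500) for _ in range(5000)]
    print(f"mean  ~ {statistics.mean(samples):.4f}")   # converges on 0.5
    print(f"stdev ~ {statistics.stdev(samples):.4f}")  # shrinks as n_flips grows

    # The caveat in the text is the crucial part: drop independence or the
    # identical-distribution assumption and this convergence is not guaranteed.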

“The normal distribution is perhaps the closest the field of statistics comes to a Newtonian formula. The equation is simple and elegant, there are only two parameters that need to be measured (the mean and the standard deviation), and it can be applied to a wide range of phenomena [including] perhaps its greatest application in mastering, or appearing to master, the chaos of the markets.”

Theory of Speculation meets Efficient Markets

“The desire to bring order out of chaos, and to see the hidden pattern in the noise, is basic to human nature… even chaos theory is not so much about chaos as about showing that what appears to be wild and unruly behavior can actually be explained by a simple equation”

Chapter 2 considers the different assumptions about the world embedded in Louis Bachelier’s “Theory of Speculation” and Fama’s Efficient Market Hypothesis. Bachelier had implied that not only was news random, but so was the reaction of investors, with “events, current or expected, often bearing no apparent relation to price variation.” Fama, in contrast, argued that “… an efficient market always reacts in the appropriate way to external shocks; since if this were not the case, then a rational investor would be able to see that the market was over or under-reacting, and profit from the situation. The market’s collective wisdom emerged automatically from the actions of rational investors.”

Wilmott and Orrell challenge Fama’s EMH explanation arguing that …

“While the EMH is consistent with markets being unpredictable, you cannot conclude from the unpredictability of markets that they are efficient.”

“A simpler explanation is that the system is driven by complex dynamics that resist numerical prediction.”

Comment: I think this is a useful perspective.

Irrational Markets

Wilmott and Orrell cite behavioral economics as providing evidence that investment decisions are based on many factors which have little to do with rationality.

“… one could argue with the neoclassical economists that these peculiarities come out in the wash for a large number of investors. But in a situation like the markets, where investors are influencing and reacting to one another, the opposite is probably true.”

Comment: Another useful challenge to the conventional explanation based on market efficiency.

Not Normal

Wilmott and Orrell argue that the assumption that markets seek equilibrium is also problematic, noting that price changes do not follow a normal distribution, as in the random walk model, but are better modeled by an equally ubiquitous formula known as a power-law distribution:

“The normal distribution is symmetric and has a precise average, which defines a sense of scale. A power-law distribution, in contrast, is asymmetric and scale free, in the sense that there is no typical or normal representative.”

“The power-law distribution is a signature of systems that are operating … far from equilibrium, in the sense that a small perturbation can set off a cascade of events …”

“The normal distribution was derived for cases where processes are random and independent from one another, but markets are made up of highly connected people all reacting to one another. The idea that markets are inherently stable is therefore highly misleading, and … has led to many problems in quantitative finance”
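
Comment: To make the fat-tail point concrete, here is a small Python sketch (my own, with an invented power-law exponent) comparing the tail probabilities of a standard normal distribution with those of a power-law (Pareto) distribution.

    import math

    def normal_tail(x: float) -> float:
        """P(X > x) for a standard normal variable."""
        return 0.5 * math.erfc(x / math.sqrt(2))

    def pareto_tail(x: float, x_min: float = 1.0, alpha: float = 3.0) -> float:
        """P(X > x) for a Pareto (power-law) variable with exponent alpha."""
        return (x_min / x) ** alpha if x >= x_min else 1.0

    for k in (2, 5, 10):  # "k standard deviations" out
        print(f"{k:>2}: normal {normal_tail(k):.2e}  power law {pareto_tail(k):.2e}")

The normal tail collapses super-exponentially while the power-law tail decays only polynomially, so events a normal model treats as once-in-the-age-of-the-universe become merely uncommon.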

Notwithstanding its limitations, the normal distribution continued to be widely used.

Wilmott and Orrell conclude this chapter with a consideration of why these models continue to be used so widely when the problems with them are so well understood.

One explanation is that the status quo is easier to work with:

“One reason is simply that it enabled both economists and quants to continue using the standard statistical tools they were comfortable with. Power laws may be ubiquitous in nature and finance, but they lack the symmetry, ease of calibration, and mathematical usefulness of the normal distribution.”

Or that we have nothing better

“… as economist Myron Scholes put it, “To say something’s failed, you have to have something to replace it, and so far we don’t have a new paradigm to replace efficient markets”.

“no one has a perfect model of the economy and so the theory remains in place. Like some kind of mental virus, it has found a way to disable the usual processes that would get rid of it”

Or behavioural finance

“… it is typical human behavior to deny there is a problem, cling to the illusion of validity, maintain the status quo, and thus avoid loss.”

Chapter 3 – Risk Management

“In the 1950s, economists began to apply probability theory to the problem of asset allocation, and showed how to put numbers on concepts such as risk and reward”

“The seeds were being sown for a dramatic shift from finance as art to finance as science – or at least, something that looked a lot like science. But can risk and reward be reduced to hard numbers?”

There are several ways in which to analyze stocks, the three most important being

  • fundamental analysis,
  • technical analysis, and
  • quantitative analysis.

Wilmott and Orrell consider each in turn but I will focus on their discussion of quantitative analysis.

Quant Analysis

Wilmott and Orrell credit Harry Markowitz and his modern portfolio theory (published in 1952) for bringing a quantitative risk and reward lens into finance. His great insight was to quantify share price behavior in a probabilistic sense, and relate it to the idea of risk and the trade-off between risk and reward in particular.

Markowitz expressed this trade-off by representing the behavior of an individual share over a set time horizon in terms of two parameters: the expected return and the standard deviation, both estimated using historical time series.

“Of course, we can’t guarantee that the future parameters will be the same as the historical ones. Using past data to estimate future returns looks suspiciously like a chartist using a trendline to predict the future. And there is no reason why volatility, as measured by the standard deviation, should be stable either.”

But the method offered an appealing and intuitive way of analysing which stocks to hold. Even better, Markowitz also looked at how two stocks behave together, and then at how entire portfolios of stocks behave, using the concept of correlation.

Correlation

The methodology, at face value, offers a way of choosing a portfolio that maximises expected return for a given amount of risk. This is obviously a hugely appealing idea appearing to offer a rigorous way to build an optimised and efficient portfolio; and who can resist something that is optimised and efficient?
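
Comment: A minimal Python sketch (my own, with invented numbers) of the two-asset version of Markowitz’s insight: the expected return of the portfolio is just the weighted average, but its risk depends on the correlation between the assets.

    import math

    mu_a, mu_b = 0.08, 0.12        # expected annual returns (invented)
    sigma_a, sigma_b = 0.15, 0.25  # annual volatilities (invented)
    w = 0.5                        # portfolio weight in asset A

    def portfolio_stats(rho: float) -> tuple[float, float]:
        """Expected return and volatility of the two-asset portfolio."""
        exp_return = w * mu_a + (1 - w) * mu_b
        variance = ((w * sigma_a) ** 2 + ((1 - w) * sigma_b) ** 2
                    + 2 * w * (1 - w) * rho * sigma_a * sigma_b)
        return exp_return, math.sqrt(variance)

    for rho in (1.0, 0.5, 0.0, -0.5):
        r, s = portfolio_stats(rho)
        print(f"correlation {rho:+.1f}: return {r:.1%}, volatility {s:.1%}")

The expected return is identical in every case; only the risk changes, which is why the stability of the correlation estimate matters so much.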

Where is the catch? Wilmott and Orrell call out the need to ensure the assumptions fit the problem that the methodology is being applied to

“The problem with MPT, and with all of quantitative methods in finance, is that it is only as good as the underlying assumptions. Some of these concern the basic properties of markets. Like the efficient market hypothesis, MPT assumes that investors act rationally to further their self-interest, make decisions independently, have access to similar levels of information, etc. As a result, stock prices follow a random walk, with an upward bias that corresponds to the average growth rate and daily changes that follow a normal distribution.

So far, nothing new. But in addition, MPT assumes that we can measure meaningful correlations between different securities.”

And in particular the stability of the parameters being used…

“The problem is not so much the number of parameters that need to be measured, because the method is relatively simple. No, it’s more a question of the stability of the parameters.”

“This can particularly be a problem during a market crash, when asset price changes tend to be highly correlated because they are all falling together.”

“But perhaps the most important assumptions, whose problems go to the core of the theory, are that we can compute expected risk and reward for each stock in the first place. Astute, or skeptical, readers may have noticed that MPT asks us to input expected growth rates for individual stocks; but … it is not possible to accurately predict expected returns using either fundamental or technical analysis. This empirical fact was one of the main justifications for efficient market theory.”

“Just as concerning is the idea that we can measure risk using the standard deviation of past price changes. When we estimate returns, we are making a prediction about the future; but when we estimate risk, we are predicting the uncertainty in our forecast … which is even more difficult. The standard deviation tells us something about past fluctuations, but there is no reason why it should remain constant.”

Wilmott and Orrell note that there are ways around the stability problem but find none of them very appealing. They also call out the fact that standard deviation treats positive and negative volatility as equivalent.

Efficiency Squared

Wilmott and Orrell have already highlighted the importance of understanding the assumptions a model is built on, but it is also important to recognise where the models start to influence the market and thereby undermine the assumptions on which the model was built. Black Monday is cited as an example where the widespread use of similar models to manage investments helped to make individual stocks more correlated. Index funds are another example.

“The success of index funds is routinely supplied as evidence that markets are efficient. But a better way to look at it is that index funds are a very good business model that acts as a kind of parasite on the financial system.”

“… has the positive effect of keeping industry management fees in check. But if every fund adopted an index approach, the system would fall apart …”

Value at Risk

“Perhaps the main contribution to come out of portfolio theory, though, was that asset managers were now quantifying their strategies, they were measuring expected returns and risk, and balancing them off as two sides of the same coin.”

“A methodology called “value at risk” (VaR) began to gain traction. This was based on the portfolio risk measurement used in MPT and gave … a single number designed to give a sense of how much a bank might be expected to lose. In its basic form VaR has two key elements. The first is a degree of confidence: 95%, say. The second is a time horizon: 1 day, say.”
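
Comment: In its basic parametric form the calculation is almost embarrassingly simple, which is part of its appeal. A sketch in Python (the portfolio value and volatility are invented) under the normal-distribution assumption the authors criticise:

    portfolio_value = 100_000_000  # a $100m book (invented)
    daily_volatility = 0.01        # 1% standard deviation of daily returns (invented)
    z_95 = 1.645                   # one-tailed 95% quantile of the standard normal

    var_1d_95 = portfolio_value * daily_volatility * z_95
    print(f"1-day 95% VaR ~ ${var_1d_95:,.0f}")

    # Read as: "on 19 days out of 20, losses should be smaller than this."
    # It says nothing about how bad the 20th day can be, which is exactly
    # the tail-risk incentive problem discussed below.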

VaR has come under a great deal of criticism. Wilmott and Orrell argue that most of the criticisms can be addressed by different mathematics, but two of them are not so easily addressed: first, it creates dangerous incentives; second, it is easily abused.

  • One of the dangerous incentives is that it encourages taking tail risk, and this kind of risk can wipe you out.
  • If the model is based on unreliable assumptions, and if the parameters are unstable, then it is easy to choose the model or the parameters to make the reported risk as low as possible or otherwise give the user (or their boss) the number they want to see.

The Edge of Chaos

“One of the main advantages of using hard numbers to measure risk is that it is supposed to make decisions scientific and objective. But clearly, if a trader can adjust his VaR calculation in order to please his boss, something strange is going on with the mathematics itself. The process looks objective, but is actually subjective.

The reason for this flexibility can be traced back to the abovementioned fact that portfolio theory is based on the idea that price changes follow a normal distribution, with a stable and easily measured standard deviation. Real price data tend to follow something closer to a power-law distribution, and are characterized by extreme events and bursts of intense volatility, which as discussed earlier are typical of complex systems that are operating at a state known as self-organized criticality.”

“Theories such as MPT or VaR fail just when you need them most, in the moments when apparent stability breaks down to reveal the powerful forces beneath. The reason is that they model the financial system in terms of random perturbations to an underlying equilibrium, and can’t handle the inherent wildness of markets, where storms can come out of nowhere. In particular … they ignore the nonlinear dynamics of money, contagion between institutions due to network effects, and the bad things that happen when credit suddenly dries up.”

The next chapter looks at how efforts to eliminate risk altogether resulted in risk “mutating into new, and even more virulent, forms”.

Chapter 4 – Market Makers

In this chapter, Wilmott and Orrell show how mathematicians developed formulas for valuing options –and in doing so completely changed the market for them.

The first part of the chapter sets out the back story behind options and how to price them. Black, Scholes and Merton are usually cited here, but Wilmott and Orrell also recognise the work done by Ed Thorp. The more important part of their narrative is the way that the development of the pricing formula for options unlocked an explosion of activity by placing option trading on firmer ground and creating new opportunities to make money.

“The formula also contained within it the promise of a perfect, automated system for making money. By dynamically hedging their bets, those who understood the Black-Scholes formula could exploit anomalies in … markets to make what appeared to be risk-free profits…. Finance now existed on a higher mathematical plane …”
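
Comment: For reference, the Black-Scholes price of a European call can be written in a few lines. This is a standard textbook sketch rather than anything specific to the book, and the inputs are illustrative.

    import math

    def norm_cdf(x: float) -> float:
        """Standard normal cumulative distribution function."""
        return 0.5 * (1 + math.erf(x / math.sqrt(2)))

    def bs_call(spot: float, strike: float, t: float, r: float, vol: float) -> float:
        """Black-Scholes price of a European call option."""
        d1 = (math.log(spot / strike) + (r + 0.5 * vol ** 2) * t) / (vol * math.sqrt(t))
        d2 = d1 - vol * math.sqrt(t)
        return spot * norm_cdf(d1) - strike * math.exp(-r * t) * norm_cdf(d2)

    print(f"{bs_call(spot=100, strike=105, t=1.0, r=0.05, vol=0.2):.2f}")

Note the single risk input, vol: everything W&O say about the formula’s limits traces back to treating that one number as known and stable.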

Option trading seemed to be a virtuous activity that eliminated risk and moved the world towards more efficient and optimal outcomes. It followed therefore that regulation should cooperate in this grand project.

In 2000, Alan Greenspan testified to Congress that this ability to hedge risk had made the financial system more robust: “I believe that the general growth in large institutions has occurred in the context of an underlying structure in markets in which many of the larger risks are dramatically – I should say fully – hedged”.

The option pricing formula seemed to make the markets more efficient

“As traders began to adopt the formula, prices converged so that it was more difficult to arbitrage between stock and option prices. A rule of finance, known as the “law of one price,” says that the price of a security, commodity, or asset will be the same anywhere once things like exchange rates and expenses are taken into consideration, since otherwise an arbitrageur can buy cheap in one place and sell in another.”

However, Wilmott and Orrell caution that

“… the fact that markets agree on one price does not necessarily mean they have converged to the right price (whatever that is) or that the price will be stable. The Black–Scholes model is an elegant equation which is useful so long as its limitations are understood; but any formula which is based on the perfect, symmetrical, stable, rational, and normal world of abstract economics, where investors can effectively make predictions about the future of a stock based on nothing more than past volatility, will never be a realistic model.”

They also argue that the disassociation from gambling was not entirely positive.

“Gamblers are aware that they are dealing with risk and can lose their stake. The idea that in finance you could even come close to eliminating risk through the use of hedging strategies, in contrast, led some firms (not Thorp’s) to a dangerous hubris.”

Chapter 5 – Deriving Derivatives

“Once the markets had a model for valuing derivatives there was no longer any excuse for not trading them. The market in options exploded. New financial instruments were created using the same kinds of mathematical model… new and increasingly complicated instruments. As the instruments got more complicated, so did the mathematical models.”

In Chapter 5, Wilmott and Orrell review how, in their words, “… the brilliant idea of hedging was stretched to breaking point and beyond.”

One of the underlying themes in this chapter is complexity. In particular, Wilmott and Orrell call out the fact that complexity usually works in favour of the person selling the complex thing because the onus is on the buyer to figure out how to get maximum value.

In seeking to understand models, they also argue that it is useful to keep in mind the three main ways that quants eliminate, or at least reduce, risk: (1) diversify, (2) delta hedge, and (3) allow for the worst case.

One obvious problem is that you can’t always delta hedge

W&O point out that delta hedging is much harder in some markets than in others, but it was easier for quants to assume (they use the word “pretend”) that it worked all the time.

“Delta hedging … is much harder in some markets than others. But this wasn’t going to stop the quants pretending that it worked.”

They acknowledge that the distinction might seem a bit esoteric but, for them, it is important because it marks …

“… possibly the first, or most important, time that quants started cheating. Or maybe let’s just call it brushing things under the carpet. The difference between the traded and the untraded is one of the key distinctions between good models and poor.” (emphasis added)

W&O note that, when the underlying is easy to hedge with, Black-Scholes leads to an equation with one unknown, but this is not the case for something like an interest-rate product, where you end up with two unknowns. The problem might be resolved if you could observe the market price of risk, but they point out the real-world difficulties of doing what seems simple and straightforward in the textbook.

“When you read the economics or finance textbooks you get the impression that the market price of risk is something objective and stable (the measure of how much compensation above the risk-free rate one requires for taking a unit of risk). But in practice the market price of risk is unstable. It’s also different for each source of risk; each stock has one, so do rates, currencies, etc. And it’s not easy to measure.” (emphasis added)

This, in W&O’s account, is where many quants took a turn to the dark side.

“It was at this point – as quants issued more and more of these complex, unhedgeable instruments – that they made a sort of collective decision to not worry. At precisely the point where they should have. The Black–Scholes model was looking pretty good for traded underlyings, and so they wanted to use it when the underlying was not traded, even though it came with some major drawbacks. It was tempting to look the other way.” (emphasis added)

Getting Carried Away

Making this compromise not to worry too much about the practicality of valuation when the underlying is not traded opens up the capacity to extend derivatives to the following:
Credit. Macro. Inflation. Property. Energy. Weather.

“We are starting to see the appeal of quant finance to the mathematician. … We have mathematical modelling, financial concepts to turn into mathematical principles. We have differential equations and free boundaries. Sometimes we have nice formulas. If we can’t find formulas then we have to do some complicated numerical analysis. And complicated can be fun.”

From the Sublime to the Ridiculous

W&O reiterate the importance of good models and what makes them good.

  • They categorise models for shares, indices, exchange rates and commodities as the better class of models because they are “robust and internally consistent”. The underlyings are traded, so there is less room for model fudging, and the quants all tend to use similar models.
  • Interest rate models are another category that W&O describe as “not great” for a variety of reasons:
    • there are many different models
    • different people use different models for the same instrument
    • inability to hedge consistently
    • the huge scale of the market in interest rate derivatives also increases the risk of a systemic disaster
    • the only mitigating factor is that volatility is generally low, which reduces the potential for any of the above to matter
  • W&O designate credit-risk models as worse because:
    • you can’t hedge
    • you don’t know how to model default; default isn’t random, governed by the roll of a die – it’s a business decision
    • there’s no data for specific companies, since bankruptcy tends to be a one-off event
    • volatility in risky businesses can be huge, and the market in credit instruments is large

In their view, credit modelling is so bad, and credit instruments so dangerous, that it is worth digging deeper into this part of the derivatives market, in particular the infamous collateralized debt obligation (CDO). The problem that W&O highlight is not the CDO itself: they view the basic technology as a good thing for dispersing risk. Their concern lies in the way that the technology was abused and the limitations of credit risk modelling ignored. In their words:

“As financial investments they are wonderful things.”

“However, from a quant finance modelling perspective these instruments are horrendous.”

The copula is called out in particular

“One of the models used to value CDOs is the “copula.” This is a mathematical idea in probability theory that helps you analyze the behavior of multiple random variables, here the random variables being default.”
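
Comment: A bare-bones Python sketch (mine, with invented default probabilities and correlation) of the one-factor Gaussian copula idea: each name defaults when a latent variable, part common market factor and part idiosyncratic noise, falls below a threshold.

    import math
    import random
    from statistics import NormalDist

    def simulate_defaults(n_names: int, p_default: float, rho: float) -> int:
        """Number of defaults in one simulated scenario."""
        threshold = NormalDist().inv_cdf(p_default)  # default barrier
        market = random.gauss(0, 1)                  # common factor
        defaults = 0
        for _ in range(n_names):
            idiosyncratic = random.gauss(0, 1)
            latent = math.sqrt(rho) * market + math.sqrt(1 - rho) * idiosyncratic
            if latent < threshold:
                defaults += 1
        return defaults

    counts = [simulate_defaults(100, 0.02, 0.3) for _ in range(10_000)]
    print(f"average defaults {sum(counts) / len(counts):.1f}, worst case {max(counts)}")

Higher correlation fattens the tail: most scenarios see a handful of defaults, but occasionally nearly the whole pool defaults together, which is exactly the behaviour that destroys the senior tranches of a CDO.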

“… with CDOs it’s not so much on any particular model that we can pin blame for the credit crisis that hit in 2008. No, it’s more a problem that there’s no model that is going to give you a value that you’d be able to sell at while giving you a mechanism for hedging risk.” (emphasis added)

The size of the market, they argue, is especially problematic, particularly since derivative pricing models played a role in facilitating that growth:

“And that’s really the dangerous part… the size of the market in these instruments.
Not being able to value or hedge doesn’t matter too much if the trades are small, but once the size gets enormous you get systematic risk that could bring down the whole system.
And the size did get enormous, and that’s because there was the illusion that these contracts could be valued and hedged, resulting in the biggest false sense of security in history. And it was the dubious quant models that played a key enabling role.”

Money Crunch

W&O next discuss the way in which the expansion of financial instruments like CDOs contributed to an expansion of the money supply:

“The net effect of financial instruments such as CDOs and CDSs was not a reduction in risk, but a huge expansion in money and credit. We tend to think of the money supply as being something that is controlled by the central bank, while in fact the vast majority of money is created by lending from private banks.”

“The reason the credit crunch of 2007 did so much damage was that it was in fact a money crunch, similar in spirit to the one which John Law unleashed on France in the 18th century, but on a global scale. Quants didn’t set out like Law to print money, but that was the emergent effect of their endeavors.”

“A particular property of money, which rivals quantum physics for its weirdness, is the way that it is real, in the sense that its appearance has real effects on people and the economy … but when conditions change it can suddenly disappear into the ether, as if it had never existed.”

Chapter 6 – What Quants Do

This chapter charts the rise of quants within the ranks of banking, the different types of quant (junior quant, model validation, quant developer, risk management, research quant, front office or desk quant, quant trader), what they get paid and what they do. This section also touches on the question of whether they add value to society that is commensurate with their pay.

Chapter 7 – The Rewrite

“This chapter looks at the process of calibration, and shows that model tuning is often as much about fixing appearances, or rewriting reality, as it is about performance.” (emphasis added)

The overall theme is that the calibration process is another area in quant modelling where the theoretical quest for accuracy that is the nominal objective can get lost for a variety of reasons.

Blowing Smoke

“Calibration is an example of what are called in mathematical circles “inverse problems”. In most physical problems, you are usually trying to figure out from a model how something might behave in the future. Weather forecasting would be a good example. But sometimes you want to go backwards.”

W&O argue that working backwards is appropriate for some problems but not for others, and finance is an area where the approach can take you in the wrong direction:

“Calibration in finance shares some of the problems of the diffusion problem. As described in Chapter 2, share prices can be modeled as diffusing in time as they are jostled around by random currents, rather like a particle of smoke.
Option prices tell you something about what traders think the smoke pattern will look like after a certain time. The Black–Scholes model relies for its accuracy on a single key parameter, the volatility, which is assumed to sum up everything you need to know about a security’s behavior. So, if the model were an accurate description of reality, then the inverse problem for any option on a single underlying would also always yield a single number.”
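
Comment: The inverse problem here is backing out the “implied volatility” from a quoted option price. A sketch in Python (mine; it repeats the Black-Scholes function sketched in the Chapter 4 notes above so the example stands alone, and the market price is invented):

    import math

    def norm_cdf(x: float) -> float:
        """Standard normal cumulative distribution function."""
        return 0.5 * (1 + math.erf(x / math.sqrt(2)))

    def bs_call(spot, strike, t, r, vol):
        """Black-Scholes European call price."""
        d1 = (math.log(spot / strike) + (r + 0.5 * vol ** 2) * t) / (vol * math.sqrt(t))
        return spot * norm_cdf(d1) - strike * math.exp(-r * t) * norm_cdf(d1 - vol * math.sqrt(t))

    def implied_vol(market_price, spot, strike, t, r):
        """Invert the model by bisection; the call price is increasing in vol."""
        lo, hi = 1e-6, 5.0
        for _ in range(100):
            mid = 0.5 * (lo + hi)
            if bs_call(spot, strike, t, r, mid) < market_price:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    print(f"{implied_vol(8.0, spot=100, strike=105, t=1.0, r=0.05):.4f}")

If the model were a true description of reality, every option on the same underlying would imply the same number. In practice the answer varies with strike and expiry (the volatility “smile”), which is precisely the authors’ point.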

Calibrating the Crystal Ball

“… let’s do a sanity check. Does it really make sense that future volatility –the amount of variability in a share price –is a function of asset price and time?”

Ultimately they conclude

“The real purpose of calibration, it seems, is to fix the appearances of the model, and provide what looks like a mathematically consistent story.”

This is where the account switches from math and models to motivation and ways of thinking

“So far it’s all been mathematics and models. Now we have to understand how the quant thinks and his motivations, not to mention the thoughts and motivations of his bosses.”

Sources of Confusion

The first issue W&O call out is the extent to which certain model parameters can be assumed to be fixed. The reality is that models need to be regularly recalibrated to realign them with the market value for volatility, which raises the question of whether a model that assumes a fixed value is a good model.

The difference between price and value is another source of confusion.

Model Risk

W&O argue that the regular recalibration of models is evidence that they are not stable and reliable

“In quantitative finance there’s always a question about the accuracy of models. This is termed “model risk.” There are many, many forms of risk, all of which the responsible bank will try to measure and if necessary reduce. However, if we are constantly recalibrating it means that we never get to see the risk in the volatility model.”

They use an analogy with engineering to explain how this is dangerous

Engineers can build systems where they can reliably predict responses provided the system components are operating within their assumed operating tolerance (e.g. an airplane’s rudder is not subject to forces that cause it to break).

In contrast

“… financial derivatives, and therefore much of the financial system, are cobbled together from components such as implied volatility, which are highly unstable and unreliable – so you can bet the whole is as well.”

Flying Blind

“It might seem that these problems are reasonably obvious, and it is true that the more sophisticated banker is aware of them … [but] … There are strong incentives to go with the flow.”

Two main justifications are commonly used.

“The first is that the method may not be perfect, but it is always possible to hedge the derivatives using exchange-traded vanillas, which mitigates the risk. This isn’t too bad a justification –as long as it’s right. Unfortunately, it’s not only hard to estimate the model risk from this sort of hedging, it’s also something that people take on faith, and they rarely try to estimate the remaining model error in practice.”

“The second, more scary but very common, justification is: what else can we do? The banker says: “We need to trade, we need a model, this is what we’ve got, there’s nothing better, we use it.” Leaving aside the question of whether there is a better model, this justification makes you wonder about the morals here.”

While the calibration problems covered thus far are of concern, W&O argue that they are really just symptoms of a larger problem:

“… we consider the fundamental cause of model risk … [to be] the category error of treating a human system as a mechanical one.”

The fundamental problem in modelling is treating a human system as a mechanical one.

Chapter 8 – No Laws, Only Toys

“For the purposes of modeling, we’ll say that a law is a relationship that has been extensively tested and can be treated as fixed and certain within a certain domain.”

The key elements of this process are

  • reproducibility,
  • prediction,
  • and simplicity.
  • Plus knowing where the model breaks down.

These kinds of laws can be found in physics, but W&O argue that quantitative finance does not have any fundamental laws. This will no doubt challenge the world view of many economists, but they offer the “law of supply and demand” to illustrate their point:

 “… the “law of supply and demand” …states that the market for a particular product has a certain supply, which tends to increase as the price goes up (more suppliers enter the market). There is also a certain demand for the product, which increases as the price goes down.”

“… while the supply and demand picture might capture a general fuzzy principle, it is far from being a law. For one thing, there is no such thing as a stable “demand” that we can measure independently –there are only transactions.”

“Also, the desire for a product is not independent of supply, or other factors, so it isn’t possible to think of supply and demand as two separate lines. Part of the attraction of luxury goods –or for that matter more basic things, such as housing –is exactly that their supply is limited. And when their price goes up, they are often perceived as more desirable, not less.” (emphasis added)

The “no-arbitrage” principle, similarly, in their view puts fuzzy bounds on the relative prices among all the instruments but does not represent a law which defines precise outcomes in the way the laws of physics do.

In lieu of laws, W&O suggest the following …

“… while there are no fixed laws in financial modeling, there are clues that can point us in the right direction. Or rather, there’s one clue. In the whole of quant finance there is only really one peg onto which we can hang our modeling hat.”

The one clue to modeling a share price is that the absolute value of the share price doesn’t matter, but its value relative to the past does matter.

“In other words, all that matters is its return.” (emphasis added)

Back to Basics

“Why is this an important clue for us modelers? Because it means that in any model we build up we should first study data for the returns, and then model these returns.” (emphasis added)

W&O construct a hypothetical example of some modellers trying to model the return on a stock price index. They note that modelling becomes easier if you can use equations, but further assumptions must then be made, including what kind of distribution best describes the data. This is where the normal distribution comes in, because it is easy to work with.

Their point is that the normal distribution gets used more often than not, and in so doing modellers forego a potentially more accurate, albeit unwieldy, model for what they label a “toy model”. This is not a disaster in their view, so long as you remember the limitations of the model and its “zone of validity”.

“Even though it’s a toy model, it contains a couple of useful ideas that can then be used throughout quantitative finance. These two ideas are just the two parameters in the normal distribution – the average, which tells you the expected return, and the standard deviation, which tells you the volatility. As we’ve seen throughout this book, these are very useful, intuitively understandable concepts. But we have to remember their zone of validity.” (emphasis added)
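
Comment: In practice the two toy-model parameters are estimated from returns, not price levels. A minimal Python sketch (mine, using a made-up price series and the conventional 252 trading days per year):

    import math
    import statistics

    prices = [100.0, 101.2, 100.8, 102.5, 101.9, 103.3, 104.0]  # invented daily closes

    # Work with returns, not price levels - the "one clue" W&O describe.
    log_returns = [math.log(p1 / p0) for p0, p1 in zip(prices, prices[1:])]

    TRADING_DAYS = 252
    annual_return = statistics.mean(log_returns) * TRADING_DAYS
    annual_vol = statistics.stdev(log_returns) * math.sqrt(TRADING_DAYS)
    print(f"expected return ~ {annual_return:.1%}, volatility ~ {annual_vol:.1%}")

    # Both numbers assume returns are independent and identically distributed:
    # the toy model's "zone of validity".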

A Model for Interest Rates?

Notwithstanding the criticisms W&O throw at the normal distribution, they accept that these kinds of models are reasonable in some applications; “so-so accurate for stocks” in their words. The problem, they argue, is that these models don’t work for interest rates.

“The only half-decent, yet still toy, models in finance are the lognormal random walk models for those instruments whose level we don’t care about. That’s equities, indices, exchange rates, commodities. This is why almost everyone is using the lognormal random walk model for these quantities, but there isn’t a standard model for interest rates, everyone uses something different.”

A Role Model

W&O highlight mathematical biology in general and a book by Jim Murray on the topic as a source for better ways to approach many of the more difficult modelling challenges in finance and economics.

They start by listing a series of phenomena in biological models that seem to be useful analogues for what happens in financial markets. They identify a number of models used in mathematical biology that are almost all “toy” models. None of these models offer precise or determined outcomes but all can be used to explain what is happening in nature and offer insights into solutions for problems like disease control, epidemics, conservation etc.

Comment: The approach they advocate seems to have a lot in common with the Agent Based Modelling approach that Haldane references (see his paper on “Tails of the Unexpected”) and that is the focus of Bookstaber’s book (“The End of Theory”).

In their words …

“Embrace the fact that the models are toy, and learn to work within any limitations.
Focus more attention on measuring and managing resulting model risk, and
less time on complicated new products.”

“… only by remaining both skeptical and agile can we learn. Keep your models simple, but remember they are just things you made up, and be ready to update them as new information comes in.”

Quantum Finance

” … while it is often said that finance models itself after physics, it is more accurate to say that it has modeled itself after Newtonian physics, which is not quite the same thing, being a little out of date.”

Comment: Andrew Haldane made a similar comparison in a paper on the rise of the normal distribution. His point was that economists embraced the probabilistic approach of quantum physics over the simple deterministic outcomes of Newtonian physics. W&O take this analogy in a slightly different direction, picking up on the way that the quantum nature of particles adds a layer of complexity to atomic interactions, resulting in physical systems having emergent properties that cannot be explained by Newtonian physics. I will add a caveat here. I am not trained in physics so I may have misunderstood this analogy. I did find it interesting however and hopefully have captured the key point.

In their own words …

“Consider … the nature of money. Standard economic definitions of money concentrate on its roles as a “medium of exchange,” a “store of value,” and a “unit of account.” Economists such as Paul Samuelson have focused in particular on the first, defining money as “anything that serves as a commonly accepted medium of exchange.” This definition is similar to John Law’s definition of money as a “Sign of Transmission.” Money is therefore not something important in itself; it is only a kind of token. The overall picture is of the economy as a giant barter system, with money acting as an inert facilitator.” (emphasis added)

“However … money is far more interesting than that, and actually harbors its own kind of lively, dualistic properties. In particular, it merges two things, number and value, which have very different properties: number lives in the abstract, virtual world of mathematics, while valued objects live in the real world. But money seems to be an active part of the system. So ignoring it misses important relationships. The tension between these contradictory aspects is what gives money its powerful and paradoxical qualities.” (Emphasis added)

“The real and the virtual become blurred, in physics or in finance. And just as Newtonian theories break down in physics, so our Newtonian approach to money breaks down in economics. In particular, one consequence is that we have tended to take debt less seriously than we should.” (emphasis added)

“Instead of facing up to the intrinsically uncertain nature of money and the economy, relaxing some of those tidy assumptions, accepting that markets have emergent properties that resist reduction to simple laws, and building a new and more realistic theory of economics, quants instead glommed on to the idea that, when a system is unpredictable, you can just switch to making probabilistic predictions.” (emphasis added)

“The efficient market hypothesis, for example, was based on the mechanical analogy that markets are stable and perturbed randomly by the actions of atomistic individuals. This led to probabilistic risk-analysis tools such as VaR. However, in reality, the “atoms” are not independent, but are closely linked … The result is the non-equilibrium behaviour … observed in real markets. Markets are unpredictable not because they are efficient, but because of a financial version of the uncertainty principle.” (emphasis added)

“… the great advantage of probabilistic predictions is that they sound authoritative, but are hard to prove wrong because to do so takes a great deal of data”

“Finance … took exactly the wrong lesson from the quantum revolution. It held on to its Newtonian, mechanistic, symmetric picture of an intrinsically stable economy guided to equilibrium by Adam Smith’s invisible hand. But it adopted the probabilistic mathematics of stochastic calculus.” (emphasis added)

I have quoted the book at length here, but I think it makes some very good points:

  • Money is not just an inert medium of exchange;
  • Economists treat it as if it were just a numerical concept, but money is also connected to the value of things, and this dual nature (number and value) matters;
  • Ignoring the role of money may also explain why debt gets less attention than it deserves;
  • Market efficiency was used to explain why markets are unpredictable, but W&O argue that there is a core element of uncertainty that must be recognised for a more complete explanation of why markets are unpredictable.

Order and Chaos

To summarise W&O’s position

  • Markets are not determined by fundamental laws, deterministic or probabilistic.
  • Instead, they are the emergent result of complex transactions.
  • This changes the way that we see financial modeling.
  • In particular, money should play a central role, similar to that of a biologically active substance.
  • One of the more obvious properties of money is that it has a profound effect on human psychology.
  • It therefore seems bizarre that economics and finance, since the time of Adam Smith, have treated money as nothing more than an inert medium of exchange.

W&O discuss various ways that you might give money this central role. Incorporating a financial sector into traditional economic models is one option, but W&O do not see this as a desirable path because the added complexity risks turning the models into black boxes.

They advocate simplicity and knowing the limitations of your model

“The key then is to keep with simple models, but make sure that the model is capturing the key dynamics of the system, and only use it within its zone of validity. Models should be seen as imperfect patches, rather than as accurate representations of the complete system. Instead of attempting … a better, more complete “theory of everything”, the aim is to find models that are useful for a particular purpose, and know when they break down.” (emphasis added)

They acknowledge the attraction of just letting computers look for patterns in the data, but also the limitations: this approach tends to generate black-box answers without the ability to understand what is driving the patterns, to test hypotheses, or to make predictions where prior data is not available.

So they propose a more humble and limited approach which is most resilient to the risk of error …

“Perhaps the best approach is to use a mix of techniques, while being aware of the advantages and disadvantages of each. The worst is to pretend that toy models are actually fundamental laws of the universe – and then bet a quadrillion dollars on them. So this is where abstract ideas about models have very real implications. If you think models are just useful approximations to the far more complex reality, you tend to be more careful about using them.” (emphasis added)

At the same time, they are somewhat sceptical about the extent to which the more humble and limited approach to financial modelling they advocate will gain converts in the real world.

“… whether dealing with investors, regulators, or the public, or even just for making money, accuracy isn’t really the point. What counts is the impression of accuracy, which is much better served by a model in which the economy is at equilibrium, and risk can be precisely calculated, than one in which the economy is far from equilibrium and risk is essentially unquantifiable”. (emphasis added)

Comment: The point they make regarding the enduring attraction of models which appear to offer accuracy is a good one, I think.

Chapter 9 – How to Abuse the System

W&O set out three simple examples of how, in their words

“… the bonus system … encourages dangerous practices such as concentration of risk, and the selling of things for less than they’re worth”

“… the grey area in which models can be used to hide risk, and to encourage risk taking”

“… how dangerous it is to rely solely on the numbers, without any sanity checking”

Their explanation for how these practices persist is …

“The common thread … is the industrial scale abuse of mathematical models in order to optimise the quant’s interests rather than those of the client.

  • Using flawed but industry-standard models because they are safe, for the quant.
  • Selling products which are destined to eventually blow up, but only after the manager has collected his fee.
  • Adjusting the model to give the desired result.

In each of these cases, the model is there less to elucidate the truth, than to provide a plausible story for a particular course of action. Quants use the apparently objective, detached and impartial nature of mathematical formulas as a kind of concealment, but also a stamp of certification.” (emphasis added)

Comment: W&O offer a fairly dark assessment of the standards applied in the quant industry, which invites the reader to read this as a classic morality tale populated by corrupt agents. There will be elements with poor values, as in any human activity, and some of what they describe predates quants; e.g. Keynes, writing in the 1930s, remarked that “Worldly wisdom teaches that it is better for reputation to fail conventionally than to succeed unconventionally”. So the more interesting question, I think, is why this industry is more at risk than others and less subject to scrutiny.

W&O pick up this question …

“… how is it that the finance sector – after nearly blowing up the world financial system through its miscalculations – can continue to escape serious scrutiny? What makes it special?

The answer to this question, we believe, lies in the fact that only finance has learned to fully exploit the power of the ultimate defeat device – which as any math-phobe will remember from school is mathematical equations. It doesn’t just use formulas to dazzle – it imbues them with a kind of higher moral authority. It makes them into a consistent story. And by doing so it has achieved a remarkable degree of buy-in not just from those working in the field, but also from regulators, academia, the media, and the general public.

Whenever you make a mathematical model of a process, you are moving the system to the abstract plane of numbers. It seems to become objective and rational, free from the vagaries of human behavior or emotion. But quant finance (with its economist apologists) goes further, because it manages to transfer these properties to the system itself. Market prices are seen as objective, rational, and intrinsically fair. If the price … spikes or plunges, that’s just the system at work. To criticise quantitative finance is therefore to criticise the markets themselves, which makes no sense because they are as objective and impartial as a physical phenomenon.”

“In an area such as engineering or biology, one can argue about a particular model and use experiment as a guide – the model and the system are seen as separate things. But the dominant lesson of mainstream economics, with its assumptions of stability, efficiency, and rationality, is that market price and value are one and the same. Models based on this theory are therefore seen as inviolable. … There is little account for the fact, not only that the models are wrong, but that their use can affect the system itself.”

Comment: I am not sure I buy their answer in its entirety. I think there is a pretty clear understanding that price is not the same thing as value. That said, it would be fair to say that the presumption of truth lies with the market price, and that economic agents arguing that the price is wrong need to back their opinion with their money, which may be hard if they are leveraged and the bank they owe money to has no incentive to back their bet against the market price. The part of the argument which rings absolutely true is the extent to which math can imbue things with a higher moral authority that seems objective and rational to people who don’t understand math. Their explanation also ties in with behavioural bias, where people look for a consistent and coherent narrative to explain what is going on. The explanation may be wrong or flawed, but people prefer that to facing up to the fact that things are complex. So a narrative gives a model an advantage over alternative approaches. The maths also discourages people from challenging the model.

Chapter 10 – Systemic Threat

This extract from the beginning of Chapter 10 sums up one of the central themes of the book …

“Because of the bankers’ insistence on treating complex finance as a university end-of-term exam in probability theory, many of the risks in the system are hidden. And when risks are hidden, one is led into a false sense of security. More risk is taken so that when the inevitable happens, it is worse than it could have been. Eventually the probabilities break down, disastrous events become correlated, the cascade of dominoes is triggered, and we have systemic risk. … None of this would matter if the numbers were small relative to world economic output, but the numbers are huge.”

W&O devote a fair amount of attention to high-frequency trading (HFT). They argue that governments and regulators have been too willing to listen to the arguments put forward by agents who benefit from HFT and far too sanguine about the risks. In their view, HFT is another in a long line of bandwagons that markets have jumped on in the pursuit of growth and profit.

But High-Speed Trading Provides Liquidity!

W&O reject the argument that HFT should be encouraged on the grounds that it provides liquidity …

“The main defense of high-speed algorithmic trading is that it adds liquidity to the market. And that this is a good thing, and therefore such trading is also a good thing.
This is a completely false argument … and it points toward yet another source of systemic risk, which goes to the heart of what markets are about.”

Their counterarguments:

  1. They concede that greater liquidity does reduce transaction costs, but argue that most investors purchase shares infrequently and are harmed more by the volatility that HFT adds than they save on reduced spreads (see the back-of-envelope sketch after this list)
  2. They also argue that the liquidity provided by HFT is fair weather liquidity and will disappear when it is really needed
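Comment: A back-of-envelope sketch of my own, with purely hypothetical numbers, of the trade-off in point 1: for an investor who trades rarely, the spread saving attributed to HFT is tiny next to the cost of even a modest amount of added volatility at the moment they happen to trade.

    # Hypothetical illustrative numbers, not figures from the book.
    position = 100_000            # dollars traded in a year
    spread_saving_bps = 2         # tighter bid-ask spread credited to HFT
    added_volatility_bps = 25     # plausible HFT-driven mispricing at trade time

    saving = position * spread_saving_bps / 10_000              # $20
    volatility_cost = position * added_volatility_bps / 10_000  # $250

    print(f"annual spread saving: ${saving:.0f}; "
          f"potential cost of added volatility: ${volatility_cost:.0f}")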

For W&O, the purpose of a formal market is less about instant liquidity than reliable price discovery, and HFT does not aid price discovery.

Having explained the systemic risk they associate with HFT as the next bandwagon the market has embraced, W&O pull back to look at the system as a whole.

One of their broader thematic concerns is to recognise the tension between efficiency and resilience and the price to be paid to make the system more robust:

“With complex systems, there is usually a trade-off between efficiency and robustness. Increasing bank reserves makes banks less profitable, but also more secure. Introducing friction into the system – for example by putting regulatory brakes on HFT – will slow the markets, but also make them more transparent and reliable. If we want a more robust and resilient system then we probably need to agree to forego some efficiency. And imposing a degree of modularity on the financial system – say by restructuring large global banks into smaller, more local entities – would reduce efficiency but also the likelihood of contagion.”

They argue that you have to examine the role of money to fully understand the problem …

“Perhaps the greatest structural risk to the financial system, though, is – the financial system. Or rather, money itself.”

The idea of the “invisible hand” and market efficiency shapes much of the debate about what to do

Nearing the end of the book, W&O turn to the question of what is to be done and note that this question is inextricably linked to the role of free markets and whether regulation helps or hinders progress. They highlight the way that this debate has long been shaped by the “collective mental model … of the economy as a fundamentally stable and optimal system, controlled by the negative feedback of the invisible hand”

  • The idea of the “invisible hand” is typically associated with Adam Smith
  • This idea was translated into mathematical equations, through which it could be “proved” by making assumptions about things like fairness and stability
  • The EMH (rational, independent investors driving prices to equilibrium) further builds on the theme
  • Quants then apply these ideas by “… modeling asset prices as a probabilistic, mechanistic system, spreading and dispersing in time …” (sketched in code below)
  • Money is throughout treated as simply a number rather than as something of importance in the system itself
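Comment: The “probabilistic, mechanistic system, spreading and dispersing in time” of the fourth bullet is, in practice, the textbook lognormal random walk (geometric Brownian motion). A minimal sketch of that standard model (mine, not code from the book) shows how the distribution of simulated prices fans out as time passes:

    import numpy as np

    mu, sigma, s0 = 0.05, 0.20, 100.0   # drift, volatility, starting price
    dt, steps, paths = 1 / 252, 252, 10_000

    rng = np.random.default_rng(0)
    dW = rng.standard_normal((paths, steps)) * np.sqrt(dt)
    # Exact discretisation of dS = mu*S*dt + sigma*S*dW in log space
    log_paths = np.cumsum((mu - 0.5 * sigma**2) * dt + sigma * dW, axis=1)
    S = s0 * np.exp(log_paths)

    print(f"std dev of price after 1 month: {S[:, 20].std():.2f}")
    print(f"std dev of price after 1 year:  {S[:, -1].std():.2f}")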

They also note that markets are at the centre of this model of how the economy operates

“Today, Keynesian economists promote government attempts to stabilise markets, and central banks tinker with them at will. However, the touted ability of properly managed markets to drive prices to their “natural” level remains the lynchpin of mainstream economics, and much quant finance, and even markets themselves, because it means that those prices correspond to something solid and reliable, instead of just being a transient, emergent phenomenon of the world economy”

Comment: I can see where they are going with this argument but it is probably an over-reach to describe prices as “just being a transient, emergent phenomenon”. If we come back to the point about “zone of validity”, then prices will tend to oscillate around a fair value most of the time. I think the point they are making is that prices can delink from value so there is no a priori reason why the price is always a useful guide under all states of the world. 

They use the conclusion of the chapter to reiterate some of the key themes

  1. Quants started by simply measuring and valuing risks in ways that helped their employers to better manage growth and risk, but problems emerged when the models started (circa the 1980s) to influence and ultimately shape the system by creating feedback loops between the models and reality (a toy illustration follows this list)
  2. HFT is, in their view, the latest in a long line of quant innovations that are influencing and undermining the proper function of markets
  3. Money matters – it attaches a number to the concept of value and markets assign meaning to the numbers. The numerical side of money encourages the idea that the economy is a simple physical system governed by mechanical laws but that ignores the extent to which human factors such as trust and belief enter the system via the value aspect of money.
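Comment: Theme 1 is easy to caricature in code. In the toy simulation below (my own, loosely in the spirit of 1987-style portfolio insurance), a rule that mechanically sells after a down day feeds the model’s output back into the price, turning small random shocks into a self-reinforcing slide. The feedback strength k is purely illustrative.

    import numpy as np

    def simulate(k, steps=250, sigma=0.01, seed=1):
        # Toy log-price path: i.i.d. shocks plus a feedback term. When
        # yesterday's return was negative, rule-following traders sell
        # (strength k), pushing the price down further.
        rng = np.random.default_rng(seed)   # same shocks for every k
        log_p = np.zeros(steps)
        for t in range(1, steps):
            shock = sigma * rng.standard_normal()
            prev_ret = log_p[t - 1] - log_p[t - 2] if t > 1 else 0.0
            feedback = k * prev_ret if prev_ret < 0 else 0.0
            log_p[t] = log_p[t - 1] + shock + feedback
        return np.exp(log_p)

    for k in (0.0, 0.8):
        path = simulate(k)
        worst = np.diff(np.log(path)).min()
        print(f"k={k}: final price {path[-1]:.3f}, worst daily move {worst:.2%}")

The same random shocks produce a markedly uglier path once the model’s own selling is in the loop.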

Epilogue: Keep it Simple

This final section proposes some principles that a financial engineering code should include or reflect, mostly based on the KISS principle: “Keep it simple, stupid”.

Quants: The Math Sweet Spot

W&O take the view that math is necessary to price financial assets and liabilities, but more and more complicated math is most definitely not better. They champion a math sweet spot, not too dumb, not too smart, where quants should focus (a worked example follows the lists below). In this sweet spot the basic tools are

  • probability theory,
  • a decent grasp of calculus, and
  • the important tools of numerical analysis.

The models are

  • advanced enough to allow creativity with new instruments,
  • robust enough not to fall over all the time, and
  • transparent.
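Comment: As a sketch of what that sweet spot looks like in practice (my example, not the authors’), the classic Cox-Ross-Rubinstein binomial tree prices an option using nothing beyond basic probability, a little calculus, and an elementary numerical method:

    import numpy as np

    def binomial_call(s0, strike, r, sigma, T, n=500):
        # European call priced on a Cox-Ross-Rubinstein binomial tree.
        dt = T / n
        u = np.exp(sigma * np.sqrt(dt))        # up-move factor
        d = 1 / u                              # down-move factor
        q = (np.exp(r * dt) - d) / (u - d)     # risk-neutral up probability
        # Terminal stock prices and option payoffs
        prices = s0 * u ** np.arange(n, -1, -1) * d ** np.arange(0, n + 1)
        values = np.maximum(prices - strike, 0.0)
        # Step backwards through the tree, discounting expected values
        disc = np.exp(-r * dt)
        for _ in range(n):
            values = disc * (q * values[:-1] + (1 - q) * values[1:])
        return values[0]

    print(f"call value: {binomial_call(100, 100, 0.05, 0.2, 1.0):.4f}")

The result converges to the Black-Scholes value (about 10.45 for these parameters); the method is transparent, robust, and flexible enough to adapt to new payoffs.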

Regulators: Need to Go on the Attack

W&O concede they have been very critical of regulators, but they also see them as “best placed to get financial engineering … back on the straight and narrow”. Better paid and qualified regulators are part of the answer in their view, but so is better protection for whistleblowers.

Economists:

Economists have also played a central role in W&O’s account of what has gone wrong. The problem here, they argue, is not so much the absence of alternative ways of approaching the problems as the institutional factors that truncate the exploration of alternatives. Until the situation improves, W&O argue that the most that can be done is to recognise the agendas which underpin economic advice (i.e. don’t assume it is a neutral position unless proven otherwise).

Banks: Learn to Fail

  • Another idea that bankers can borrow from engineering, or for that matter biology, is the idea of a controlled shutdown.

Traders: Why Does My Bonus Have a Minus Sign in Front?

  • The opportunity to become filthy rich is central to capitalism. And sometimes it can legitimately be at the expense of others, as when you sell them something they actually want.
  • But, just as in modern portfolio theory, that filthy rich return should be accompanied by an equally large risk.
  • Traders must have skin in the game

Politicians: Create an FAA for the Financial System

W&O note the parallels between banking and aviation: both are necessarily global, both are concerned with safety, and in both, technology matters, as do the trustworthiness and competence of the people involved. Viewed alongside banking, aviation seems much better at managing its risks.

Financial Modelers’ Manifesto

This document, produced by Paul Wilmott and Emanuel Derman, is a kind of Hippocratic Oath and states in part:

  • I will remember that I didn’t make the world, and it does not satisfy my equations.
  • Though I will use models boldly to estimate value, I will not be overly impressed by mathematics.
  • I will never sacrifice reality for elegance without explaining why I have done so.
  • Nor will I give the people who use my model false comfort about its accuracy. Instead I will make explicit its assumptions and oversights.
  • I understand that my work may have enormous effects on society and the economy, many of them beyond my comprehension.

W&O also advocate that bankers should adopt a code of ethics. They note that bankers seem to be unique in not having a code of ethics and speculate that this may in part be due to an assumption that finance is a quantitative science based on objective facts and economic laws that leave no room or requirement for judgement or values. The reality, in their view, is that quants and bankers have “outsourced ethical judgements to the invisible hand, or increasingly to algorithms”.