Worth Reading “The Money Formula” by Paul Wilmott and David Orrell.

The full title of this book is “The Money Formula: Dodgy Finance, Pseudo Science, and How Mathematicians Took Over the Markets”. There are plenty of critiques of modelling and quantitative finance by outsiders throwing rocks, but Wilmott is a quant and brings an insider’s technical knowledge to the question of what these tools can do, can’t do and, perhaps most importantly, should not be used to do. Consequently, the book offers a more nuanced perspective on the strengths and limitations of quantitative finance than the “let’s scrap the whole thing” school of thought. I have made some more detailed notes which follow the structure of the book, but this post focuses on a couple of ideas I found especially interesting or useful.

I am not a quant so my comments should be read with that in mind, but the core idea I took away is that, much as quants would want it otherwise, markets are not governed by fundamental laws, deterministic or probabilistic, that allow risk to be measured with precision. These ideas work reasonably well within their “zone of validity” but a more complete answer (or model) has to recognise where the zones stop and uncertainty rules. Wilmott and Orrell argue that market outcomes are better thought of as the “emergent result of complex transactions”. The role of money in these emergent results is especially important, as is the capacity of models themselves to materially reshape the risk of the markets they attempt to measure.

The Role of Money

Some quotes drawn from Chapter 8 will let the authors speak for themselves on the role of money …

“Consider … the nature of money. Standard economic definitions of money concentrate on its roles as a “medium of exchange,” a “store of value,” and a “unit of account.” Economists such as Paul Samuelson have focused in particular on the first, defining money as “anything that serves as a commonly accepted medium of exchange.” … Money is therefore not something important in itself; it is only a kind of token. The overall picture is of the economy as a giant barter system, with money acting as an inert facilitator.” (emphasis added)

“However … money is far more interesting than that, and actually harbors its own kind of lively, dualistic properties. In particular, it merges two things, number and value, which have very different properties: number lives in the abstract, virtual world of mathematics, while valued objects live in the real world. But money seems to be an active part of the system. So ignoring it misses important relationships. The tension between these contradictory aspects is what gives money its powerful and paradoxical qualities.” (emphasis added)

“The real and the virtual become blurred, in physics or in finance. And just as Newtonian theories break down in physics, so our Newtonian approach to money breaks down in economics. In particular, one consequence is that we have tended to take debt less seriously than we should.” (emphasis added)

“Instead of facing up to the intrinsically uncertain nature of money and the economy, relaxing some of those tidy assumptions, accepting that markets have emergent properties that resist reduction to simple laws, and building a new and more realistic theory of economics, quants instead glommed on to the idea that, when a system is unpredictable, you can just switch to making probabilistic predictions.” (emphasis added)

“The efficient market hypothesis, for example, was based on the mechanical analogy that markets are stable and perturbed randomly by the actions of atomistic individuals. This led to probabilistic risk-analysis tools such as VaR. However, in reality, the “atoms” are not independent, but are closely linked … The result is the non-equilibrium behaviour … observed in real markets. Markets are unpredictable not because they are efficient, but because of a financial version of the uncertainty principle.” (emphasis added)

The Role of Models

Wilmott & Orrell devote a lot of attention to the ways in which models no longer just describe, but start to influence, the markets being modelled, mostly by encouraging people to take on more risk based, in part, on a false sense of security …

“Because of the bankers’ insistence on treating complex finance as a university end-of-term exam in probability theory, many of the risks in the system are hidden. And when risks are hidden, one is led into a false sense of security. More risk is taken so that when the inevitable happens, it is worse than it could have been. Eventually the probabilities break down, disastrous events become correlated, the cascade of dominoes is triggered, and we have systemic risk …. None of this would matter if the numbers were small … but the numbers are huge” (Chapter 10 – emphasis added)
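The false sense of security the authors describe can be made concrete with a stylised example (my own illustration, not from the book). Suppose daily returns come mostly from a calm regime but occasionally from a crisis regime; a 99% VaR calibrated to the calm regime alone is then breached far more often than its nominal 1% suggests.

```python
import random

random.seed(0)

def mixed_return():
    # 95% calm days (sigma = 1%), 5% crisis days (sigma = 5%)
    sigma = 0.01 if random.random() < 0.95 else 0.05
    return random.gauss(0, sigma)

normal_var_99 = 2.326 * 0.01  # 99% VaR if every day were a calm day
returns = [mixed_return() for _ in range(10_000)]
breaches = sum(r < -normal_var_99 for r in returns)
print(f"nominal breaches: ~100 of 10,000; actual: {breaches}")
```

The regime probabilities and volatilities are made-up numbers, but the point is general: a model that hides the crisis regime reports comfortable risk numbers right up until the probabilities break down.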

They see High Frequency Trading as the area most likely to give rise to a future systemic crisis, but also make a broader point about the tension between efficiency and resilience …

“With complex systems, there is usually a trade-off between efficiency and robustness …. Introducing friction into the system – for example by putting regulatory brakes on HFT – will slow the markets, but also make them more transparent and reliable. If we want a more robust and resilient system then we probably need to agree to forego some efficiency” (Chapter 10 – emphasis added)

The Laws of Finance

Wilmott and Orrell note the extent to which finance has attempted to identify laws which are analogous to the laws of physics and the ways in which these “laws” have proved to be more of a rough guide.

“… the “law of supply and demand” … states that the market for a particular product has a certain supply, which tends to increase as the price goes up (more suppliers enter the market). There is also a certain demand for the product, which increases as the price goes down.”

“… while the supply and demand picture might capture a general fuzzy principle, it is far from being a law. For one thing, there is no such thing as a stable “demand” that we can measure independently – there are only transactions.”

“Also, the desire for a product is not independent of supply, or other factors, so it isn’t possible to think of supply and demand as two separate lines. Part of the attraction of luxury goods – or for that matter more basic things, such as housing – is exactly that their supply is limited. And when their price goes up, they are often perceived as more desirable, not less.” (emphasis added)

This example is relevant for banking systems (such as Australia’s) where residential mortgage lending dominates the balance sheets of the banks, even more so given that public debate about the risk associated with housing often seems to be predicated on the economics 101 version of the laws of supply and demand.

The Power (and Danger) of Ideas

A recurring theme throughout the book is the ways in which economists and quants have borrowed ideas from physics without recognising the limitations of the analogies and assumptions they have relied on to do so. Wilmott and Orrell credit Sir Isaac Newton as one of the inspirations behind Adam Smith’s idea of the “Invisible Hand” co-ordinating the self-interested actions of individuals for the good of society. When the quantum revolution saw physics embrace a probabilistic approach, economists followed.

I don’t think Wilmott and Orrell make this point directly, but a recurring thought while reading the book was the power of ideas not just to interpret the underlying reality but also to shape the way the economy and society develop, not always for the better. Three examples stand out:

  • Economic laws that drive markets towards equilibrium as their natural state
  • The “invisible hand” operating in markets to reconcile individual self interest with optimal outcomes for society as a whole
  • The Efficient Market Hypothesis as an explanation for why markets are unpredictable

These ideas have widely influenced quantitative finance in a variety of domains and they all contribute useful insights; the key is to not lose sight of their zone of validity.

“… Finance … took exactly the wrong lesson from the quantum revolution. It held on to its Newtonian, mechanistic, symmetric picture of an intrinsically stable economy guided to equilibrium by Adam Smith’s invisible hand. But it adopted the probabilistic mathematics of stochastic calculus.” (Chapter 8 – emphasis added)

Where to from here?

It should be obvious by now that the authors are arguing that risk and reward cannot be reduced to hard numbers in the way that physics has used similar principles and tools to generate practical insights into how the world works. Applying a bit of simple math in finance seems to open the door to getting some control over an unpredictable world and, even better, to pursuing optimisation strategies that allow the cognoscenti to fine-tune the balance between risk and reward. There is room for more complex math as well for those so inclined, but the book sides with the increasingly widely held view that simple math is enough to get you into trouble and further complexity is best avoided if possible.

Wilmott and Orrell highlight mathematical biology in general, and a book by Jim Murray on the topic, as a source of better ways to approach many of the more difficult modelling challenges in finance and economics. They start by listing a series of phenomena in biological models that seem to be useful analogues for what happens in financial markets. They concede that the models used in mathematical biology are almost all “toy” models. None of these models offers precise or determined outcomes, but all can be used to explain what is happening in nature and offer insights into solutions for problems like disease control, epidemics and conservation.

The approach they advocate seems to have a lot in common with the agent-based modelling approach that Andrew Haldane references (see his paper “Tails of the Unexpected”) and that is the focus of Bookstaber’s book (“The End of Theory”).

In their words …

“Embrace the fact that the models are toy, and learn to work within any limitations.”

“Focus more attention on measuring and managing resulting model risk, and less time on complicated new products.”

“… only by remaining both skeptical and agile can we learn. Keep your models simple, but remember they are just things you made up, and be ready to update them as new information comes in.”

I fear I have not done the book justice but I got a lot out of it and can recommend it highly.

 

 

The financial cycle and macroeconomics: What have we learnt? BIS Working Paper

Claudio Borio at the BIS wrote an interesting paper exploring the “financial cycle”. This post seeks to summarise the key points of the paper and draw out some implications for bank stress testing (the original paper can be found here). The paper was published in December 2012, so its discussion of the implications for macroeconomic modelling may be dated, but I believe it continues to offer useful insights into the challenges banks face in dealing with adverse economic conditions and the boundary between risk and uncertainty.

Key observations Borio makes regarding the Financial Cycle

The concept of a “business cycle”, in the sense of there being a regular occurrence of peaks and troughs in business activity, is widely known but the concept of a “financial cycle” is a distinct variation on this theme that is possibly less well understood. Borio states that there is no consensus definition but he uses the term to

“denote self-reinforcing interactions between perceptions of value and risk, attitudes towards risk and financing constraints, which translate into booms followed by busts. These interactions can amplify economic fluctuations and possibly lead to serious financial distress and economic disruption”.

This definition is closely related to the concept of “procyclicality” in the financial system and should not be confused with a generic description of cycles in economic activity and asset prices. Borio does not use these words but I have seen the term “balance sheet recession” employed to describe much the same phenomenon as Borio’s financial cycle.

Borio identifies five features that describe the Financial Cycle

  1. It is best captured by the joint behaviour of credit and property prices – these variables tend to closely co-vary, especially at low frequencies, reflecting the importance of credit in the financing of construction and the purchase of property.
  2. It is much longer, and has a much larger amplitude, than the traditional business cycle – the business cycle involves frequencies from 1 to 8 years whereas the average length of the financial cycle is longer; Borio cites a cycle length of 16 years in a study of seven industrialised economies and I have seen other studies indicating a longer cycle (with more severe impacts).
  3. It is closely associated with systemic banking crises which tend to occur close to its peak.
  4. It permits the identification of the risks of future financial crises in real time and with a good lead – Borio states that the most promising leading indicators of financial crises are based on simultaneous positive deviations of the ratio of private sector credit-to-GDP and asset prices, especially property prices, from historical norms.
  5. And it is highly dependent on the financial, monetary and real-economy policy regimes in place (e.g. financial liberalisation under Basel II, monetary policy focussed primarily on inflation targeting and globalisation in the real economy).
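The early-warning indicator in point 4 can be sketched very simply. The following is my own crude simplification (the BIS methodology uses a one-sided HP filter rather than a rolling mean): flag the periods where the credit-to-GDP ratio runs well above its trailing trend.

```python
def credit_to_gdp_gap(ratios, window=8, threshold=2.0):
    """Return (period, gap) pairs where the credit-to-GDP ratio (%)
    exceeds its trailing mean by more than `threshold` points."""
    flags = []
    for t in range(window, len(ratios)):
        trend = sum(ratios[t - window:t]) / window
        gap = ratios[t] - trend
        if gap > threshold:
            flags.append((t, gap))
    return flags

# Toy quarterly series: a stable ratio followed by a rapid credit expansion.
series = [140, 141, 140, 142, 141, 143, 142, 144, 150, 158, 165]
flags = credit_to_gdp_gap(series)
print(flags)  # only the credit-boom periods at the end are flagged
```

The window and threshold are illustrative; the point is that the indicator is a deviation-from-trend measure, which is why it can work “in real time and with a good lead”.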

Macro economic modelling

Borio also argues that the conventional models used to analyse the economy are deficient because they do not capture the dynamics of the financial cycle. These extracts capture the main points of his critique:

“The notion… of financial booms followed by busts, actually predates the much more common and influential one of the business cycle …. But for most of the postwar period it fell out of favour. It featured, more or less prominently, only in the accounts of economists outside the mainstream (eg, Minsky (1982) and Kindleberger (2000)). Indeed, financial factors in general progressively disappeared from macroeconomists’ radar screen. Finance came to be seen effectively as a veil – a factor that, as a first approximation, could be ignored when seeking to understand business fluctuations … And when included at all, it would at most enhance the persistence of the impact of economic shocks that buffet the economy, delaying slightly its natural return to the steady state …”

“Economists are now trying hard to incorporate financial factors into standard macroeconomic models. However, the prevailing, in fact almost exclusive, strategy is a conservative one. It is to graft additional so-called financial “frictions” on otherwise fully well behaved equilibrium macroeconomic models, built on real-business-cycle foundations and augmented with nominal rigidities. The approach is firmly anchored in the New Keynesian Dynamic Stochastic General Equilibrium (DSGE) paradigm.”

“The purpose of this essay is to summarise what we think we have learnt about the financial cycle over the last ten years or so in order to identify the most promising way forward…. The main thesis is that …it is simply not possible to understand business fluctuations and their policy challenges without understanding the financial cycle”

There is an interesting discussion of the public policy issues (prudential, fiscal and monetary) associated with recognising the role of the financial cycle, but I will focus on the implications for bank management in general and stress testing in particular.

Insights and questions we can derive from the paper

The observation that financial crises are based on simultaneous positive deviations of the ratio of private sector credit-to-GDP and asset prices, especially property prices, from historical norms covers much the same ground as the Basel Committee’s Countercyclical Capital Buffer (CCyB) and is something banks would already monitor as part of the ICAAP. The interesting question the paper poses for me is the extent to which stress testing (and ICAAP) should focus on a “financial cycle” style disruption as opposed to a business cycle event. Even more interesting is the question of whether the higher severity of the financial cycle is simply an exogenous random variable or an endogenous factor that can be attributed to excessive credit growth. 

I think this matters because it has implications for how banks calibrate their overall risk appetite. The severity of the downturns employed in stress testing has in my experience gradually increased over successive iterations. My recollection is that this has partly been a response to prudential stress tests which were more severe in some respects than might have been determined internally. In the absence of any objective absolute measure of what was severe, it probably made sense to turn up the dial on severity in places to align as far as possible the internal benchmark scenarios with prudential benchmarks such as the “Common Scenario” APRA employs.

At the risk of a gross over simplification, I think that banks started the stress testing process looking at both moderate downturns (e.g. 7-10 year frequency and relatively short duration) and severe recessions (say a 25 year cycle though still a relatively short duration downturn). Bank supervisors, in contrast, have tended to focus more on severe recession and financial cycle style severity scenarios with more extended durations. Banks have progressively shifted their attention to scenarios more closely aligned with the severe recessions assumed by supervisors, in part because moderate recessions tend to be fairly manageable from a capital management perspective.

Why does the distinction between the business cycle and the financial cycle matter?

Business cycle fluctuations (in stress testing terms a “moderate recession”) are arguably an inherent feature of the economy that occur largely independently of the business strategy and risk appetite choices that banks make. However, Borio’s analysis suggests that the decisions that banks make (in particular the rate of growth in credit relative to growth in GDP and the extent to which the extension of bank credit contributes to inflated asset values) do contribute to the risk (i.e. probability, severity and duration) of a severe financial cycle style recession. 

Borio’s analysis also offers a way of thinking about the nature of the recovery from a recession. A moderate business cycle style recession is typically assumed to be short with a relatively quick recovery whereas financial cycle style recessions typically persist for some time. The more drawn out recovery from a financial cycle style recession can be explained by the need for borrowers to deleverage and repair their balance sheets as part of the process of addressing the structural imbalances that caused the downturn.

If the observations above are true, then they suggest a few things to consider:

  • should banks explore a more dynamic approach to risk appetite limits that incorporates the metrics identified by Borio (and also used in the calibration of the CCyB), so that the level of risk they are willing to take adjusts for where they believe they are in the cycle (and which kind of cycle they are in)?
  • how should banks think about these more severe financial cycle losses? Their measure of Expected Loss should clearly incorporate the losses expected from business cycle style moderate recessions occurring once every 7-10 years but it is less clear that the kinds of more severe and drawn out losses expected under a Severe Recession or Financial Cycle downturn should be part of Expected Loss.
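The Expected Loss question in the second bullet can be made concrete with a back-of-envelope sketch (the numbers below are mine and purely illustrative): a “through the cycle” expected loss is a frequency-weighted average of loss rates across states of the cycle, so whether severe financial cycle downturns are included materially changes the answer.

```python
states = {
    # state: (annual probability, annual loss rate as a fraction of the portfolio)
    "benign":             (0.85, 0.002),
    "moderate_recession": (0.12, 0.010),  # roughly a 1-in-8-year event
    "severe_downturn":    (0.03, 0.040),  # roughly a 1-in-30-year event
}

ttc_expected_loss = sum(p * loss for p, loss in states.values())
print(f"through-the-cycle expected loss: {ttc_expected_loss:.2%} p.a.")

# Excluding the severe state (treating it as unexpected loss instead)
el_excl_severe = sum(p * loss for s, (p, loss) in states.items()
                     if s != "severe_downturn")
print(f"excluding severe downturns:      {el_excl_severe:.2%} p.a.")
```

On these made-up numbers, the severe state accounts for a large share of the through-the-cycle average, which is exactly why the classification matters for pricing and capital.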

A more dynamic approach to risk appetite gets us into some interesting game theory puzzles, because a decision by one bank to pull back on risk appetite potentially allows competitors to benefit by writing more business, and potentially doubly benefiting to the extent that the decision to pull back makes it safer for competitors to write the business without fear of a severe recession (in technical economist speak, a “collective action” problem). This was similar to the problem APRA faced when it decided to impose “speed limits” on certain types of lending in 2017. The Royal Commission was not especially sympathetic to the strategic bind banks face but I suspect that APRA understands the problem.

How do shareholders think about these business and financial cycle losses? Some investors will adopt a “risk on-risk off” approach in which they attempt to predict the downturn and trade in and out based on that view, while other “buy and hold” investors (especially retail) may be unable or unwilling to adopt a trading approach.

The dependence of the financial cycle on the fiscal and monetary policy regimes in place and changes in the real economy also has potential implications for how banks think about the risk of adverse scenarios playing out. Many of the factors that Borio argues have contributed to the financial cycle (i.e. financial liberalisation under Basel II, monetary policy focussed primarily on inflation targeting and globalisation in the real economy) are reversing: regulation of banks is much more restrictive, monetary policy appears to have recognised the limitations of a narrow inflation target focus, and the pace of globalisation appears to be slowing in response to growing concern that its benefits are not shared equitably. I am not sure exactly what these changes mean, other than to recognise that they should in principle have some impact. At a minimum, it seems the pace of credit expansion might be slower in the coming decades than it was in the past 30 years.

All in all, I find myself regularly revisiting this paper, referring to it or employing the distinction between the business and financial cycle. I would recommend it to anyone interested in bank capital management. 

The rise of the normal distribution

“We were all Gaussians now”

This post focuses on a joint paper written in 2012 by Andrew Haldane and Benjamin Nelson titled “Tails of the unexpected”. The topic is the normal distribution, which is obviously a bit technical, but the paper is still readable even if you are not deeply versed in statistics and financial modelling. The condensed quote below captures the central idea I took away from the paper.

“For almost a century, the world of economics and finance has been dominated by randomness … But as Nassim Taleb reminded us, it is possible to be Fooled by Randomness (Taleb (2001)). For Taleb, the origin of this mistake was the ubiquity in economics and finance of a particular way of describing the distribution of possible real world outcomes. For non-nerds, this distribution is often called the bell-curve. For nerds, it is the normal distribution. For nerds who like to show-off, the distribution is Gaussian.”

The idea that the normal distribution should be used with care, and sometimes not at all, when seeking to analyse economic and financial systems is not news. The paper’s discussion of why this is so is useful if you have not considered the issues before but probably does not offer much new insight if you have.

What I found most interesting was the back story behind the development of the normal distribution. In particular, the factors that Haldane and Nelson believe help explain why it came to be so widely used and misused. Reading the history reminds us of what a cool idea it must have been when it was first discovered and developed.

“By simply taking repeat samplings, the workings of an uncertain and mysterious world could seemingly be uncovered.”

“To scientists seeking to explain the world, the attraction of the normal curve was obvious. It provided a statistical map of a physical world which otherwise appeared un-navigable. It suggested regularities in random real-world data. Moreover, these patterns could be fully described by two simple metrics – mean and variance. A statistical window on the world had been opened.”

Haldane and Nelson highlight a semantic shift in the 1870s when the term “normal” began to be independently applied to this statistical distribution. They argue that adopting this label helped embed the idea that the “normal distribution” was the “usual” outcome one should expect to observe.

“In the 18th century, normality had been formalised. In the 19th century, it was socialised.”

“Up until the late 19th century, no statistical tests of normality had been developed. Having become an article of faith, it was deemed inappropriate to question the faith. As Hacking put it, ‘thanks to superstition, laziness, equivocation, befuddlement with tables of numbers, dreams of social control, and propaganda from utilitarians, the law of large numbers became a synthetic a priori truth. We were all Gaussians now.’”

Notwithstanding its widespread use today, in Haldane and Nelson’s account, economics and finance were not early adopters of the statistical approach to analysis but eventually became enthusiastic converts. The influence of physics on the analytical approaches employed in economics is widely recognised, and Haldane and Nelson cite the rise of probability-based quantum physics over old school deterministic Newtonian physics as one of the factors that prompted economists to embrace probability and the normal distribution as a key tool.

” … in the early part of the 20th century, physics was in the throes of its own intellectual revolution. The emergence of quantum physics suggested that even simple systems had an irreducible random element. In physical systems, Classical determinism was steadily replaced by statistical laws. The natural world was suddenly ruled by randomness.”
“Economics followed in these footsteps, shifting from models of Classical determinism to statistical laws.”
“Whether by accident or design, finance theorists and practitioners had by the end of the 20th century evolved into fully paid-up members of the Gaussian sect.”

Assessing the Evidence

Having outlined the story behind its development and increasingly widespread use, Haldane and Nelson then turn to the weight of evidence suggesting that normality is not a good statistical description of real-world behaviour. In its place, natural and social scientists have often unearthed behaviour consistent with an alternative distribution, the so-called power law distribution.
“In consequence, Laplace’s central limit theorem may not apply to power law-distributed variables. There can be no “regression to the mean” if the mean is ill-defined and the variance unbounded. Indeed, means and variances may then tell us rather little about the statistical future. As a window on the world, they are broken.”
This section of the paper probably does not introduce anything new to people who have spent any time looking at financial models. It does, however, raise some interesting questions. For example, to what extent are bank loan losses better described by a power law and, if so, what does this mean for the measures of expected loss employed in banking and prudential capital requirements; i.e. how should banks and regulators respond if “… the means and variances … tell us rather little about the statistical future”? This is particularly relevant as banks transition to expected loss accounting for loan losses.
We can of course estimate the mean loss during the benign part of the credit cycle, but it is much harder to estimate a “through the cycle” average (or “expected” loss) because the frequency, duration and severity of cycle downturns are hard to pin down with any precision. We can use historical evidence to get a sense of the problem; we can, for example, talk about moderate downturns every 7-10 years, more severe recessions every 25-30 years, and a 75 year cycle for financial crises. However, the data is obviously sparse, so it does not allow the kind of precision that is part and parcel of normally distributed events.
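The difference between thin and fat tails is easy to see numerically. The quick sketch below (my own illustration) compares tail probabilities P(X > k) for a standard normal against a Pareto (power law) distribution with tail index alpha = 3; the parameters are arbitrary but the qualitative gap is what matters.

```python
import math

def normal_tail(k):
    """P(X > k) for a standard normal variable, via the complementary error function."""
    return 0.5 * math.erfc(k / math.sqrt(2))

def pareto_tail(k, alpha=3.0, x_min=1.0):
    """P(X > k) for a Pareto distribution: (x_min / k) ** alpha for k > x_min."""
    return (x_min / k) ** alpha if k > x_min else 1.0

for k in [2, 5, 10]:
    print(f"k={k}: normal {normal_tail(k):.2e}, power law {pareto_tail(k):.2e}")
```

At k = 10 the normal tail is vanishingly small while the power law tail is still of order one in a thousand: events a normal model treats as essentially impossible remain live possibilities under a power law, which is the practical content of “means and variances … tell us rather little about the statistical future”.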

Explaining Fat Tails

The paper identifies the following drivers behind non-normal outcomes:
  • Non-linear dynamics
  • Self-organised criticality
  • Preferential attachment
  • Highly optimised tolerance
The account of why systems do not conform to the normal distribution does not offer much new, but I found reading it useful for reflecting on the practical implications. One of the items they called out is competition, which is typically assumed by economists to be a wholly benign force. This is generally true, but Haldane and Nelson note the capacity of competition to contribute to self-organised criticality.
Competition in finance and banking can of course lead to beneficial innovation and efficiency gains but it can also contribute to progressively increased risk taking (e.g. more lax lending standards, lower margins for tail risk) thereby setting the system up to be prone to a self organised critical state. Risk based capital requirements can also contribute to self organised criticality to the extent they facilitate increased leverage and create incentives to take on tail risk.

Where Next?

Haldane and Nelson add their voice to the idea that Knight’s distinction between risk and uncertainty is a good foundation for developing better ways of dealing with a world that does not conform to the normal distribution, and note the distinguished company of those who have also chosen to emphasise the importance of uncertainty and the limitations of risk.
“Many of the biggest intellectual figures in 20th century economics took this distinction seriously. Indeed, they placed uncertainty centre-stage in their policy prescriptions. Keynes in the 1930s, Hayek in the 1950s and Friedman in the 1960s all emphasised the role of uncertainty, as distinct from risk, when it came to understanding economic systems. Hayek criticised economics in general, and economic policymakers in particular, for labouring under a “pretence of knowledge.”
Assuming that the uncertainty paradigm was embraced, Haldane and Nelson consider what the practical implications would be. They have a number of proposals but I will focus on three:
  • agent-based modelling
  • simple rather than complex
  • don’t aim to smooth out all volatility

Agent based modelling

Haldane and Nelson note that …

“In response to the crisis, there has been a groundswell of recent interest in modelling economic and financial systems as complex, adaptive networks. For many years, work on agent-based modelling and complex systems has been a niche part of the economics and finance profession. The crisis has given these models a new lease of life in helping explain the discontinuities evident over recent years (for example, Kirman (2011), Haldane and May (2011)).”
In these frameworks, many of the core features of existing models need to be abandoned.
  • The “representative agents” conforming to simple economic laws are replaced by more complex interactions among a larger range of agents.
  • The single, stationary equilibrium gives way to Lorenz-like multiple, non-stationary equilibria.
  • Linear deterministic models are usurped by non-linear tipping points and phase shifts.
Haldane and Nelson note that these types of systems are already being employed by physicists, sociologists, ecologists and the like. Since the paper was written (2012), we have seen some evidence that economists are experimenting with agent-based modelling. A paper by Richard Bookstaber offers a useful outline of his efforts to apply these models, and he has also written a book (“The End of Theory”) promoting this path. There is also a Bank of England paper on ABM worth looking at.
I think there is a lot of value in agent-based modelling but a few things impede its wider use. One is that these models don’t offer the kinds of precision that make the DSGE and VaR models so attractive. The other is that they require a large investment of time to build, and most practitioners are fully committed just keeping the existing models going. Finding the budget to pioneer an alternative path is not easy. These are not great arguments in defence of the status quo but they do reflect certain realities of the world in which people work.
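To give a flavour of the approach, here is a deliberately “toy” agent-based market of my own construction (not a model from the paper): fundamentalist agents pull the price toward a fixed value while chartist agents chase the recent trend. The interaction of the two groups generates cyclical booms and busts that no single representative agent would produce.

```python
import random

def simulate(steps=200, n_fund=50, n_chart=50, fundamental=100.0, seed=1):
    """Simulate a price path driven by two interacting agent populations."""
    random.seed(seed)
    prices = [fundamental, fundamental]
    for _ in range(steps):
        p, prev = prices[-1], prices[-2]
        fund_demand = n_fund * 0.01 * (fundamental - p)   # mean reversion
        chart_demand = n_chart * 0.015 * (p - prev)       # trend chasing
        prices.append(p + fund_demand + chart_demand + random.gauss(0, 0.5))
    return prices

prices = simulate()
print(f"min {min(prices):.1f}, max {max(prices):.1f}")
```

The demand coefficients are tuned (by me) so the dynamics oscillate without exploding; increase the chartist weight and the same model tips into instability, which is a neat miniature of the efficiency-versus-robustness trade-off discussed earlier.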

Simple can be more robust than complex

Haldane and Nelson also advocate simplicity in lieu of complexity as a general rule of thumb for dealing with an uncertain world.
The reason less can be more is that complex rules are less robust to mistakes in specification. They are inherently fragile. Harry Markowitz’s mean-variance optimal portfolio model has informed millions of investment decisions over the past 50 years – but not, interestingly, his own. In retirement, Markowitz instead used a much simpler equally-weighted asset approach. This, Markowitz believed, was a more robust way of navigating the fat-tailed uncertainties of investment returns (Benartzi and Thaler (2001)).
I am not a big fan of the Leverage Ratio, which they cite as one example of regulators beginning to adopt simpler approaches, but the broader principle that simple is more robust than complex does ring true.
The mainstay of regulation for the past 30 years has been more complex estimates of banks’ capital ratios. These are prone to problems of highly-optimised tolerance. In part reflecting that, regulators will in future require banks to abide by a far simpler backstop measure of the leverage ratio. Like Markowitz’s retirement portfolio, this equally-weights the assets in a bank’s portfolio. Like that portfolio, it too will hopefully be more robust to fat-tailed uncertainties.
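The contrast between the two measures is easy to see in a stylised calculation (the balance sheet figures and risk weights below are purely illustrative, not taken from any actual bank or from the Basel standardised weights):

```python
# Hypothetical bank balance sheet: (exposure in $m, assumed risk weight).
exposures = {
    "residential_mortgages": (500.0, 0.35),
    "corporate_loans":       (300.0, 1.00),
    "sovereign_bonds":       (200.0, 0.00),
}
capital = 50.0  # $m of capital

total_assets = sum(e for e, _ in exposures.values())   # 1,000
rwa = sum(e * w for e, w in exposures.values())        # risk-weighted assets

risk_based_ratio = capital / rwa          # sensitive to risk-weight estimates
leverage_ratio = capital / total_assets   # equally weights every asset
```

The risk-based ratio depends entirely on the risk weights being right; get them wrong (the “highly-optimised tolerance” problem) and the ratio flatters the bank. The leverage ratio sacrifices risk sensitivity for robustness to exactly that specification error.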
Structural separation is another simple approach to the problem of making the system more resilient
A second type of simple, yet robust, regulatory rule is to impose structural safeguards on worst-case outcomes. Technically, this goes by the name of a “minimax” strategy (Hansen and Sargent (2011)). The firebreaks introduced into some physical systems can be thought to be playing just this role. They provide a fail-safe against the risk of critical states emerging in complex systems, either in a self-organised manner or because of man-made intervention. These firebreak-type approaches are beginning to find their way into the language and practice of regulation.
And a reminder about the dangers of over engineering
Finally, in an uncertain world, fine-tuned policy responses can sometimes come at a potentially considerable cost. Complex intervention rules may simply add to existing uncertainties in the system. This is in many ways an old Hayekian lesson about the pretence of knowledge, combined with an old Friedman lesson about the avoidance of policy harm. It has relevance to the (complex, fine-tuned) regulatory environment which has emerged over the past few years.
While we can debate the precise way to achieve simplicity, the basic idea does in my view have a lot of potential to improve the management of risk in general and bank capital in particular. Complex intervention rules may simply add to existing uncertainties in the system and the current formulation of how the Capital Conservation Ratio interacts with the Capital Conservation Buffer is a case in point. These two elements of the capital adequacy framework define what percentage of a bank’s earnings must be retained if the capital adequacy ratio is under stress.
In theory the calculation should be simple and intuitive but anyone who has had to model how these rules work under a stress scenario will know how complex and unintuitive the calculation actually is. The reasons why this is so are probably a bit too much detail for today but I will try to pick this topic up in a future post.
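To show why the static calculation looks simple, here is a sketch of the Basel III quartile structure: a 4.5% CET1 minimum plus a 2.5% conservation buffer divided into four bands, with the minimum earnings retention stepping up as the ratio falls through each band. This is a deliberately simplified reading; it ignores the countercyclical buffer, D-SIB add-ons and national variations, which is precisely where much of the real-world complexity creeps in, along with the path dependency under stress.

```python
def minimum_retention(cet1_pct, minimum=4.5, buffer=2.5):
    """Minimum share of earnings a bank must retain under the capital
    conservation buffer, given its CET1 ratio (in percent).
    Simplified sketch: base Basel III quartiles only."""
    if cet1_pct > minimum + buffer:
        return 0.0                      # buffer intact: no constraint
    quartile = buffer / 4.0             # 0.625% per band
    if cet1_pct > minimum + 3 * quartile:
        return 0.40                     # top quartile of the buffer
    if cet1_pct > minimum + 2 * quartile:
        return 0.60
    if cet1_pct > minimum + quartile:
        return 0.80
    return 1.00                         # bottom quartile: retain everything
```

Statically this is a five-line lookup. The difficulty in a stress scenario is that retained earnings feed back into next period’s CET1 ratio, which moves the bank between bands, which changes the retention requirement, and so on.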

Don’t aim to eliminate volatility

Systems which are adapted to volatility will tend to be stronger than systems that are sheltered from it, or in the words of Haldane and Nelson …

“And the argument can be taken one step further. Attempts to fine-tune risk control may add to the probability of fat-tailed catastrophes. Constraining small bumps in the road may make a system, in particular a social system, more prone to systemic collapse. Why? Because if instead of being released in small bursts pressures are constrained and accumulate beneath the surface, they risk an eventual volcanic eruption.”

I am a big fan of this idea. Nassim Taleb makes a similar argument in his book “Antifragile” as does Greg Ip in “Foolproof”. It also reflects Nietzsche’s somewhat more poetic dictum “that which does not kill us makes us stronger”.

In conclusion

If you have read this far then thank you. I hope you found it useful and interesting. If you want to delve deeper then you can find my more detailed summary and comments on the paper here. If you think I have any of the above wrong then please let me know.

Swiss money experiment

Last month I posted a review of Mervyn King’s book “The end of Alchemy”. One of the central ideas in King’s book was that all deposits must be backed 100% by liquid, safe assets. It appears that the Swiss are being asked to vote on a proposal labeled “Sovereign Money Initiative” that may not be exactly the same as King’s idea but comes from the same school of money philosophy.

It is not clear that there is any popular support for the proposal but it would be a fascinating money experiment if it did get support. Thanks to Brian Reid for flagging this one to me.

Tony


Looking under the hood – The IRB formula

This post is irredeemably technical so stop here if that is not your interest. If you need to understand some of the mechanics of the formula used to calculate credit risk weighted assets under the advanced Internal Ratings Based (IRB) approach, the BCBS published a paper in 2005 which explains:

  • the economic foundations of the approach
  • the underlying mathematical model and its input parameters

While a lot has changed as a result of Basel III, the models underlying the calculation of Internal Ratings Based (IRB) capital requirements are still based on the core principles agreed under Basel II that are explained in this BCBS paper.

The notes in the linked page below mostly summarise the July 2005 paper with some emphasis (bolded text) and comments (in italics) that I have added. The paper is a bit technical but worth reading if you want to understand the original thinking behind the Basel II risk weights for credit risk.

I initially found the paper useful for revisiting the foundation assumptions of the IRB framework as background to considering the regulatory treatment of Expected Loss as banks transition to IFRS9. The background on how the RW was initially intended to cover both Expected and Unexpected Loss, but was revised such that capital was only required to cover Unexpected Loss, is especially useful when considering the interaction of loan loss provisioning with capital requirements.
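The UL-only calibration described above is visible in the structure of the IRB formula itself: capital per unit of exposure is the conditional (stressed) expected loss at the 99.9th percentile of the single systematic factor, minus unconditional expected loss (PD × LGD), which is meant to be absorbed by provisions and pricing rather than capital. A simplified sketch follows (maturity adjustment omitted, parameter values purely illustrative):

```python
from math import exp, sqrt
from statistics import NormalDist

N = NormalDist()  # standard normal distribution

def irb_capital(pd, lgd, rho, confidence=0.999):
    """Unexpected-loss capital K per unit of exposure under the ASRF model
    behind the Basel II IRB formula (maturity adjustment omitted).
    Conditional PD at the 99.9th percentile systematic shock, less EL."""
    conditional_pd = N.cdf(
        (N.inv_cdf(pd) + sqrt(rho) * N.inv_cdf(confidence)) / sqrt(1 - rho)
    )
    return lgd * conditional_pd - lgd * pd  # stressed loss minus expected loss

def corporate_correlation(pd):
    """Basel II supervisory asset correlation for corporate exposures:
    interpolates between 0.24 (low PD) and 0.12 (high PD)."""
    w = (1 - exp(-50 * pd)) / (1 - exp(-50))
    return 0.12 * w + 0.24 * (1 - w)

# Illustrative inputs only
pd, lgd = 0.01, 0.45
k = irb_capital(pd, lgd, corporate_correlation(pd))
rwa_per_dollar = k * 12.5  # RWA = K * 12.5 * EAD
```

The subtraction of `lgd * pd` in the last line of `irb_capital` is exactly the Basel II revision the paper describes: had capital been required to cover EL as well as UL, that term would not be there, and the interaction with loan loss provisioning (and now IFRS9 expected loss provisions) would look quite different.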

Reading the BCBS paper has also been useful for thinking through a range of related issues including:

  • The rationale for, and impact of, prudential conservatism in setting the risk parameters used in the IRB formula
  • The cyclicality of a risk sensitive capital requirement (and the potential for pro-cyclicality) and what might be done to mitigate the risk of pro-cyclical impacts on the economy

If you have read this far then my summary of the BCBS paper and my comments /observations can be found here (and thank you).

I am not a credit risk model expert, so the summary of the paper and my comments must be read with that in mind. I did this to help me think through some of the issues with bank capital adequacy. Hopefully others will find the notes useful. If you see something wrong or something you disagree with then let me know.

“The Great Divide” by Andrew Haldane

This speech by Andrew Haldane (Chief Economist at the Bank of England) was given in 2016 but is still worth reading for anyone interested in the question of what role banks play in society and why their reputation is not what it once was. Some of my long term correspondents will be familiar with the paper and may have seen an earlier draft of this post.

“The Great Divide” refers to a gap between how banks perceive themselves and how they are perceived by the community. Haldane references a survey the BOE conducted in which the most common word used by banks to describe themselves was “regulated” while “corrupt” was the community choice closely followed by “manipulated”, “self-serving”, “destructive” and “greedy”. There is an interesting “word cloud” chart in the paper representing this gap in perception.

While the focus is on banks, Haldane makes the point that the gap in perceptions reflects a broader tension between the “elites” and the common people. He does not make this explicit connection but it seemed to me that the “great divide” he was referencing could also be argued to be manifesting itself in the increasing support for populist political figures purporting to represent the interests of the common people against career politicians. This broader “great divide” idea seemed to me to offer a useful framework for thinking about the challenges the banking industry is facing in rebuilding trust.

Haldane uses this “great divide” as a reference for discussing

  • The crucial role finance plays in society
  • The progress made so far in restoring trust in finance
  • What more needs to be done

The crucial role finance plays in society

Haldane argues that closing the trust deficit between banks and society matters for two reasons

  • because a well functioning financial system is an essential foundation for a growing and well functioning economy – to quote Haldane “that is not an ideological assertion from the financial elite; it is an empirical fact”
  • but also because the downside of a poorly functioning financial system is so large

Haldane uses the GFC to illustrate the downside in terms of the destruction of the value of financial capital and physical capital but he introduces a third form of capital, “social capital” that he argues may matter every bit as much to the wealth and well being of society. He defines social capital as the “relationships, trust and co-operation forged between different groups of people over time. It is the sociological glue that binds diverse societies into a cohesive whole”. The concept of “trust” is at the heart of Haldane’s definition of social capital.

Haldane cites evidence that trust plays an important role at both the micro and macro level in value creation and growth and concludes that “… a lack of trust jeopardises one of finance’s key societal functions – higher growth”.

In discussing these trends, Haldane distinguishes between “personalised trust” and “generalised trust”. The former refers to mutual co-operation built up through repeated personal interactions (Haldane cites examples like visits to the doctor or hairdresser) while the latter is attached to an identifiable but anonymous group (Haldane cites trust in the rule of law, or government or Father Christmas).

He uses this distinction to explore why banks have lost the trust of the community.

He notes that banking was for most of its history a relationship based business. The business model was not perfect but it did deliver repeated interactions with customers that imbued banking with personalised trust. At the same time its “mystique” (Haldane’s term) meant that banking maintained a high degree of generalised trust as well.

He cites the reduction in local branches, a common strategy pre GFC, as one of the changes that delivered lower costs but reduced personal connections thereby contributing to reducing personalised trust. For a while, the banking system could reap the efficiency gains while still relying on generalised trust but the GFC subsequently undermined the generalised trust in the banking system. This generalised trust has been further eroded by the continued run of banking scandals that convey the sense that banks do not care about their customers.

What can be done to restore trust in finance

He notes the role that higher capital and liquidity have played but argues that this is not enough. He proposes three paths:

  1. Enhanced public education
  2. Creating “Purpose” in banking
  3. Communicating “Purpose” in banking

Regarding public education, there is a telling personal anecdote he offers on his experience with pensions. He describes himself as “moderately financially literate” but follows with “Yet I confess to not being able to make the remotest sense of pensions. Conversations with countless experts and independent financial advisors have confirmed for me only one thing – that they have no clue either”. This may be dismissed as hyperbole but it does highlight that most people will be less financially literate than Haldane and are probably poorly equipped to deal with the financial choices they are required to make in modern society. I am not sure that education is the whole solution.

Regarding “purpose”, Haldane’s main point seems to be that there is too much emphasis on shareholder value maximisation and not enough balance. This seems to be an issue that is amplified by the UK Companies Act, which requires that directors place shareholder interests as their primary objective. To the best of my knowledge, the Australian law does not have an equivalent explicit requirement to put shareholders first but we do grapple with the same underlying problem. Two of my recent posts (“The World’s Dumbest Idea” and “The Moral Economy”) touch on this issue.

Regarding communicating purpose, Haldane cites some interesting evidence that the volume of information provided by companies is working at cross purposes with actual communication with stakeholders. Haldane does not make the explicit link but Pillar 3 clearly increases the volume of information provided by banks. The points raised by Haldane imply (to me at least) that Pillar 3 might actually be getting in the way of communicating clearly with stakeholders.

This is a longish post but I think there is quite a lot of useful content in the speech so I would recommend it.

Recently read – “The Moral Economy: Why Good Incentives Are No Substitute For Good Citizens” by Samuel Bowles

The potential for incentives to create bad behaviour has been much discussed in the wake of the GFC while the Financial Services Royal Commission in Australia has provided a fresh set of examples of bankers behaving badly. It is tempting of course to conclude that bankers are just morally corrupt but, for anyone who wants to dig deeper, this book offers an interesting perspective on the role of incentives in the economy.

What I found especially interesting is Bowles’ account of the history of how the idea that good institutions and a free market based economy could “harness self interest to the public good” has come to dominate so much of current economic and public policy. Building on this foundation, the book examines the ways in which incentives designed around the premise that people are solely motivated by self interest can often be counter-productive, either by crowding out desirable behaviour or by prompting people to behave in ways that are the direct opposite of what was intended.

Many parts of this story are familiar but it was interesting to see how Bowles charted the development of the idea over many centuries and across individual contributors. People will no doubt be familiar with Adam Smith’s “Invisible Hand” but Bowles also introduces other thinkers who contributed to this conceptual framework, Machiavelli and David Hume in particular. The idea is neatly captured in this quote from Hume’s Essays: Moral, Political and Literary (1742) in which he recommended the following maxim:

“In contriving any system of government … every man ought to be supposed to be a knave and to have no other end … than private interest. By this interest we must govern him, and, by means of it, make him notwithstanding his insatiable avarice and ambition, cooperate to public good” .

Bowles makes clear that this did not mean that people are in fact solely motivated by self-interest (i.e “knaves”), simply that civic virtue (i.e. creating good people) by itself was not a robust platform for achieving good outcomes. The pursuit of self interest, in contrast, came to be seen as a benign activity that could be harnessed for a higher purpose.

The idea of embracing self-interest is of course anathema to many people but its intellectual appeal is I think obvious. Australian readers at this point might be reminded of Jack Lang’s maxim “In the race of life, always back self-interest; at least you know it’s trying”. Gordon Gekko’s embrace of the principle that “Greed is good” is the modern expression of this intellectual tradition.

Harnessing self-interest for the common good

Political philosophers had for centuries focused on the question of how to promote civic virtue but their attention turned to finding laws and other public policies that would allow people to pursue their personal objectives, while also inducing them to take account of the effects of their actions on others. The conceptual foundations laid down by David Hume and Adam Smith were progressively built on with competition and well defined property rights coming to be seen as important parts of the solution.

“Good institutions displaced good citizens as the sine qua non of good government. In the economy, prices would do the work of morals”

“Markets thus achieved a kind of moral extraterritoriality … and so avarice, repackaged as self-interest, was tamed, transformed from a moral failing to just another kind of motive”

Free market determined prices were at the heart of the system that allowed the Invisible Hand to work its magic but economists recognised that competition alone was not sufficient for market prices to capture everything that mattered. For the market to arrive at the right (or most complete) price, it was also necessary that economic interactions be governed by “complete contracts” (i.e. contracts that specify the rights and duties of the buyer and seller in all future states of the world).

This is obviously an unrealistic assumption. Apart from the difficulty of imagining all future states of the world, not everything of value can be priced. But all was not lost. Bowles introduces Alfred Marshall and Arthur Pigou who identified, in principle, how a system of taxes and subsidies could be devised that compensated economic actors for benefits their actions conferred on others and made them liable for costs they imposed on others.

These taxes and subsidies are of course not always successful and Bowles offers a taxonomy of reasons why this is so. Incentives can work but not, according to Bowles, if they simplistically assume that the target of the incentive cares only about his or her material gain. To be effective, incentives must account for the fact that people are much more complex, social and moral than is strictly rational from an economic perspective. Bowles devotes a lot of the book to the problem with incentives (both positive and negative, including taxes, fines, subsidies, bonuses etc) which he categorises under three headings:

  1. “Bad News”: incentives send a signal, and the tendency is for people to read things into incentives which may not have been intended but prompt them to respond negatively (e.g. does this incentive signal that the other party believes I am not trustworthy or lazy?)
  2. “Moral Disengagement”: the incentive may create a context in which the subject can distance themselves from the moral consequences of how they respond
  3. “Control Aversion”: an incentive that compromises a subject’s sense of autonomy or pride in the task may reduce their intrinsic motivation to perform the task well

Having noted the ways that incentives can have adverse impacts on behaviour, Bowles notes that civic minded values continue to be an important feature of market based economies and examines why this might be.

“If incentives sometimes crowd out ethical reasoning, the desire to help others, and intrinsic motivations, and if leading thinkers celebrate markets as a morality-free zone, it seems just a short step to Karl Marx’s broadside condemnation of capitalist culture”

One answer is that trading in markets encourages people to trust strangers and that the benefits of trading over time teach people that trust is a valuable commodity (the so called “doux commerce” theory).

While admitting his answer is speculative, Bowles rejects “doux commerce” as the whole answer. He argues that the institutions (property rights, rule of law, etc) developed by liberal societies to protect citizens from worst-case outcomes such as personal injury, loss of property, and other calamities make the consequences of mistakenly trusting a defector much less dire. As a result, the rule of law lowers the bar for how much you would have to know about your partner before trusting him or her, thereby promoting the spread of trusting expectations and hence of trusting behavior in a population.

The “institutional structure” theory is interesting but there is still much in the book worth considering even if you don’t buy his explanation. I have some more detailed notes on the book here.

Lessons for banking in Pixar’s approach to dealing with uncertainty and the risk of failure.

The report on the Prudential Inquiry into the CBA (“CBA Report”) is obviously required reading in banking circles this week. Plenty has been written on the topic already so I will try to restrain myself unless I can find something new to add to the commentary. However, while reading the report, I found myself drawing links to books that I think bankers would find well worth reading. These include “Foolproof” (by Greg Ip) and “The Success Equation: Untangling Skill and Luck in Business, Sports and Investing” (by Michael Mauboussin).

I have put up some notes on Foolproof here and intend to do the same for The Success Equation sometime soon. The focus for today’s post however is a book titled “Creativity, Inc” by Ed Catmull who founded and led Pixar. The overall theme of the book is about developing and sustaining a creative culture but dealing with risk and uncertainty emerges as a big part of this.

What does making movies have to do with banking?

One of the lessons Catmull emphasised was that, notwithstanding Pixar’s success, it was important not to lose sight of the role that random factors play in both success and failure. A quote from Ch 8 illustrates this point:

“… a lot of our success came because we had pure intentions and great talent, and we did a lot of things right, but I also believe that attributing our success solely to our own intelligence without acknowledging the role of accidental events, diminishes us.”

He goes on to describe how success can be a trap for the following reasons:

  • it creates the impression that what you are doing must be right,
  • it tempts you to overlook hidden problems and
  • you may be confusing luck with skill.

There is a discussion in Ch 9 of the kinds of things that can lead you to misunderstand the real nature of both your success and your failure. These include various cognitive biases (such as “confirmation bias”, where you weight information that supports what you believe more heavily than the counter evidence) and the mental models we use to simplify the world in which we operate. These are hard wired into us so the best we can do is be aware of how these things can take us off track; that at least puts us ahead of those who blindly follow their mental models and biases.

His answer to building the capacity to adapt to change and respond to setbacks is to trust in people but trust does not mean you trust that people won’t make mistakes. Catmull accepts setbacks and screw ups as an inevitable part of being creative and innovative but trust is demonstrated when you support your people when they do screw up and trust them to find the solution.

This is interesting because the CBA Report indicates that CBA did in fact place a great deal of trust in their executive team and senior leaders, which implies trust alone is not enough. The missing ingredients in CBA’s case were accountability and consequence when the team failed to identify, escalate and resolve problems.

The other interesting line of speculation is whether CBA’s risk culture might have benefited from a deeper reflection on the difference between skill and luck. Mauboussin’s book (“The Success Equation”) is particularly good in the way he lays out a framework for making this distinction.

I plan to come back to this topic once I have completed a review of Mauboussin’s book but in the interim I can recommend all of the books mentioned in this post.

“Between Debt and the Devil: Money, Credit and Fixing Global Finance” by Adair Turner (2015)

This book is worth reading, if only because it challenges a number of preconceptions that bankers may have about the value of what they do. The book also benefits from the fact that author was the head of the UK Financial Services Authority during the GFC and thus had a unique inside perspective from which to observe what was wrong with the system. Since leaving the FSA, Turner has reflected deeply on the relationship between money, credit and the real economy and argues that, notwithstanding the scale of change flowing from Basel III, more fundamental change is required to avoid a repeat of the cycle of financial crises.

Overview of the book’s main arguments and conclusions

Turner’s core argument is that increasing financial intensity, represented by credit growing faster than nominal GDP, is a recipe for recurring bouts of financial instability.

Turner builds his argument by first considering the conventional wisdom guiding much of bank prudential regulation prior to the GFC, which he summarises as follows:

  • Increasing financial activity, innovation and “financial deepening” were beneficial forces to be encouraged
  • More complete and liquid markets were believed to ensure more efficient allocation of capital thereby fostering higher productivity
  • Financial innovations made it easier to provide credit to households and companies thereby enabling more rapid economic growth
  • More sophisticated risk measurement and control meanwhile ensured that the increased complexity of the financial system was not achieved at the expense of stability
  • New systems of originating and distributing credit, rather than holding it on bank balance sheets, were believed to disperse risks into the hands of those best placed to price and manage them

Some elements of Turner’s account of why this conventional wisdom was wrong do not add much to previous analysis of the GFC. He notes, for example, the conflation of the concepts of risk and uncertainty that weakened the risk measurement models the system relied on and concludes that risk based capital requirements should be foregone in favour of a very high leverage ratio requirement. However, in contrast to other commentators who attribute much of the blame to the moral failings of bankers, Turner argues that this is a distraction. While problems with the way that bankers are paid need to be addressed, Turner argues that the fundamental problem is that:

  • modern financial systems left to themselves inevitably create debt in excessive quantities,
  • in particular, the system tends to create debt that does not fund new capital investment but rather the purchase of already existing assets, above all real estate.

Turner argues that the expansion of debt funding the purchase or trading of existing assets drives financial booms and busts, while the debt overhang left over by the boom explains why financial recovery from a financial crisis is typically anaemic and protracted. Much of this analysis seems to be similar to ideas developed by Hyman Minsky while the slow pace of recovery in the aftermath of the GFC reflects a theme that Reinhart and Rogoff have observed in their book titled “This time is different” which analyses financial crises over many centuries.

The answer, Turner argues, is to build a less credit intensive growth model. In pursuing this goal, Turner argues that we also need to understand and respond to the implications of three underlying drivers of increasing credit intensity:

  1. the increasing importance of real estate in modern economies,
  2. increasing inequality, and
  3. global current account imbalances.

Turner covers a lot of ground, and I do not necessarily agree with everything in his book, but I do believe his analysis of what is wrong with the system is worth reading.

Let me start with an argument I do not find compelling; i.e. that risk based capital requirements are unreliable because they are based on a fundamental misunderstanding of the difference between risk (which can be measured) and uncertainty (which cannot):

  • Distinguishing between risk and uncertainty is clearly a fundamental part of understanding risk and Turner is not alone in emphasising its importance
  • I believe that means that we should treat risk based capital requirements with a healthy degree of scepticism and a clear sense of their limitations but that does not render them entirely unreliable especially when we are using them to understand relative differences in risk and to calibrate capital buffers
  • The obvious problem with non-risk based capital requirements is that they create incentives for banks to take higher risk that may eventually offset the supposed increase in soundness attached to the higher capital
  • It may be that Turner discounts this concern because he envisages a lower credit growth/intensity economy delivering less overall systemic risk or because he envisages a more active role for the public sector in what kinds of assets banks lend against; i.e. his support for higher capital may stem mostly from the fact that this reduces the capacity of private banks to generate credit growth

While advocating much higher capital, Turner does seem to part company with M&M purists by expressing doubt that equity investors will be willing to accept deleveraged returns. His reasoning is that equity investors require a certain threshold return for an investment to be “equity like”, while a massively deleveraged ROE still carries downside risks that debt investors would find unacceptable.

Let me turn now to the arguments which I think raise valid concerns and deserve serious attention.

Notwithstanding my skepticism regarding a leverage ratio as the solution, the arguments he makes about the dangers of excessive credit growth resonate very strongly with what I learned during my banking career. Turner is particularly focussed on the downsides of applying excessive debt to the financing of existing assets, real estate in particular. The argument seems to be similar to (if not based on) the work of Hyman Minsky.

Turner’s description of the amount of money that banks can create as “infinitely elastic” seems an overstatement to me, especially in the Australian context where the Net Stable Funding Ratio (NSFR) weighs on the capacity to grow the balance sheet. However, his general point rings true: credit fuelled demand for a relatively inelastic supply of desirable residential property tends to result in inflated property values with no real social value.

What banks can do about this remains an open question given that resolving the problem with inelastic supply of property is outside their direct control but it is obviously important to understand the dynamics of the market underpinning their largest asset class and it may help them engage more constructively with public policy debates that seek to address the problem.

Turner’s analysis of the downsides of easy monetary policy (the standard response to economic instability) also rings true. He identifies the fact that lower interest rates tend to result in inflated asset values (residential property in particular given its perceived value as a safe asset) which do not address the fundamental problem of over-indebtedness and may serve to increase economic inequality. His discussion of the impact of monetary policy and easy credit on economic inequality is also interesting. The banks providing the credit in the easy money environment may not necessarily be taking undue risk and prudential supervisors have tools to ensure sound lending standards are maintained if they do believe there is a problem with asset quality. What may happen however is that the wealthier segments of society benefit the most under easy money because they have the surplus cash flow to buy property at inflated values while first homebuyers become squeezed out of the market. Again their capacity to address the problem may be limited but Turner’s analysis prompted me to reflect on what increasing economic inequality might mean for bank business models.

In addition to much higher bank capital requirements, Turner’s specific recommendations for moving towards a less credit intensive economy include:

  • Government policies related to urban development and the taxation of real estate
  • Changing tax regimes to reduce the current bias in favour of debt over equity financing (note that Australia is one of the few countries with a dividend imputation system that does reduce the bias to debt over equity)
  • Broader macro prudential powers for central banks, including the power to impose much larger countercyclical capital requirements
  • Tough constraints on the ability of the shadow banking system to create credit and money equivalents
  • Using public policy to produce different allocations of capital than would result from purely market based decisions; in particular, deliberately leaning against the market signal based bias towards real estate and instead favouring other “potentially more socially valuable forms of credit allocation”
  • Recognising that the traditional easy monetary policy response to an economic downturn (or ultra-easy in the case of a financial crisis such as the GFC) is better than doing nothing but comes at a cost of reigniting the growth in private credit that generated the initial problem, creating incentives for risky financial engineering and exacerbating economic inequality via inflating asset prices.

For those who want to dig deeper, I have gone into a bit more detail here on what Turner has to say about the following topics:

  • The way in which inefficient and irrational markets leave the financial system prone to booms and busts
  • The dangers of debt contracts sets out how certain features of these contracts increase the risk of instability and hamper the recovery
  • Too much of the wrong sort of debt describes features of the real estate market that make it different from other asset classes
  • Liberalisation, innovation and the credit cycle on steroids recaps on the philosophy that drove the deregulation of financial markets and what Turner believes to be the fundamental flaws with that approach. In particular his conclusion that the amount of credit created and its allocation is “… too important to be left to bankers…”
  • Private credit and money creation offers an outline of how bank deposits evolved to play an increasing role in the creation of money and credit (the key point being that it was a process of evolution rather than overt public policy design choices)
  • Credit financed speculation discusses the ways in which credit in modern economies tends to be used to finance the purchase of existing assets, in particular real estate, and the issues that flow from this.
  • Inequality, credit and more inequality sets out some ways in which the extension of credit can contribute to increasing economic inequality
  • Capital requirements sets out why Turner believes capital requirements should be significantly increased and why capital requirements (i.e. risk weights) for some asset classes (e.g. real estate) should be calibrated to reflect the social risk of the activity and not just the private risks captured by bank risk models
  • Turner’s defence against the argument that his proposals are anti-markets and anti-growth.

“The End of Alchemy” by Mervyn King

Anyone interested in the conceptual foundations of money and banking will, I think, find this book interesting. King argues that the significant enhancements to capital and liquidity requirements implemented since the GFC are not sufficient because of what he deems to be fundamental design flaws in the modern system of money and banking.

King is concerned with the process by which bank lending creates money in the form of bank deposits and with the process of maturity transformation in banking under which long term, illiquid assets are funded to varying degrees by short term liabilities including deposits. King applies the term “alchemy” to these processes to convey the sense that the value created is not real on a risk adjusted basis.

He concedes that there will be a price to pay in foregoing the “efficiency benefits of financial intermediation” but argues that these benefits come at the cost of a system that:

  • is inherently prone to banking crises because, even post Basel III, it is supported by too little equity and too little liquidity, and
  • can only be sustained in the long run by the willingness of the official sector to provide Lender of Last Resort liquidity support.

King’s radical solution is that all deposits must be 100% backed by liquid reserves, which would be limited to safe assets such as government securities or reserves held with the central bank. King argues that this removes the risk of, and incentive for, bank runs. For those with an interest in economic history, he acknowledges that this idea originated with “many of the most distinguished economists of the first half of the twentieth century”, who proposed an end to fractional reserve banking under a proposal that was known as the “Chicago Plan”. Since deposits are backed by safe assets, it follows that all other assets (i.e. loans to the private sector) must be financed by equity or long term debt.
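King’s proposal boils down to two balance sheet constraints, which can be made concrete with a stylised sketch. The figures below are entirely hypothetical, my own illustration rather than anything from the book:

```python
# Hypothetical bank balance sheet (illustrative figures only, not from King's book).
deposits = 100.0        # all deposits must be 100% backed by liquid reserves
safe_assets = 100.0     # government securities and central bank reserves
loans = 250.0           # all other assets: loans to the private sector
equity = 60.0
long_term_debt = 190.0

# Constraint 1: deposits are fully backed by safe, liquid assets.
assert safe_assets >= deposits, "deposits are not fully reserve-backed"

# Constraint 2: loans must be financed by equity or long term debt,
# never by runnable deposits.
assert equity + long_term_debt >= loans, "loans not funded by equity/long-term debt"

print("balance sheet satisfies the narrow-banking constraints")
```

On these numbers the bank passes both tests; shrink `equity` or grow `loans` and the second assertion fails, which is exactly the “alchemy” King wants to eliminate.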

The intended result is to separate:

  • safe, liquid “narrow” banks issuing deposits and carrying out payment services
  • from risky, illiquid “wide” banks performing all other activities.

At this point, King notes that the government could in theory simply stand back and allow the risk of unexpected events to impact the value of the equity and liabilities of the banks but he does not advocate this. This is partly because volatility of this nature can undermine consumer confidence but also because banks may be forced to reduce their lending in ways that have a negative impact on economic activity. So some form of central bank liquidity support remains necessary.

King’s proposed approach to central bank liquidity support is what he colloquially refers to as a “pawnbroker for all seasons”, under which the central bank agrees up front how much it will lend each bank against the collateral the bank can offer.

King argues that

“almost all existing prudential capital and liquidity regulation, other than a limit on leverage, could be replaced by this one simple rule”.

which “… would act as a form of mandatory insurance so that in the event of a crisis a central bank would be free to lend on terms already agreed and without the necessity of a penalty rate on its loans. The penalty, or price of the insurance, would be encapsulated by the haircuts required by the central bank on different forms of collateral”

leaving banks “… free to decide on the composition of their assets and liabilities… all subject to the constraint that alchemy in the private sector is eliminated”
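The “one simple rule” can be sketched numerically. The collateral classes, values and haircuts below are my own hypothetical numbers, not King’s; the point is only the mechanics of pre-agreed lending capacity net of haircuts:

```python
# Illustrative "pawnbroker for all seasons" sketch.
# Collateral values and haircuts are hypothetical, not from the book.
collateral = {
    # asset class: (market value, central bank haircut)
    "government_bonds": (80.0, 0.02),
    "mortgage_loans":   (150.0, 0.30),
    "corporate_loans":  (70.0, 0.40),
}

# Pre-agreed central bank lending capacity: collateral value net of haircuts.
# The haircut is the "price of the insurance" in King's scheme.
capacity = sum(value * (1 - haircut) for value, haircut in collateral.values())

runnable_liabilities = 180.0  # deposits and other short-term funding

# The rule: runnable liabilities must not exceed what the central bank
# has agreed in advance to lend against the bank's collateral.
rule_satisfied = capacity >= runnable_liabilities
print(f"lending capacity = {capacity:.1f}")
print("rule satisfied:", rule_satisfied)
```

Note that the bank remains free to choose its asset mix; the haircuts simply make riskier, less liquid collateral support less short-term funding.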

Underpinning King’s thesis are four concepts that appear repeatedly

  • Disequilibrium; King explores ways in which economic disequilibrium repeatedly builds up followed by disruptive change as the economy rebalances
  • Radical uncertainty; this is the term he applies to Knight’s concept of uncertainty as distinct from risk. He uses this to argue that any risk based approach to capital adequacy is not built on sound foundations because it will not capture the uncertain dimension of unexpected loss that we should really be concerned with
  • The “prisoner’s dilemma” to illustrate the difficulty of achieving the best outcome when there are obstacles to cooperation
  • Trust; he sees trust as the key ingredient that makes a market economy work but also highlights how fragile that trust can be.

My thoughts on King’s observations and arguments

Given that King headed the Bank of England during the GFC, and was directly involved in the revised capital and liquidity rules (Basel III) that were created in response, his opinions should be taken seriously. It is particularly interesting that, notwithstanding his role in the creation of Basel III, he argues that a much more radical solution is required.

I think King is right in pointing out that the banking system ultimately relies on trust and that this reliance in part explains why the system is fragile. Trust can and does disappear, sometimes for valid reasons but sometimes because fear simply takes over even when there is no real foundation for doubting the solvency of the banking system. I think he is also correct in pointing out that a banking system based on maturity transformation is inherently illiquid, and that the only way to achieve 100% certainty of liquidity is to have one class of safe, liquid “narrow” banks issuing deposits and another class of risky, illiquid institutions he labels “wide” banks providing funding on a maturity-matched basis. This second class of funding institution would arguably not be a bank at all, if we reserve that term for institutions which have the right to issue “bank deposits”.

King’s explanation of the way bank lending under the fractional reserve banking system creates money covers a very important aspect of how the modern banking and finance system operates. This is a bit technical but I think it is worth understanding because of the way it underpins and shapes so much of the operation of the economy. In particular, it challenges the conventional thinking that banks simply mobilise a fixed pool of deposits: the process of lending in fact creates new deposits which add to the money supply. For those interested in understanding this in more depth, the Bank of England published a short article in its Quarterly Bulletin (Q1 2014) that you can find at the following link.
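The mechanics are easiest to see in a minimal double-entry sketch. The figures are illustrative only; the point is that granting a loan credits the borrower’s deposit account, so both sides of the balance sheet expand and no pre-existing deposit is “lent out”:

```python
# Stylised bank balance sheet (illustrative figures only).
bank = {
    "assets":      {"loans": 0.0, "reserves": 50.0},
    "liabilities": {"deposits": 50.0},
}

def make_loan(bank, amount):
    """Granting a loan simultaneously credits the borrower's deposit account:
    a new asset (the loan) and a new liability (the deposit) are created,
    expanding the money supply rather than drawing down existing deposits."""
    bank["assets"]["loans"] += amount
    bank["liabilities"]["deposits"] += amount

money_before = bank["liabilities"]["deposits"]
make_loan(bank, 100.0)
money_after = bank["liabilities"]["deposits"]

print(f"deposits (money) grew from {money_before} to {money_after}")
```

Repaying the loan reverses both entries, which is why the stock of broad money shrinks as private credit is paid down.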

He is also correct, I think, in highlighting the limits of what risk based capital can achieve in the face of “radical uncertainty” but I don’t buy his proposal that the leverage ratio is the solution. He claims that his “pawnbroker for all seasons” approach is different from the standardised approach to capital adequacy but I must confess I can’t see that the approaches are that different. So even if you accept his argument that internal models are not a sound basis for regulatory capital, I would still argue that a revised and well calibrated standardised approach will always be better than a leverage ratio.
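The distinction at issue can be shown with some made-up numbers. The exposures and risk weights below are hypothetical, not actual Basel calibrations; the sketch simply contrasts a risk-insensitive leverage ratio with a standardised risk-weighted ratio, where the only difference is the vector of risk weights:

```python
# Hypothetical exposures and standardised risk weights (not regulatory values).
exposures = {
    # asset class: (exposure, risk weight)
    "sovereign_bonds":       (100.0, 0.00),
    "residential_mortgages": (300.0, 0.35),
    "corporate_loans":       (100.0, 1.00),
}
capital = 25.0

total_exposure = sum(e for e, _ in exposures.values())       # leverage denominator
rwa = sum(e * w for e, w in exposures.values())              # risk-weighted assets

leverage_ratio = capital / total_exposure   # ignores asset mix entirely
risk_weighted_ratio = capital / rwa         # reflects the riskiness of the mix

print(f"leverage ratio      = {leverage_ratio:.1%}")
print(f"risk-weighted ratio = {risk_weighted_ratio:.1%}")
```

Swap the mortgage book for corporate loans and the leverage ratio is unchanged while the risk-weighted ratio falls, which is why a well calibrated standardised approach carries information that a leverage ratio alone cannot.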

King’s treatment of the “Prisoner’s Dilemma” in money and banking is particularly interesting because it sets out a conceptual rationale for why markets will not always produce optimal outcomes when there are obstacles to cooperation. This brings to mind Chuck Prince’s infamous statement about being forced to “keep dancing while the music is playing” and offers a rationale for the role of regulation in helping institutions avoid situations in which competition impedes the ability of institutions to avoid taking excessive risk. This challenges the view that market discipline would be sufficient to keep risk taking in check. It also offers a different perspective on the role of competition in banking which is sometimes seen by economists as a panacea for all ills.

I have also attached a link to a review of King’s book by Paul Krugman.