Canada innovates in the capital buffer space

The Canadian prudential regulator, the Office of the Superintendent of Financial Institutions (OSFI), has made an interesting contribution to capital buffer design via its introduction of a Domestic Stability Buffer (DSB).

Key features of the Domestic Stability Buffer:

  • Applies only to Domestic Systemically Important Banks (D-SIBs) and is intended to cover a range of systemic vulnerabilities not captured by the Pillar 1 requirement
  • The vulnerabilities currently captured are (i) Canadian consumer indebtedness; (ii) asset imbalances in the Canadian market; and (iii) Canadian institutional indebtedness
  • Replaces a previously undisclosed Pillar 2 loading associated with this class of risks (individual banks may still be required to hold a Pillar 2 buffer for idiosyncratic risks)
  • Initially set at 1.5% of Total RWA and will vary within a range of 0 to 2.5%
  • Reviewed semi-annually (June and December), with the option to change more frequently in exceptional circumstances
  • Increases are phased in while decreases take effect immediately

Implications for capital planning:

  • DSB supplements the Pillar 1 buffers (Capital Conservation Buffer, D-SIB surcharge and the Countercyclical Buffer); a stylised example of the combined capital stack follows this list
  • Because it sits outside Pillar 1, a breach of the DSB does not trigger the automatic constraints on capital distributions that apply when the Pillar 1 buffers are breached
  • Banks will, however, be required to disclose that the buffer has been breached and OSFI will require a remediation plan to restore it
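
To make the capital stack concrete, here is a minimal sketch in Python. This is my own illustration, not OSFI's: the buffer settings are indicative only and the CCyB is assumed to be inactive.

```python
# Stylised CET1 requirement stack for a Canadian D-SIB.
# All settings are illustrative; actual requirements are set by OSFI.
PILLAR1_MINIMUM = 4.5    # Basel III CET1 minimum (% of RWA)
CCB = 2.5                # Capital Conservation Buffer
DSIB_SURCHARGE = 1.0     # D-SIB surcharge
CCYB = 0.0               # Countercyclical Buffer (assumed inactive here)
DSB = 1.5                # Domestic Stability Buffer (initial setting)

# Breaching the Pillar 1 buffers triggers automatic distribution constraints
mda_trigger = PILLAR1_MINIMUM + CCB + DSIB_SURCHARGE + CCYB

# Breaching the DSB triggers disclosure and a remediation plan rather than
# automatic constraints, because it sits outside Pillar 1
supervisory_expectation = mda_trigger + DSB

print(f"Automatic constraints apply below {mda_trigger:.1f}% of RWA")
print(f"DSB expectation sits at {supervisory_expectation:.1f}% of RWA")
```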

What is interesting:

  • OSFI argues that translating the existing Pillar 2 requirement into an explicit buffer offers greater transparency which, in turn, “… will support banks’ ability to use this capital buffer in times of stress by increasing the market’s understanding of the purpose of the buffer and how it should be used”
  • I buy OSFI’s rationale for why an explicit buffer with a clear narrative is a more usable capital tool than an undisclosed Pillar 2 requirement with the same underlying rationale
  • OSFI retains a separate Countercyclical Buffer, but the Domestic Stability Buffer seems similar (to me at least), though not identical, in its overriding purpose to the approach that the Bank of England (BoE) has adopted for managing its Countercyclical Buffer
  • A distinguishing feature of both the BoE and OSFI approaches is linking the buffer to a simple, coherent narrative that makes the buffer more usable by virtue of creating clear expectations of the conditions under which it can be used

The bottom line is that I see useful features in both the BoE and OSFI approaches to dealing with the inherent cyclicality of banking. I don’t see either proposal doing much to mitigate the cyclicality of banking, but I do see them offering more potential for managing the consequences of that cyclicality. Both approaches seem to me to offer material improvements over the Countercyclical Buffer as originally conceived by the BCBS.

It will be interesting to see if APRA chooses to adapt elements of this countercyclical approach to bank capital requirements.

If I am missing something, please let me know …

From the Outside

The answer is more loan loss provisions, what was the question?

I had been intending to write a post on the potential time bomb for bank capital embedded in IFRS 9 but Adrian Docherty has saved me the trouble. He recently released an update on IFRS 9 and CECL titled Much Ado About Nothing or Après Moi, Le Déluge?

This post is fairly technical so feel free to stop here if you are not a bank capital nerd. However, if you happen to read someone saying that IFRS 9 solves one of the big problems encountered by banks during the GFC then be very sceptical. Adrian (and I) believe that is very far from the truth. For those not discouraged by the technical warning, please read on.

The short version of Adrian’s note is:

  • The one-off transition impact of the new standard is immaterial and the market has largely ignored it
  • Market apathy will persist until stressed provisions are observed
  • The dangers of ECL provisioning (procyclical volatility, complexity and subjectivity) have been confirmed by the authorities …
  • … but criticism of IFRS 9 is politically incorrect since the “correct” narrative is that earlier loan loss provisioning fulfils the G20 mandate to address the problem encountered during the GFC
  • Regulatory adaptation has been limited to transition rules, which are not a solution. We need a fundamentally revised Basel regime – “Basel V” – in which lifetime ECL provisions somehow offset regulatory capital requirements.

Adrian quotes at length from Bank of England (BoE) commentary on IFRS 9. He notes that the BoE’s policy intention is that the loss-absorbing capacity of the banking system should not be impacted by the change in accounting standards, but he takes issue with the way the BoE has chosen to implement this policy. He also calls out the problem with the BoE instruction that banks should assume “perfect foresight” in their stress test calculations.

Adrian also offers a very useful deconstruction of what the European Systemic Risk Board had to say in a report it published in July 2017. He has created a table that sets out what the report says in one column and what it means in another (see page 8 of Adrian’s note).

This extract from Adrian’s note calls into question whether the solution developed is actually what the G20 asked for …

“In official documents, the authorities still cling to the assertion that ECL provisioning is good for financial stability “if soundly implemented” or “if properly applied”. They claim that the new standard “means that provisions for potential credit losses will be made in a timely way”. But what they want is contrarian, anti-cyclical ECL provisioning. This is simply not possible, in part because of human psychology but, more importantly, because the standard requires justifiable projections based on objective, consensual evidence.

Surely the authorities know they are wrong? Their arguments don’t stack up.

They hide behind repeated statements that the G20 instructed them to deliver ECL provisioning, whereas a re-read of the actual instructions clearly shows that a procyclical, subjective and complex regime was not what was asked for.

It just doesn’t add up.”

There is of course no going back at this point, so Adrian (rightly I think) argues that the solution lies in a change to banking regulation to make Basel compatible with ECL provisioning. I will quote Adrian at length here:

 “So the real target is to change banking regulation, to make Basel compatible with ECL provisioning. Doing this properly would constitute a genuine “Basel V”. Yes, the markets would still need to grapple with complex and misleading IFRS 9 numbers to assess performance. But if the solvency calculation could somehow adjust properly for ECL provisions, then solvency would be stronger and less volatile.

And, in an existential way, solvency is what really matters – it’s the sine qua non of a bank. Regulatory solvency drives the ability of a bank to grow the business and distribute capital. Accounting profit matters less than the generation of genuinely surplus solvency capital resources.

Basel V should remove or resolve the double count between lifetime ECL provisions and one-year unexpected loss (UL) capital resources. There are many different ways of doing this, for example:

A. Treat “excess provisions” (the difference between one-year ECL and lifetime ECL for Stage 2 loans) as CET1

B. Incorporate expected future margin as a positive asset, offsetting the impact of expected future credit losses

C. Reduce capital requirements by the amount of “excess provisions” (again, the difference between one-year ECL and lifetime ECL for Stage 2 loans) maybe with a floor at zero

D. Reduce minimum regulatory solvency ratios for banks with ECL provisioning (say, replacing the Basel 8% minimum capital ratio requirement to 4%)

All of these seem unpalatable at first sight! To get the right answer, there is a need to conduct a fundamental rethink. Sadly, there is no evidence that this process has started. The last time that there was good thinking on the nature of capital from Basel was some 17 years ago. It’s worth re-reading old papers to remind oneself of the interaction between expected loss, unexpected loss and income.  The Basel capital construct needs to be rebuilt to take into account the drastically different meaning of the new, post-IFRS 9 accounting equity number.”
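
To make the “excess provisions” arithmetic concrete, here is a stylised sketch of option C. The numbers are invented for illustration and the sketch is mine, not Adrian’s.

```python
# Option C sketch: reduce the capital requirement by "excess provisions"
# (lifetime ECL less one-year ECL on Stage 2 loans), floored at zero.
# All figures are invented for illustration ($m).
stage2_lifetime_ecl = 120.0     # lifetime ECL on Stage 2 loans
stage2_one_year_ecl = 45.0      # one-year ECL on the same loans
ul_capital_requirement = 800.0  # unexpected-loss capital requirement

excess_provisions = max(stage2_lifetime_ecl - stage2_one_year_ecl, 0.0)
adjusted_requirement = max(ul_capital_requirement - excess_provisions, 0.0)

print(f"Excess provisions: ${excess_provisions:.0f}m")
print(f"Capital requirement after offset: ${adjusted_requirement:.0f}m")
```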

Hopefully this post will encourage you to read Adrian’s note and to recognise that IFRS 9 is not the cycle-mitigating saviour of banking it is represented to be. The core problem is not so much IFRS 9 itself (though its complexity and subjectivity are issues) but that bank capital requirements are not constructed in a way that compensates for the inherent cyclicality of the banking industry. The ideas Adrian has listed above are potentially part of the solution, as is revisiting the way the Countercyclical Capital Buffer is intended to operate.

From the Outside


Worth Reading “The Money Formula” by Paul Wilmott and David Orrell.

The full title of this book, co-written by Paul Wilmott and David Orrell, is “The Money Formula: Dodgy Finance, Pseudo Science, and How Mathematicians Took over the Markets“. There are plenty of critiques of modelling and quantitative finance by outsiders throwing rocks, but Wilmott is a quant and brings an insider’s technical knowledge to the question of what these tools can do, can’t do and, perhaps most importantly, should not be used to do. Consequently, the book offers a more nuanced perspective on the strengths and limitations of quantitative finance than the “let’s scrap the whole thing” school of thought. I have made some more detailed notes which follow the structure of the book, but this post focuses on a couple of ideas I found especially interesting or useful.

I am not a quant so my comments should be read with that in mind, but the core idea I took away is that, much as quants would want it otherwise, markets are not governed by fundamental laws, deterministic or probabilistic, that allow risk to be measured with precision. These ideas work reasonably well within their “zone of validity” but a more complete answer (or model) has to recognise where the zones stop and uncertainty rules. Wilmott and Orrell argue market outcomes are better thought of as the “emergent result of complex transactions”. The role of money in these emergent results is especially important, as is the capacity of models themselves to materially reshape the risk of the markets they are attempting to measure.

The Role of Money

Some quotes drawn from Chapter 8 will let the authors speak for themselves on the role of money …

“Consider … the nature of money. Standard economic definitions of money concentrate on its roles as a “medium of exchange,” a “store of value,” and a “unit of account.” Economists such as Paul Samuelson have focused in particular on the first, defining money as “anything that serves as a commonly accepted medium of exchange.” … Money is therefore not something important in itself; it is only a kind of token. The overall picture is of the economy as a giant barter system, with money acting as an inert facilitator.” (emphasis added)

“However … money is far more interesting than that, and actually harbors its own kind of lively, dualistic properties. In particular, it merges two things, number and value, which have very different properties: number lives in the abstract, virtual world of mathematics, while valued objects live in the real world. But money seems to be an active part of the system. So ignoring it misses important relationships. The tension between these contradictory aspects is what gives money its powerful and paradoxical qualities.” (emphasis added)

“The real and the virtual become blurred, in physics or in finance. And just as Newtonian theories break down in physics, so our Newtonian approach to money breaks down in economics. In particular, one consequence is that we have tended to take debt less seriously than we should.” (emphasis added)

“Instead of facing up to the intrinsically uncertain nature of money and the economy, relaxing some of those tidy assumptions, accepting that markets have emergent properties that resist reduction to simple laws, and building a new and more realistic theory of economics, quants instead glommed on to the idea that, when a system is unpredictable, you can just switch to making probabilistic predictions.” (emphasis added)

“The efficient market hypothesis, for example, was based on the mechanical analogy that markets are stable and perturbed randomly by the actions of atomistic individuals. This led to probabilistic risk-analysis tools such as VaR. However, in reality, the “atoms” are not independent, but are closely linked … The result is the non-equilibrium behaviour … observed in real markets. Markets are unpredictable not because they are efficient, but because of a financial version of the uncertainty principle.” (emphasis added)

 The Role of Models

Wilmott and Orrell devote a lot of attention to the ways in which models no longer just describe, but start to influence, the markets being modelled, mostly by encouraging people to take on more risk based, in part, on a false sense of security …

“Because of the bankers’ insistence on treating complex finance as a university end-of-term exam in probability theory, many of the risks in the system are hidden. And when risks are hidden, one is led into a false sense of security. More risk is taken so that when the inevitable happens, it is worse than it could have been. Eventually the probabilities break down, disastrous events become correlated, the cascade of dominoes is triggered, and we have systemic risk …. None of this would matter if the numbers were small … but the numbers are huge” (Chapter 10 – emphasis added)

They see High Frequency Trading as the area most likely to give rise to a future systemic crisis but also make a broader point about the tension between efficiency and resilience …

“With complex systems, there is usually a trade-off between efficiency and robustness …. Introducing friction into the system – for example by putting regulatory brakes on HFT – will slow the markets, but also make them more transparent and reliable. If we want a more robust and resilient system then we probably need to agree to forego some efficiency” (Chapter 10 – emphasis added)

The Laws of Finance

Wilmott and Orrell note the extent to which finance has attempted to identify laws which are analogous to the laws of physics and the ways in which these “laws” have proved to be more of a rough guide.

 “… the “law of supply and demand” … states that the market for a particular product has a certain supply, which tends to increase as the price goes up (more suppliers enter the market). There is also a certain demand for the product, which increases as the price goes down.”

“… while the supply and demand picture might capture a general fuzzy principle, it is far from being a law. For one thing, there is no such thing as a stable “demand” that we can measure independently – there are only transactions.”

“Also, the desire for a product is not independent of supply, or other factors, so it isn’t possible to think of supply and demand as two separate lines. Part of the attraction of luxury goods – or for that matter more basic things, such as housing – is exactly that their supply is limited. And when their price goes up, they are often perceived as more desirable, not less.” (emphasis added)

This example is relevant for banking systems (such as Australia’s) where residential mortgage lending dominates bank balance sheets, even more so given that public debate about the risk associated with housing often seems to be predicated on the economics 101 version of the laws of supply and demand.

The Power (and Danger) of Ideas

A recurring theme throughout the book is the way in which economists and quants have borrowed ideas from physics without recognising the limitations of the analogies and assumptions they have relied on to do so. Wilmott and Orrell credit Sir Isaac Newton as one of the inspirations behind Adam Smith’s idea of the “Invisible Hand” co-ordinating the self-interested actions of individuals for the good of society. When the quantum revolution saw physics embrace a probabilistic approach, economists followed.

I don’t think Wilmott and Orrell make this point directly, but a recurring thought while reading the book was the power of ideas not just to interpret the underlying reality but also to shape the way the economy and society develop, not always for the better. Consider:

  • Economic laws that drive markets towards equilibrium as their natural state
  • The “invisible hand” operating in markets to reconcile individual self interest with optimal outcomes for society as a whole
  • The Efficient Market Hypothesis as an explanation for why markets are unpredictable

These ideas have widely influenced quantitative finance in a variety of domains and they all contribute useful insights; the key is to not lose sight of their zone of validity.

“… Finance … took exactly the wrong lesson from the quantum revolution. It held on to its Newtonian, mechanistic, symmetric picture of an intrinsically stable economy guided to equilibrium by Adam Smith’s invisible hand. But it adopted the probabilistic mathematics of stochastic calculus.” (Chapter 8, emphasis added)

Where to from here?

It should be obvious by now that the authors are arguing that risk and reward cannot be reduced to hard numbers in the way that physics has used similar principles and tools to generate practical insights into how the world works. Applying a bit of simple math in finance seems to open the door to getting some control over an unpredictable world and, even better, to pursuing optimisation strategies that allow the cognoscenti to fine-tune the balance between risk and reward. There is room for more complex math as well for those so inclined, but the book sides with the increasingly widely held view that simple math is enough to get you into trouble and further complexity is best avoided if possible.

Wilmott and Orrell highlight mathematical biology in general, and a book by Jim Murray on the topic, as a source of better ways to approach many of the more difficult modelling challenges in finance and economics. They start by listing a series of phenomena in biological models that seem to be useful analogues for what happens in financial markets. They concede that the models used in mathematical biology are almost all “toy” models. None of these models offer precise or determined outcomes, but all can be used to explain what is happening in nature and offer insights into solutions for problems like disease control, epidemics and conservation.

The approach they advocate seems to have a lot in common with the Agent Based Modelling approach that Andrew Haldane references (see his paper “Tails of the Unexpected“) and that is the focus of Bookstaber’s book (“The End of Theory”).

In their words …

“Embrace the fact that the models are toy, and learn to work within any limitations.”

“Focus more attention on measuring and managing resulting model risk, and less time on complicated new products.”

“… only by remaining both skeptical and agile can we learn. Keep your models simple, but remember they are just things you made up, and be ready to update them as new information comes in.”

I fear I have not done the book justice but I got a lot out of it and can recommend it highly.


The financial cycle and macroeconomics: What have we learnt? BIS Working Paper

Claudio Borio at the BIS wrote an interesting paper exploring the “financial cycle”. This post seeks to summarise the key points of the paper and draw out some implications for bank stress testing (the original paper can be found here). The paper was published in December 2012, so its discussion of the implications for macroeconomic modelling may be dated, but I believe it continues to offer useful insights into the challenges banks face in dealing with adverse economic conditions and the boundary between risk and uncertainty.

Key observations Borio makes regarding the Financial Cycle

The concept of a “business cycle”, in the sense of there being a regular occurrence of peaks and troughs in business activity, is widely known but the concept of a “financial cycle” is a distinct variation on this theme that is possibly less well understood. Borio states that there is no consensus definition but he uses the term to

“denote self-reinforcing interactions between perceptions of value and risk, attitudes towards risk and financing constraints, which translate into booms followed by busts. These interactions can amplify economic fluctuations and possibly lead to serious financial distress and economic disruption”.

This definition is closely related to the concept of “procyclicality” in the financial system and should not be confused with a generic description of cycles in economic activity and asset prices. Borio does not use these words but I have seen the term “balance sheet recession” employed to describe much the same phenomenon as Borio’s financial cycle.

Borio identifies five features that describe the Financial Cycle

  1. It is best captured by the joint behaviour of credit and property prices – these variables tend to closely co-vary, especially at low frequencies, reflecting the importance of credit in the financing of construction and the purchase of property.
  2. It is much longer, and has a much larger amplitude, than the traditional business cycle – the business cycle involves frequencies of 1 to 8 years whereas the average length of the financial cycle is longer; Borio cites a cycle length of 16 years in a study of seven industrialised economies and I have seen other studies indicating a longer cycle (with more severe impacts).
  3. It is closely associated with systemic banking crises, which tend to occur close to its peak.
  4. It permits the identification of the risks of future financial crises in real time and with a good lead – Borio states that the most promising leading indicators of financial crises are based on simultaneous positive deviations of the ratio of private sector credit-to-GDP and asset prices, especially property prices, from historical norms (a sketch of this indicator follows this list).
  5. It is highly dependent on the financial, monetary and real-economy policy regimes in place (e.g. financial liberalisation under Basel II, monetary policy focused primarily on inflation targeting and globalisation in the real economy).
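
The credit-to-GDP gap indicator referenced in point 4 (and used in the calibration of the Basel III CCyB) can be sketched in a few lines of Python. This is my own simplified rendering of the BCBS methodology, assuming quarterly data, a one-sided HP filter with the BCBS smoothing parameter (lambda = 400,000), and the published mapping of the gap to a 0-2.5% buffer guide between thresholds of 2 and 10 percentage points.

```python
# Sketch of the BCBS credit-to-GDP gap and CCyB buffer guide.
# Simplified rendering; not a substitute for the BCBS guidance itself.
import numpy as np
from statsmodels.tsa.filters.hp_filter import hpfilter

def credit_to_gdp_gap(ratio: np.ndarray, lamb: float = 400_000) -> np.ndarray:
    """One-sided HP filter gap: ratio less its recursively estimated trend."""
    gap = np.full(len(ratio), np.nan)
    for t in range(12, len(ratio)):                  # allow a start-up window
        _, trend = hpfilter(ratio[: t + 1], lamb=lamb)
        gap[t] = ratio[t] - trend[-1]                # use only data known at t
    return gap

def ccyb_guide(gap: float, low: float = 2.0, high: float = 10.0) -> float:
    """Map the gap (percentage points) linearly to a 0-2.5% buffer guide."""
    return float(np.clip((gap - low) / (high - low), 0.0, 1.0)) * 2.5
```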

Macroeconomic modelling

Borio also argues that the conventional models used to analyse the economy are deficient because they do not capture the dynamics of the financial cycle. These extracts capture the main points of his critique:

“The notion… of financial booms followed by busts, actually predates the much more common and influential one of the business cycle …. But for most of the postwar period it fell out of favour. It featured, more or less prominently, only in the accounts of economists outside the mainstream (eg, Minsky (1982) and Kindleberger (2000)). Indeed, financial factors in general progressively disappeared from macroeconomists’ radar screen. Finance came to be seen effectively as a veil – a factor that, as a first approximation, could be ignored when seeking to understand business fluctuations … And when included at all, it would at most enhance the persistence of the impact of economic shocks that buffet the economy, delaying slightly its natural return to the steady state …”

“Economists are now trying hard to incorporate financial factors into standard macroeconomic models. However, the prevailing, in fact almost exclusive, strategy is a conservative one. It is to graft additional so-called financial “frictions” on otherwise fully well behaved equilibrium macroeconomic models, built on real-business-cycle foundations and augmented with nominal rigidities. The approach is firmly anchored in the New Keynesian Dynamic Stochastic General Equilibrium (DSGE) paradigm.”

“The purpose of this essay is to summarise what we think we have learnt about the financial cycle over the last ten years or so in order to identify the most promising way forward…. The main thesis is that …it is simply not possible to understand business fluctuations and their policy challenges without understanding the financial cycle”

There is an interesting discussion of the public policy (i.e. prudential, fiscal, monetary) associated with recognising the role of the financial cycle but I will focus on what implications this may have for bank management in general and stress testing in particular.

Insights and questions we can derive from the paper

The observation that financial crises are based on simultaneous positive deviations of the ratio of private sector credit-to-GDP and asset prices, especially property prices, from historical norms covers much the same ground as the Basel Committee’s Countercyclical Capital Buffer (CCyB) and is something banks would already monitor as part of the ICAAP. The interesting question the paper poses for me is the extent to which stress testing (and ICAAP) should focus on a “financial cycle” style disruption as opposed to a business cycle event. Even more interesting is the question of whether the higher severity of the financial cycle is simply an exogenous random variable or an endogenous factor that can be attributed to excessive credit growth. 

I think this matters because it has implications for how banks calibrate their overall risk appetite. The severity of the downturns employed in stress testing has in my experience gradually increased over successive iterations. My recollection is that this has partly been a response to prudential stress tests which were more severe in some respects than might have been determined internally. In the absence of any objective absolute measure of what was severe, it probably made sense to turn up the dial on severity in places to align as far as possible the internal benchmark scenarios with prudential benchmarks such as the “Common Scenario” APRA employs.

At the risk of gross over-simplification, I think banks started the stress testing process looking at both moderate downturns (e.g. 7-10 year frequency and relatively short duration) and severe recessions (say a 25 year cycle, though still a relatively short duration downturn). Bank supervisors, in contrast, have tended to focus more on severe recession and financial cycle style severity scenarios with more extended durations. Banks have progressively shifted their attention to scenarios more closely aligned with the severe recession assumed by supervisors, in part because moderate recessions tend to be fairly manageable from a capital management perspective.

Why does the distinction between the business cycle and the financial cycle matter?

Business cycle fluctuations (in stress testing terms a “moderate recession”) are arguably an inherent feature of the economy that occur largely independently of the business strategy and risk appetite choices that banks make. However, Borio’s analysis suggests that the decisions that banks make (in particular the rate of growth in credit relative to growth in GDP and the extent to which the extension of bank credit contributes to inflated asset values) do contribute to the risk (i.e. probability, severity and duration) of a severe financial cycle style recession. 

Borio’s analysis also offers a way of thinking about the nature of the recovery from a recession. A moderate business cycle style recession is typically assumed to be short with a relatively quick recovery whereas financial cycle style recessions typically persist for some time. The more drawn out recovery from a financial cycle style recession can be explained by the need for borrowers to deleverage and repair their balance sheets as part of the process of addressing the structural imbalances that caused the downturn.

If the observations above are true, then they suggest a few things to consider:

  • should banks explore a more dynamic approach to risk appetite limits that incorporates the metrics identified by Borio (and also used in the calibration of the CCyB), so that the level of risk they are willing to take adjusts for where they believe they are in the cycle (and which kind of cycle they are in)?
  • how should banks think about these more severe financial cycle losses? Their measure of Expected Loss should clearly incorporate the losses expected from business cycle style moderate recessions occurring once every 7-10 years, but it is less clear that the kinds of more severe and drawn out losses expected under a Severe Recession or Financial Cycle downturn should be part of Expected Loss.

A more dynamic approach to risk appetite gets us into some interesting game theory puzzles because a decision by one bank to pull back on risk appetite potentially allows competitors to benefit by writing more business, and potentially to benefit doubly to the extent that the decision to pull back makes it safer for competitors to write the business without fear of a severe recession (in technical economist speak, a “collective action” problem). This is similar to the problem APRA faced when it decided to impose “speed limits” on certain types of lending in 2017. The Royal Commission was not especially sympathetic to the strategic bind banks face but I suspect that APRA understands the problem.

How do shareholders think about these business and financial cycle losses? Some investors will adopt a “risk on-risk off” approach in which they attempt to predict the downturn and trade in and out based on that view, while other “buy and hold” investors (especially retail) may be unable or unwilling to adopt a trading approach.

The dependence of the financial cycle on the fiscal and monetary policy regimes in place, and on changes in the real economy, also has potential implications for how banks think about the risk of adverse scenarios playing out. Many of the factors that Borio argues have contributed to the financial cycle (i.e. financial liberalisation under Basel II, monetary policy focused primarily on inflation targeting and globalisation in the real economy) are reversing: regulation of banks is much more restrictive, monetary policy appears to have recognised the limitations of a narrow inflation target focus and the pace of globalisation appears to be slowing in response to a growing concern that its benefits are not shared equitably. I am not sure exactly what these changes mean other than to recognise that they should in principle have some impact. At a minimum, it seems the pace of credit expansion might be slower in the coming decades than it has been in the past 30 years.

All in all, I find myself regularly revisiting this paper, referring to it or employing the distinction between the business and financial cycle. I would recommend it to anyone interested in bank capital management. 

The rise of the normal distribution

“We were all Gaussians now”

This post focuses on a joint paper written in 2012 by Andrew Haldane and Benjamin Nelson titled “Tails of the unexpected”. The topic is the normal distribution which is obviously a bit technical but the paper is still readable even if you are not deeply versed in statistics and financial modelling. The condensed quote below captures the central idea I took away from the paper.

“For almost a century, the world of economics and finance has been dominated by randomness … But as Nassim Taleb reminded us, it is possible to be Fooled by Randomness (Taleb (2001)). For Taleb, the origin of this mistake was the ubiquity in economics and finance of a particular way of describing the distribution of possible real world outcomes. For non-nerds, this distribution is often called the bell-curve. For nerds, it is the normal distribution. For nerds who like to show-off, the distribution is Gaussian.”

The idea that the normal distribution should be used with care, and sometimes not at all, when seeking to analyse economic and financial systems is not news. The paper’s discussion of why this is so is useful if you have not considered the issues before but probably does not offer much new insight if you have.

What I found most interesting was the back story behind the development of the normal distribution. In particular, the factors that Haldane and Nelson believe help explain why it came to be so widely used and misused. Reading the history reminds us of what a cool idea it must have been when it was first discovered and developed.

“By simply taking repeat samplings, the workings of an uncertain and mysterious world could seemingly be uncovered.”

“To scientists seeking to explain the world, the attraction of the normal curve was obvious. It provided a statistical map of a physical world which otherwise appeared un-navigable. It suggested regularities in random real-world data. Moreover, these patterns could be fully described by two simple metrics – mean and variance. A statistical window on the world had been opened.”

Haldane and Nelson highlight a semantic shift in the 1870s when the term “normal” began to be independently applied to this statistical distribution. They argue that adopting this label helped embed the idea that the “normal distribution” was the “usual” outcome that one should expect to observe.

“In the 18th century, normality had been formalised. In the 19th century, it was socialised.”

“Up until the late 19th century, no statistical tests of normality had been developed. Having become an article of faith, it was deemed inappropriate to question the faith. As Hacking put it, ‘thanks to superstition, laziness, equivocation, befuddlement with tables of numbers, dreams of social control, and propaganda from utilitarians, the law of large numbers became a synthetic a priori truth. We were all Gaussians now.’”

Notwithstanding its widespread use today, in Haldane and Nelson’s account economics and finance were not early adopters of the statistical approach to analysis but eventually became enthusiastic converts. The influence of physics on the analytical approaches employed in economics is widely recognised and Haldane cites the rise of probability-based quantum physics over old school deterministic Newtonian physics as one of the factors that prompted economists to embrace probability and the normal distribution as a key tool.

“… in the early part of the 20th century, physics was in the throes of its own intellectual revolution. The emergence of quantum physics suggested that even simple systems had an irreducible random element. In physical systems, Classical determinism was steadily replaced by statistical laws. The natural world was suddenly ruled by randomness.”

“Economics followed in these footsteps, shifting from models of Classical determinism to statistical laws.”

“Whether by accident or design, finance theorists and practitioners had by the end of the 20th century evolved into fully paid-up members of the Gaussian sect.”

Assessing the Evidence

Having outlined the story behind its development and increasingly widespread use, Haldane and Nelson then turn to the weight of evidence suggesting that normality is not a good statistical description of real-world behaviour. In its place, natural and social scientists have often unearthed behaviour consistent with an alternative distribution, the so-called power law distribution.

“In consequence, Laplace’s central limit theorem may not apply to power law-distributed variables. There can be no “regression to the mean” if the mean is ill-defined and the variance unbounded. Indeed, means and variances may then tell us rather little about the statistical future. As a window on the world, they are broken.”

This section of the paper probably does not introduce anything new to people who have spent any time looking at financial models. It does, however, beg some interesting questions. For example, to what extent are bank loan losses better described by a power law and, if so, what does this mean for the measures of expected loss that are employed in banking and prudential capital requirements; i.e. how should banks and regulators respond if “… the means and variances … tell us rather little about the statistical future”? This is particularly relevant as banks transition to Expected Loss accounting for loan losses.

We can of course estimate the mean loss during the benign part of the credit cycle, but it is much harder to estimate a “through the cycle” average (or “expected”) loss because the frequency, duration and severity of the cycle downturn are hard to pin down with any precision. We can use historical evidence to get a sense of the problem; we can, for example, talk about moderate downturns every 7-10 years, more severe recessions every 25-30 years and a 75 year cycle for financial crises. However, the data is sparse, so it does not allow the kind of precision that is part and parcel of normally distributed events.
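
To illustrate how much difference the distributional assumption makes, here is a small sketch of my own (not from the paper) comparing tail probabilities under a normal distribution and a fat-tailed Student-t alternative:

```python
# Probability of a move beyond k standard deviations: normal versus a
# Student-t with 3 degrees of freedom (scaled to unit variance) as a
# crude stand-in for a fat-tailed world. Illustration only.
import math
from scipy import stats

for k in (2, 3, 5, 7):
    p_normal = 2 * stats.norm.sf(k)                 # two-sided tail
    p_fat = 2 * stats.t.sf(k * math.sqrt(3), df=3)  # t(3) has variance 3
    print(f"{k} sigma: normal {p_normal:.1e} vs fat-tailed {p_fat:.1e}")
```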

Explaining Fat Tails

The paper identifies the following drivers behind non-normal outcomes:

  • Non-linear dynamics
  • Self-organised criticality
  • Preferential attachment
  • Highly optimised tolerance

The account of why systems do not conform to the normal distribution does not offer much new, but I found reading it useful for reflecting on the practical implications. One of the items they call out is competition, which is typically assumed by economists to be a wholly benign force. This is generally true, but Haldane and Nelson note the capacity for competition to contribute to self-organised criticality.

Competition in finance and banking can of course lead to beneficial innovation and efficiency gains, but it can also contribute to progressively increased risk taking (e.g. more lax lending standards, lower margins for tail risk), thereby setting the system up to be prone to a self-organised critical state. Risk based capital requirements can also contribute to self-organised criticality to the extent they facilitate increased leverage and create incentives to take on tail risk.

Where Next?

Haldane and Nelson add their voice to the idea that Knight’s distinction between risk and uncertainty is a good foundation for developing better ways of dealing with a world that does not conform to the normal distribution, and note the distinguished company of those who have also chosen to emphasise the importance of uncertainty and the limitations of risk.

“Many of the biggest intellectual figures in 20th century economics took this distinction seriously. Indeed, they placed uncertainty centre-stage in their policy prescriptions. Keynes in the 1930s, Hayek in the 1950s and Friedman in the 1960s all emphasised the role of uncertainty, as distinct from risk, when it came to understanding economic systems. Hayek criticised economics in general, and economic policymakers in particular, for labouring under a ‘pretence of knowledge’.”

Assuming the uncertainty paradigm was embraced, Haldane and Nelson consider what the practical implications would be. They have a number of proposals but I will focus on these:

  • agent based modelling
  • simple rather than complex
  • don’t aim to smooth out all volatility

Agent based modelling

Haldane and Nelson note that …

“In response to the crisis, there has been a groundswell of recent interest in modelling economic and financial systems as complex, adaptive networks. For many years, work on agent-based modelling and complex systems has been a niche part of the economics and finance profession. The crisis has given these models a new lease of life in helping explain the discontinuities evident over recent years (for example, Kirman (2011), Haldane and May (2011)).”
In these frameworks, many of the core features of existing models need to be abandoned:

  • The “representative agents” conforming to simple economic laws are replaced by more complex interactions among a larger range of agents
  • The single, stationary equilibrium gives way to Lorenz-like multiple, non-stationary equilibria
  • Linear deterministic models are usurped by non-linear tipping points and phase shifts

Haldane and Nelson note that these types of systems are already being employed by physicists, sociologists, ecologists and the like. Since the paper was written (2012) we have seen some evidence that economists are experimenting with agent based modelling. A paper by Richard Bookstaber offers a useful outline of his efforts to apply these models and he has also written a book (“The End of Theory”) promoting this path. There is also a Bank of England paper on ABM worth looking at.

I think there is a lot of value in agent based modelling but a few things impede its wider use. One is that these models don’t offer the kinds of precision that make the DSGE and VaR models so attractive. The other is that they require a large investment of time to build, and most practitioners are fully committed just keeping the existing models going. Finding the budget to pioneer an alternative path is not easy. These are not great arguments in defence of the status quo but they do reflect certain realities of the world in which people work. A toy example of the genre follows.
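
To make the genre concrete, here is a toy sketch of my own (not from any of the papers cited): a market with fundamentalists and trend-chasing chartists whose interaction alone is enough to generate fatter-than-normal tails.

```python
# Toy agent-based market: fundamentalists trade against deviations from
# fundamental value while chartists chase recent returns. Even this
# crude interaction produces clustered volatility and fat tails.
import numpy as np

rng = np.random.default_rng(0)
T, fundamental = 10_000, 0.0
price = np.zeros(T)
for t in range(2, T):
    trend = price[t - 1] - price[t - 2]          # chartist signal
    mispricing = fundamental - price[t - 1]      # fundamentalist signal
    n_chartists = rng.binomial(100, 0.5)         # random mix of 100 agents
    demand = n_chartists * trend + (100 - n_chartists) * 0.1 * mispricing
    price[t] = price[t - 1] + 0.01 * demand + rng.normal(0, 0.1)

returns = np.diff(price)
kurt = ((returns - returns.mean()) ** 4).mean() / returns.var() ** 2
print(f"Excess kurtosis: {kurt - 3:.2f} (0 for a normal distribution)")
```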

Simple can be more robust than complex

Haldane and Nelson also advocate simplicity, in lieu of complexity, as a general rule of thumb for dealing with an uncertain world:

“The reason less can be more is that complex rules are less robust to mistakes in specification. They are inherently fragile. Harry Markowitz’s mean-variance optimal portfolio model has informed millions of investment decisions over the past 50 years – but not, interestingly, his own. In retirement, Markowitz instead used a much simpler equally-weighted asset approach. This, Markowitz believed, was a more robust way of navigating the fat-tailed uncertainties of investment returns (Benartzi and Thaler (2001)).”

I am not a big fan of the Leverage Ratio, which they cite as one example of regulators beginning to adopt simpler approaches, but the broader principle that simple is more robust than complex does ring true.

“The mainstay of regulation for the past 30 years has been more complex estimates of banks’ capital ratios. These are prone to problems of highly-optimised tolerance. In part reflecting that, regulators will in future require banks to abide by a far simpler backstop measure of the leverage ratio. Like Markowitz’s retirement portfolio, this equally-weights the assets in a bank’s portfolio. Like that portfolio, it too will hopefully be more robust to fat-tailed uncertainties.”

Structural separation is another simple approach to the problem of making the system more resilient:

“A second type of simple, yet robust, regulatory rule is to impose structural safeguards on worst-case outcomes. Technically, this goes by the name of a “minimax” strategy (Hansen and Sargent (2011)). The firebreaks introduced into some physical systems can be thought to be playing just this role. They provide a fail-safe against the risk of critical states emerging in complex systems, either in a self-organised manner or because of man-made intervention. These firebreak-type approaches are beginning to find their way into the language and practice of regulation.”

And a reminder about the dangers of over-engineering:

“Finally, in an uncertain world, fine-tuned policy responses can sometimes come at a potentially considerable cost. Complex intervention rules may simply add to existing uncertainties in the system. This is in many ways an old Hayekian lesson about the pretence of knowledge, combined with an old Friedman lesson about the avoidance of policy harm. It has relevance to the (complex, fine-tuned) regulatory environment which has emerged over the past few years.”

While we can debate the precise way to achieve simplicity, the basic idea does in my view have a lot of potential to improve the management of risk in general and bank capital in particular. Complex intervention rules may simply add to existing uncertainties in the system, and the current formulation of how the Capital Conservation Ratio interacts with the Capital Conservation Buffer is a case in point. These two elements of the capital adequacy framework define what percentage of a bank’s earnings must be retained if the capital adequacy ratio is under stress.

In theory the calculation should be simple and intuitive, but anyone who has had to model how these rules work under a stress scenario will know how complex and unintuitive the calculation actually is. The reasons are probably too much detail for today, but I will try to pick this topic up in a future post. A sketch of the simple “in theory” version follows.
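
As context for that complexity, here is a minimal sketch assuming the basic Basel III quartile table only, ignoring the D-SIB surcharge and CCyB extensions of the buffer that complicate the real calculation:

```python
# Minimal sketch of the Basel III capital conservation rules: the share
# of earnings a bank must retain depends on which quartile of the
# Capital Conservation Buffer (CCB) its CET1 ratio falls in.
MIN_CET1 = 4.5                      # CET1 minimum (% of RWA)
CCB = 2.5                           # Capital Conservation Buffer (% of RWA)
RETENTION_BY_QUARTILE = [1.00, 0.80, 0.60, 0.40]  # bottom quartile first

def min_earnings_retention(cet1_ratio: float) -> float:
    """Minimum share of earnings that must be retained (not distributed)."""
    if cet1_ratio <= MIN_CET1:
        return 1.0                  # at or below the minimum: retain all
    buffer_used = cet1_ratio - MIN_CET1
    if buffer_used >= CCB:
        return 0.0                  # buffer intact: no constraint
    quartile = int(buffer_used / (CCB / 4))   # 0 = bottom quartile
    return RETENTION_BY_QUARTILE[quartile]

for ratio in (4.6, 5.4, 6.2, 6.9, 7.1):
    print(f"CET1 {ratio:.1f}%: retain >= {min_earnings_retention(ratio):.0%} of earnings")
```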

Don’t aim to eliminate volatility

Systems which are adapted to volatility will tend to be stronger than systems that are sheltered from it, or in the words of Haldane and Nelson …

“And the argument can be taken one step further. Attempts to fine-tune risk control may add to the probability of fat-tailed catastrophes. Constraining small bumps in the road may make a system, in particular a social system, more prone to systemic collapse. Why? Because if instead of being released in small bursts pressures are constrained and accumulate beneath the surface, they risk an eventual volcanic eruption.”

I am a big fan of this idea. Nassim Taleb makes a similar argument in his book “Antifragile” as does Greg Ip in “Foolproof”. It also reflects Nietzsche’s somewhat more poetic dictum that “that which does not kill us makes us stronger”.

In conclusion

If you have read this far then thank you. I hope you found it useful and interesting. If you want to delve deeper then you can find my more detailed summary and comments on the paper here. If you think I have any of the above wrong then please let me know.

Swiss money experiment

Last month I posted a review of Mervyn King’s book “The End of Alchemy”. One of the central ideas in King’s book was that all deposits should be backed 100% by liquid, safe assets. It appears that the Swiss are being asked to vote on a proposal labelled the “Sovereign Money Initiative” which may not be exactly the same as King’s idea but comes from the same school of money philosophy.

It is not clear that there is any popular support for the proposal but it would be a fascinating money experiment if it did get support. Thanks to Brian Reid for flagging this one to me.

Tony


Looking under the hood – The IRB formula

This post is irredeemably technical so stop here if that is not your interest. If you need to understand some of the mechanics of the formula used to calculate credit risk weighted assets under the advanced Internal Ratings Based (IRB) approach then the BCBS published a paper in 2005 which offers an explanation:

  • describing the economic foundations
  • as well as the underlying mathematical model and its input parameters.

While a lot has changed as a result of Basel III, the models underlying the calculation of Internal Ratings Based (IRB) capital requirements are still based on the core principles agreed under Basel II and explained in this BCBS paper.

The notes in the linked page below mostly summarise the July 2005 paper with some emphasis (bolded text) and comments (in italics) that I have added. The paper is a bit technical but worth reading if you want to understand the original thinking behind the Basel II risk weights for credit risk.

I initially found the paper useful for revisiting the foundation assumptions of the IRB framework as background to considering the regulatory treatment of Expected Loss as banks transition to IFRS 9. The background on how the risk weight was initially intended to cover both Expected and Unexpected Loss, but was revised so that capital was only required to cover Unexpected Loss, is especially useful when considering the interaction of loan loss provisioning with capital requirements.
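
For reference, here is a minimal sketch of the corporate IRB capital formula described in the paper, transcribed into Python by me (the BCBS paper remains the authoritative statement):

```python
# Sketch of the Basel II IRB capital requirement K for corporate
# exposures (my transcription of the BCBS explanatory note). K covers
# Unexpected Loss only: Expected Loss (PD * LGD) is deducted.
import math
from scipy.stats import norm

def irb_capital(pd_: float, lgd: float, m: float = 2.5) -> float:
    """Capital requirement K as a fraction of EAD."""
    # Supervisory asset correlation: declines from 0.24 to 0.12 as PD rises
    w = (1 - math.exp(-50 * pd_)) / (1 - math.exp(-50))
    r = 0.12 * w + 0.24 * (1 - w)
    # PD conditional on the 99.9th percentile of the systematic risk factor
    cond_pd = norm.cdf((norm.ppf(pd_) + math.sqrt(r) * norm.ppf(0.999))
                       / math.sqrt(1 - r))
    # Maturity adjustment
    b = (0.11852 - 0.05478 * math.log(pd_)) ** 2
    ma = (1 + (m - 2.5) * b) / (1 - 1.5 * b)
    return (cond_pd - pd_) * lgd * ma        # UL only: EL deducted

k = irb_capital(pd_=0.01, lgd=0.45)          # e.g. PD 1%, LGD 45%, M 2.5
print(f"K = {k:.4f} of EAD, risk weight = {12.5 * k:.1%}")  # RW = K * 12.5
```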

Reading the BCBS paper has also been useful for thinking through a range of related issues including:

  • The rationale for, and impact of, prudential conservatism in setting the risk parameters used in the IRB formula
  • The cyclicality of a risk sensitive capital requirement (and the potential for pro-cyclicality) and what might be done to mitigate the risk of pro-cyclical impacts on the economy

If you have read this far then my summary of the BCBS paper and my comments /observations can be found here (and thank you).

I am not a credit risk model expert, so the summary of the paper and my comments must be read with that in mind. I did this to help me think through some of the issues with bank capital adequacy. Hopefully others will find the notes useful. If you see something wrong or something you disagree with then let me know.

“The Great Divide” by Andrew Haldane

This speech by Andrew Haldane (Chief Economist at the Bank of England) was given in 2016 but is still worth reading for anyone interested in the question of what role banks play in society and why their reputation is not what it once was. Some of my long term correspondents will be familiar with the paper and may have seen an earlier draft of this post.

“The Great Divide” refers to a gap between how banks perceive themselves and how they are perceived by the community. Haldane references a survey the BoE conducted in which the most common word used by banks to describe themselves was “regulated”, while “corrupt” was the community’s choice, closely followed by “manipulated”, “self-serving”, “destructive” and “greedy”. There is an interesting “word cloud” chart in the paper representing this gap in perception.

While the focus is on banks, Haldane makes the point that the gap in perceptions reflects a broader tension between the “elites” and the common people. He does not make this explicit connection but it seemed to me that the “great divide” he was referencing could also be argued to be manifesting itself in the increasing support for populist political figures purporting to represent the interests of the common people against career politicians. This broader “great divide” idea seemed to me to offer a useful framework for thinking about the challenges the banking industry is facing in rebuilding trust.

Haldane uses this “great divide” as a reference for discussing

  • The crucial role finance plays in society
  • The progress made so far in restoring trust in finance
  • What more needs to be done

The crucial role finance plays in society

Haldane argues that closing the trust deficit between banks and society matters for two reasons

  • because a well functioning financial system is an essential foundation for a growing and well functioning economy – to quote Haldane “that is not an ideological assertion from the financial elite; it is an empirical fact”
  • but also because the downside of a poorly functioning financial system is so large

Haldane uses the GFC to illustrate the downside in terms of the destruction of the value of financial capital and physical capital but he introduces a third form of capital, “social capital” that he argues may matter every bit as much to the wealth and well being of society. He defines social capital as the “relationships, trust and co-operation forged between different groups of people over time. It is the sociological glue that binds diverse societies into a cohesive whole”. The concept of “trust” is at the heart of Haldane’s definition of social capital.

Haldane cites evidence that trust plays an important role at both the micro and macro level in value creation and growth and concludes that “… a lack of trust jeopardises one of finance’s key societal functions – higher growth”.

In discussing these trends, Haldane distinguishes “personalised trust” and “generalised trust”. The former refers to mutual co-operation built up through repeated personal interactions (Haldane cites examples like visits to the doctor or hairdresser) while the latter is attached to an identifiable but anonymous group (Haldane cites trust in the rule of law, or government, or Father Christmas).

He uses this distinction to explore why banks have lost the trust of the community.

He notes that banking was for most of its history a relationship based business. The business model was not perfect but it did deliver repeated interactions with customers that imbued banking with personalised trust. At the same time its “mystique” (Haldane’s term) meant that banking maintained a high degree of generalised trust as well.

He cites the reduction in local branches, a common strategy pre-GFC, as one of the changes that delivered lower costs but reduced personal connections, thereby eroding personalised trust. For a while, the banking system could reap the efficiency gains while still relying on generalised trust, but the GFC subsequently undermined that generalised trust. It has been further eroded by the continued run of banking scandals that convey the sense that banks do not care about their customers.

What can be done to restore trust in finance

He notes the role that higher capital and liquidity have played but that this is not enough in his view. He proposes three paths

  1. Enhanced public education
  2. Creating “Purpose” in banking
  3. Communicating “Purpose” in banking

Regarding public education, there is a telling personal anecdote he offers on his experience with pensions. He describes himself as “moderately financially literate” but follows with “Yet I confess to not being able to make the remotest sense of pensions. Conversations with countless experts and independent financial advisors have confirmed for me only one thing – that they have no clue either”. This may be dismissed as hyperbole but it does highlight that most people will be less financially literate than Haldane and are probably poorly equipped to deal with the financial choices they are required to make in modern society. I am not sure that education is the whole solution.

Regarding “purpose”, Haldane’s main point seems to be that there is too much emphasis on shareholder value maximisation and not enough balance. This issue is arguably amplified by the UK Companies Act, which requires directors to treat shareholder interests as their primary objective. To the best of my knowledge, Australian law does not have an equivalent explicit requirement to put shareholders first, but we do grapple with the same underlying problem. Two of my recent posts (“The World’s Dumbest Idea” and “The Moral Economy”) touch on this issue.

Regarding communicating purpose, Haldane cites some interesting evidence that the volume of information provided by companies is working at cross purposes with actual communication with stakeholders. Haldane does not make the explicit link but Pillar 3 clearly increases the volume of information provided by banks. The points raised by Haldane imply (to me at least) that Pillar 3 might actually be getting in the way of communicating clearly with stakeholders.

This is a longish post but I think there is quite a lot of useful content in the speech so I would recommend it.

Recently read – “The Moral Economy: Why Good Incentives Are No Substitute For Good Citizens” by Samuel Bowles

The potential for incentives to create bad behaviour has been much discussed in the wake of the GFC while the Financial Services Royal Commission in Australia has provided a fresh set of examples of bankers behaving badly. It is tempting of course to conclude that bankers are just morally corrupt but, for anyone who wants to dig deeper, this book offers an interesting perspective on the role of incentives in the economy.

What I found especially interesting is Bowles’ account of the history of how the idea that good institutions and a free market based economy could “harness self interest to the public good” has come to dominate so much of current economic and public policy. Building on this foundation, the book examines the ways in which incentives designed around the premise that people are solely motivated by self interest can often be counter-productive; either by crowding out desirable behaviour or by prompting people to behave in ways that are the direct opposite of what was intended.

Many parts of this story are familiar but it was interesting to see how Bowles charted the development of the idea across many centuries and individual contributors. People will no doubt be familiar with Adam Smith’s “Invisible Hand” but Bowles also introduces other thinkers who contributed to this conceptual framework, Machiavelli and David Hume in particular. The idea is neatly captured in this quote from Hume’s Essays: Moral, Political and Literary (1742), in which he recommended the following maxim:

“In contriving any system of government … every man ought to be supposed to be a knave and to have no other end … than private interest. By this interest we must govern him, and, by means of it, make him notwithstanding his insatiable avarice and ambition, cooperate to public good”.

Bowles makes clear that this did not mean that people are in fact solely motivated by self-interest (i.e. “knaves”), simply that civic virtue (i.e. cultivating good people) was not by itself a robust platform for achieving good outcomes. The pursuit of self-interest, in contrast, came to be seen as a benign activity that could be harnessed for a higher purpose.

The idea of embracing self-interest is of course anathema to many people but its intellectual appeal is, I think, obvious. Australian readers at this point might be reminded of Jack Lang’s maxim “In the race of life, always back self-interest; at least you know it’s trying”. Gordon Gekko’s embrace of the principle that “Greed is good” is the modern expression of this intellectual tradition.

Harnessing self-interest for the common good

Political philosophers had for centuries focused on the question of how to promote civic virtue, but attention turned to finding laws and other public policies that would allow people to pursue their personal objectives while also inducing them to take account of the effects of their actions on others. The conceptual foundations laid down by David Hume and Adam Smith were progressively built on, with competition and well-defined property rights coming to be seen as important parts of the solution.

“Good institutions displaced good citizens as the sine qua non of good government. In the economy, prices would do the work of morals”

“Markets thus achieved a kind of moral extraterritoriality … and so avarice, repackaged as self-interest, was tamed, transformed from a moral failing to just another kind of motive”

Prices determined in free markets were at the heart of the system that allowed the Invisible Hand to work its magic, but economists recognised that competition alone was not sufficient for market prices to capture everything that mattered. For the market to arrive at the right (or most complete) price, it was also necessary that economic interactions be governed by “complete contracts” (i.e. contracts that specify the rights and duties of the buyer and seller in all future states of the world).

This is obviously an unrealistic assumption. Apart from the difficulty of imagining all future states of the world, not everything of value can be priced. But all was not lost. Bowles introduces Alfred Marshall and Arthur Pigou, who identified, in principle, how a system of taxes and subsidies could be devised that compensated economic actors for benefits their actions conferred on others and made them liable for costs they imposed on others.
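
For readers who like to see the mechanics, the textbook version of Pigou’s idea can be sketched in a couple of lines (the notation is mine, not Bowles’). Suppose a producer faces a private marginal cost $PMC(q)$ and each unit of output imposes an external cost $MEC$ on others, so that the social marginal cost is

$$SMC(q) = PMC(q) + MEC$$

Left to its own devices, the producer expands output until price equals $PMC(q)$ and so overproduces relative to the social optimum. A tax set at $t = MEC$ means the producer now faces $PMC(q) + t = SMC(q)$, so the privately optimal output coincides with the socially optimal one; a subsidy plays the symmetric role where an activity confers benefits on others.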

These taxes and subsidies are of course not always successful and Bowles offers a taxonomy of reasons why this is so. Incentives can work but not, according to Bowles, if they simplistically assume that the target of the incentive cares only about his or her material gain. To be effective, incentives must account for the fact that people are much more complex, social and moral than is strictly rational from an economic perspective. Bowles devotes a lot of the book to the problems with incentives (both positive and negative, including taxes, fines, subsidies, bonuses etc.), which he categorises under three headings:

  1. “Bad News”; incentives send a signal, and the tendency is for people to read things into incentives that may not have been intended but prompt them to respond negatively (e.g. does this incentive signal that the other party believes I am untrustworthy or lazy?)
  2. “Moral Disengagement”; the incentive may create a context in which the subject can distance themselves from the moral consequences of how they respond
  3. “Control Aversion”; an incentive that compromises a subject’s sense of autonomy or pride in the task may reduce their intrinsic motivation to perform the task well

Having catalogued the ways that incentives can have adverse impacts on behaviour, Bowles observes that civic-minded values continue to be an important feature of market-based economies and examines why this might be.

“If incentives sometimes crowd out ethical reasoning, the desire to help others, and intrinsic motivations, and if leading thinkers celebrate markets as a morality-free zone, it seems just a short step to Karl Marx’s broadside condemnation of capitalist culture”

One answer is that trading in markets encourages people to trust strangers and that the benefits of trading over time teach people that trust is a valuable commodity (the so-called “doux commerce” theory).

While admitting his answer is speculative, Bowles rejects “doux commerce” as the whole answer. He argues that the institutions (property rights, the rule of law, etc.) developed by liberal societies to protect citizens from worst-case outcomes such as personal injury, loss of property and other calamities make the consequences of mistakenly trusting a defector much less dire. As a result, the rule of law lowers the bar for how much you need to know about your partner before trusting him or her, thereby promoting the spread of trusting expectations and hence of trusting behaviour in a population.

The “institutional structure” theory is interesting but there is still much in the book worth considering even if you don’t buy his explanation. I have some more detailed notes on the book here.

Lessons for banking in Pixar’s approach to dealing with uncertainty and the risk of failure

The report on the Prudential Inquiry into the CBA (“CBA Report”) is obviously required reading in banking circles this week. Plenty has been written on the topic already so I will try to restrain myself unless I can find something new to add to the commentary. However, while reading the report, I found myself drawing links to books that I think bankers would find well worth reading. These include “Foolproof” (by Greg Ip) and “The Success Equation: Untangling Skill and Luck in Business, Sports and Investing” (by Michael Mauboussin).

I have put up some notes on Foolproof here and intend to do the same for The Success Equation sometime soon. The focus of today’s post, however, is a book titled “Creativity, Inc” by Ed Catmull, who co-founded and led Pixar. The overall theme of the book is developing and sustaining a creative culture, but dealing with risk and uncertainty emerges as a big part of this.

What does making movies have to do with banking?

One of the lessons Catmull emphasised was that, notwithstanding Pixar’s success, it was important not to lose sight of the role that random factors play in both success and failure. A quote from Ch 8 illustrates this point:

“… a lot of our success came because we had pure intentions and great talent, and we did a lot of things right, but I also believe that attributing our success solely to our own intelligence without acknowledging the role of accidental events, diminishes us.”

He goes on to describe how success can be a trap for the following reasons:

  • it creates the impression that what you are doing must be right,
  • it tempts you to overlook hidden problems, and
  • you may be confusing luck with skill.

There is a discussion in Ch 9 of the kinds of things that can lead you to misunderstand the real nature of both your success and your failure. These include various cognitive biases (such as confirmation bias, where you give more weight to information that supports what you believe than to the counter-evidence) and the mental models we use to simplify the world in which we operate. These are hard-wired into us, so the best we can do is be aware of how they can take us off track; that at least puts us ahead of those who blindly follow their mental models and biases.

His answer to building the capacity to adapt to change and respond to setbacks is to trust in people, but trust does not mean assuming people won’t make mistakes. Catmull accepts setbacks and screw-ups as an inevitable part of being creative and innovative; trust is demonstrated when you support your people when they do screw up and trust them to find the solution.

This is interesting because the CBA Report indicates that CBA did in fact place a great deal of trust in its executive team and senior leaders, which implies trust alone is not enough. The missing ingredients in CBA’s case were accountability and consequence when the team failed to identify, escalate and resolve problems.

The other interesting line of speculation is whether CBA’s risk culture might have benefited from a deeper reflection on the difference between skill and luck. Mauboussin’s book (The Success Equation) is particularly good in the way it lays out a framework for making this distinction.
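
To give a flavour of how that distinction can be made operational, one diagnostic of the kind Mauboussin works with is persistence: in skill-dominated activities, performance in one period predicts performance in the next, whereas in luck-dominated activities it does not. The short Python sketch below (my own illustration, not code from the book; the `skill_weight` parameter is a made-up name) simulates outcomes as a mix of fixed ability and fresh noise, then measures how strongly results persist from one period to the next.

```python
import numpy as np

rng = np.random.default_rng(42)

def period_to_period_correlation(skill_weight, n_players=100_000):
    """Simulate outcome = skill_weight * skill + (1 - skill_weight) * luck.

    Skill is fixed per player; luck is redrawn each period. The
    correlation of outcomes across two periods indicates how much
    skill (as opposed to luck) drives results.
    """
    skill = rng.normal(size=n_players)   # persistent ability
    luck1 = rng.normal(size=n_players)   # period 1 noise
    luck2 = rng.normal(size=n_players)   # period 2 noise
    out1 = skill_weight * skill + (1 - skill_weight) * luck1
    out2 = skill_weight * skill + (1 - skill_weight) * luck2
    return np.corrcoef(out1, out2)[0, 1]

for w in (0.9, 0.5, 0.1):
    print(f"skill weight {w}: correlation {period_to_period_correlation(w):.2f}")
```

When skill dominates, the correlation is close to one; when luck dominates, it is close to zero and a strong run of results tells you very little about the next period. That, in crude form, is the kind of test worth applying before attributing a track record, in banking or movie-making, to skill.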

I plan to come back to this topic once I have completed a review of Mauboussin’s book but, in the interim, I can recommend all of the books mentioned in this post.