Canada innovates in the capital buffer space

The Canadian prudential regulator, the Office of the Superintendent of Financial Institutions (OSFI), has made an interesting contribution to the capital buffer space via its introduction of a Domestic Stability Buffer (DSB).

Key features of the Domestic Stability Buffer:

  • Applies only to Domestic Systemically Important Banks (D-SIBs) and is intended to cover a range of systemic vulnerabilities not captured by the Pillar 1 requirement
  • The vulnerabilities currently covered are (i) Canadian consumer indebtedness; (ii) asset imbalances in the Canadian market; and (iii) Canadian institutional indebtedness
  • Replaces a previously undisclosed Pillar 2 loading associated with this class of risks (individual banks may still be required to hold a Pillar 2 buffer for idiosyncratic risks)
  • Initially set at 1.5% of Total RWA, with a permitted range of 0 to 2.5%
  • Reviewed semi-annually (June and December), with the option to change it more frequently in exceptional circumstances
  • Increases phased in while decreases take effect immediately

Implications for capital planning:

  • DSB supplements the Pillar 1 buffers (Capital Conservation Buffer, D-SIB surcharge and the Countercyclical Buffer)
  • Consequently, a breach of the DSB will not subject banks to the automatic constraints on capital distributions that apply when the Pillar 1 buffers are breached
  • Banks will instead be required to disclose that the buffer has been breached and OSFI will require a remediation plan to restore it (the sketch following this list illustrates how the DSB stacks on top of the Pillar 1 requirements)
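To make the stacking concrete, here is a minimal sketch (in Python) of how the DSB sits on top of the Pillar 1 requirements for a Canadian D-SIB. The 4.5% minimum and 2.5% Capital Conservation Buffer are the standard Basel numbers, the 1% D-SIB surcharge is illustrative, and the function name and structure are simply mine.

```python
def cet1_requirements(dsb=0.015, ccb=0.025, dsib=0.01, ccyb=0.0, minimum=0.045):
    """Illustrative CET1 stack for a Canadian D-SIB.

    The Pillar 1 buffers (CCB, D-SIB surcharge, CCyB) sit on top of the 4.5%
    minimum and breaching them triggers automatic constraints on distributions.
    The DSB sits on top again, but a breach triggers disclosure and a
    remediation plan rather than automatic distribution constraints."""
    pillar1 = minimum + ccb + dsib + ccyb
    return {
        "CET1 minimum": minimum,
        "distribution constraints apply below": pillar1,
        "supervisory expectation including DSB": pillar1 + dsb,
    }

for label, level in cet1_requirements().items():
    print(f"{label}: {level:.2%}")
# CET1 minimum: 4.50%
# distribution constraints apply below: 8.00%
# supervisory expectation including DSB: 9.50%
```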

What is interesting:

  • OSFI argues that translating the existing Pillar 2 requirement into an explicit buffer offers greater transparency which in turn “… will support banks’ ability to use this capital buffer in times of stress by increasing the market’s understanding of the purpose of the buffer and how it should be used”
  • I buy OSFI’s rationale for why an explicit buffer with a clear narrative is a more usable capital tool than an undisclosed Pillar 2 requirement with the same underlying rationale
  • OSFI retains a separate Countercyclical Buffer, but the Domestic Stability Buffer seems similar (to me at least), though not identical, in its overriding purpose to the approach that the Bank of England (BoE) has adopted for managing its Countercyclical Buffer.
  • A distinguishing feature of both the BoE and OSFI approaches is linking the buffer to a simple, coherent narrative that makes the buffer more usable by virtue of creating clear expectations of the conditions under which it can be used.

Bottom line is that I see useful features in both the BoE and OSFI approaches to dealing with the inherent cyclicality of banking. I don’t see either of the proposals doing much to mitigate the cyclicality of banking itself, but I do see them offering more potential for managing the consequences of that cyclicality. Both approaches seem to me to offer material improvements over the Countercyclical Buffer as originally conceived by the BCBS.

It will be interesting to see if APRA chooses to adapt elements of this countercyclical approach to bank capital requirements.

If I am missing something, please let me know …

From the Outside

Worth Reading “The Money Formula” by Paul Wilmott and David Orrell.

The full title of this book, co-written by Paul Wilmott and David Orrell, is “The Money Formula: Dodgy Finance, Pseudo Science, and How Mathematicians Took Over the Markets”. There are plenty of critiques of modelling and quantitative finance by outsiders throwing rocks, but Wilmott is a quant and brings an insider’s technical knowledge to the question of what these tools can do, can’t do and, perhaps most importantly, should not be used to do. Consequently, the book offers a more nuanced perspective on the strengths and limitations of quantitative finance, as opposed to the “let’s scrap the whole thing” school of thought. I have made some more detailed notes which follow the structure of the book but this post focuses on a couple of ideas I found especially interesting or useful.

I am not a quant so my comments should be read with that in mind, but the core idea I took away is that, much as quants would want it otherwise, markets are not determined by fundamental laws, deterministic or probabilistic, that allow risk to be measured with precision. These ideas work reasonably well within their “zone of validity” but a more complete answer (or model) has to recognise where the zones stop and uncertainty rules. Wilmott and Orrell argue that market outcomes are better thought of as the “emergent result of complex transactions”. The role of money in these emergent results is especially important, as is the capacity of models themselves to materially reshape the risk of the markets they are attempting to measure.

The Role of Money

Some quotes drawn from Chapter 8 let the authors speak for themselves on the role of money …

“Consider … the nature of money. Standard economic definitions of money concentrate on its roles as a “medium of exchange,” a “store of value,” and a “unit of account.” Economists such as Paul Samuelson have focused in particular on the first, defining money as “anything that serves as a commonly accepted medium of exchange.” … Money is therefore not something important in itself; it is only a kind of token. The overall picture is of the economy as a giant barter system, with money acting as an inert facilitator.” (emphasis added)

“However … money is far more interesting than that, and actually harbors its own kind of lively, dualistic properties. In particular, it merges two things, number and value, which have very different properties: number lives in the abstract, virtual world of mathematics, while valued objects live in the real world. But money seems to be an active part of the system. So ignoring it misses important relationships. The tension between these contradictory aspects is what gives money its powerful and paradoxical qualities.” (Emphasis added)

“The real and the virtual become blurred, in physics or in finance. And just as Newtonian theories break down in physics, so our Newtonian approach to money breaks down in economics. In particular, one consequence is that we have tended to take debt less seriously than we should.” (emphasis added)

“Instead of facing up to the intrinsically uncertain nature of money and the economy, relaxing some of those tidy assumptions, accepting that markets have emergent properties that resist reduction to simple laws, and building a new and more realistic theory of economics, quants instead glommed on to the idea that, when a system is unpredictable, you can just switch to making probabilistic predictions.” (emphasis added)

“The efficient market hypothesis, for example, was based on the mechanical analogy that markets are stable and perturbed randomly by the actions of atomistic individuals. This led to probabilistic risk-analysis tools such as VaR. However, in reality, the “atoms” are not independent, but are closely linked … The result is the non-equilibrium behaviour … observed in real markets. Markets are unpredictable not because they are efficient, but because of a financial version of the uncertainty principle.” (emphasis added)

 The Role of Models

Wilmott & Orrell devote a lot of attention to the ways in which models no longer just describe, but start to influence, the markets being modelled mostly by encouraging people to take on more risk based in part on a false sense of security …

“Because of the bankers’ insistence on treating complex finance as a university end-of-term exam in probability theory, many of the risks in the system are hidden. And when risks are hidden, one is led into a false sense of security. More risk is taken so that when the inevitable happens, it is worse than it could have been. Eventually the probabilities break down, disastrous events become correlated, the cascade of dominoes is triggered, and we have systemic risk …. None of this would matter if the numbers were small … but the numbers are huge” (Chapter 10 – emphasis added)

They see High Frequency Trading as the area likely to give rise to a future systemic crisis but also make a broader point about the tension between efficiency and resilience …

“With complex systems, there is usually a trade-off between efficiency and robustness …. Introducing friction into the system – for example by putting regulatory brakes on HFT – will slow the markets, but also make them more transparent and reliable. If we want a more robust and resilient system then we probably need to agree to forego some efficiency” (Chapter 10 – emphasis added)

The Laws of Finance

Wilmott and Orrell note the extent to which finance has attempted to identify laws which are analogous to the laws of physics and the ways in which these “laws” have proved to be more of a rough guide.

 “… the “law of supply and demand” … states that the market for a particular product has a certain supply, which tends to increase as the price goes up (more suppliers enter the market). There is also a certain demand for the product, which increases as the price goes down.”

“… while the supply and demand picture might capture a general fuzzy principle, it is far from being a law. For one thing, there is no such thing as a stable “demand” that we can measure independently – there are only transactions.”

“Also, the desire for a product is not independent of supply, or other factors, so it isn’t possible to think of supply and demand as two separate lines. Part of the attraction of luxury goods – or for that matter more basic things, such as housing – is exactly that their supply is limited. And when their price goes up, they are often perceived as more desirable, not less.” (emphasis added)

This example is relevant for banking systems (such as Australia) where residential mortgage lending dominates the balance sheets of the banks. Even more so given that public debate of the risk associated with housing seems often to be predicated on the economics 101 version of the laws of supply and demand.

The Power (and Danger) of Ideas

A recurring theme throughout the book is the ways in which economists and quants have borrowed ideas from physics without recognising the limitations of the analogies and assumptions they have relied on to do so. Wilmott and Orrell credit Sir Isaac Newton as one of the inspirations behind Adam Smith’s idea of the “Invisible Hand” co-ordinating the self-interested actions of individuals for the good of society. When the quantum revolution saw physics embrace a probabilistic approach, economists followed.

I don’t think Wilmott and Orrell make this point directly but a recurring thought while reading the book was the power of ideas not just to interpret the underlying reality but also to shape the way the economy and society develop, not always for the better.

  • Economic laws that drive markets towards equilibrium as their natural state
  • The “invisible hand” operating in markets to reconcile individual self interest with optimal outcomes for society as a whole
  • The Efficient Market Hypothesis as an explanation for why markets are unpredictable

These ideas have widely influenced quantitative finance in a variety of domains and they all contribute useful insights; the key is to not lose sight of their zone of validity.

“… Finance … took exactly the wrong lesson from the quantum revolution. It held on to its Newtonian, mechanistic, symmetric picture of an intrinsically stable economy guided to equilibrium by Adam Smith’s invisible hand. But it adopted the probabilistic mathematics of stochastic calculus.” (Chapter 8 – emphasis added)

Where to from here?

It should be obvious by now that the authors are arguing that risk and reward in finance cannot be reduced to hard numbers in the way that physics has used similar principles and tools to generate practical insights into how the world works. Applying a bit of simple math in finance seems to open the door to getting some control over an unpredictable world and, even better, to pursuing optimisation strategies that allow the cognoscenti to fine-tune the balance between risk and reward. There is room for more complex math as well for those so inclined, but the book sides with the increasingly widely held view that simple math is enough to get you into trouble and further complexity is best avoided if possible.

Wilmott and Orrell highlight mathematical biology in general, and a book by Jim Murray on the topic, as a source of better ways to approach many of the more difficult modelling challenges in finance and economics. They start by listing a series of phenomena in biological models that seem to be useful analogues for what happens in financial markets. They concede that the models used in mathematical biology are almost all “toy” models. None of these models offer precise or deterministic outcomes but all can be used to explain what is happening in nature and offer insights into solutions for problems like disease control, epidemics and conservation.

The approach they advocate seems to have a lot in common with the Agent Based Modelling approach that Andrew Haldane references (see his paper on “Tails of the Unexpected”) and that is the focus of Bookstaber’s book (“The End of Theory”).

In their words …

“Embrace the fact that the models are toy, and learn to work within any limitations.”

“Focus more attention on measuring and managing resulting model risk, and less time on complicated new products.”

“… only by remaining both skeptical and agile can we learn. Keep your models simple, but remember they are just things you made up, and be ready to update them as new information comes in.”

I fear I have not done the book justice but I got a lot out of it and can recommend it highly.

The financial cycle and macroeconomics: What have we learnt? BIS Working Paper

Claudio Borio at the BIS wrote an interesting paper exploring the “financial cycle”. This post seeks to summarise the key points of the paper and draw out some implications for bank stress testing (the original paper can be found here).  The paper was published in December 2012, so its discussion of the implications for macroeconomic modelling may be dated but I believe it continues to have some useful insights for the challenges banks face in dealing with adverse economic conditions and the boundary between risk and uncertainty.

Key observations Borio makes regarding the Financial Cycle

The concept of a “business cycle”, in the sense of there being a regular occurrence of peaks and troughs in business activity, is widely known but the concept of a “financial cycle” is a distinct variation on this theme that is possibly less well understood. Borio states that there is no consensus definition but he uses the term to

“denote self-reinforcing interactions between perceptions of value and risk, attitudes towards risk and financing constraints, which translate into booms followed by busts. These interactions can amplify economic fluctuations and possibly lead to serious financial distress and economic disruption”.

This definition is closely related to the concept of “procyclicality” in the financial system and should not be confused with a generic description of cycles in economic activity and asset prices. Borio does not use these words but I have seen the term “balance sheet recession” employed to describe much the same phenomenon as Borio’s financial cycle.

Borio identifies five features that describe the Financial Cycle

  1. It is best captured by the joint behaviour of credit and property prices – these variables tend to closely co-vary, especially at low frequencies, reflecting the importance of credit in the financing of construction and the purchase of property.
  2. It is much longer, and has a much larger amplitude, than the traditional business cycle – the business cycle involves frequencies from 1 to 8 years whereas the average length of the financial cycle is longer; Borio cites a cycle length of 16 years in a study of seven industrialised economies and I have seen other studies indicating a longer cycle (with more severe impacts).
  3. It is closely associated with systemic banking crises which tend to occur close to its peak.
  4. It permits the identification of the risks of future financial crises in real time and with a good lead – Borio states that the most promising leading indicators of financial crises are based on simultaneous positive deviations of the ratio of private sector credit-to-GDP and asset prices, especially property prices, from historical norms (a sketch of how this credit-to-GDP gap is typically computed follows this list).
  5. And it is highly dependent on the financial, monetary and real-economy policy regimes in place (e.g. financial liberalisation under Basel II, monetary policy focussed primarily on inflation targeting and globalisation in the real economy).
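Point 4 is essentially the credit-to-GDP gap that also underpins the calibration of the Countercyclical Capital Buffer. As a rough illustration of how such a gap is typically computed, here is a minimal sketch in Python; it assumes the statsmodels library, the smoothing parameter of 400,000 follows the BCBS guidance for quarterly data, and the function name and the 2 percentage point reference are simply illustrative.

```python
import numpy as np
from statsmodels.tsa.filters.hp_filter import hpfilter

def credit_to_gdp_gap(ratio, lamb=400_000, min_obs=20):
    """One-sided (recursive) HP-filter gap: the deviation of the credit-to-GDP
    ratio (expressed in per cent of GDP) from a slow-moving trend estimated
    using only the data available at each point in time (no peeking ahead)."""
    ratio = np.asarray(ratio, dtype=float)
    gap = np.full(ratio.shape, np.nan)
    for t in range(min_obs, len(ratio) + 1):
        _, trend = hpfilter(ratio[:t], lamb=lamb)   # expanding window
        gap[t - 1] = ratio[t - 1] - trend[-1]       # gap in percentage points
    return gap

# A large, sustained positive gap (the BCBS uses roughly 2 percentage points as
# the point at which the CCyB starts to build) is the kind of "real time, good
# lead" warning signal Borio describes.
```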

Macro economic modelling

Borio also argues that the conventional models used to analyse the economy are deficient because they do not capture the dynamics of the financial cycle. These extracts capture the main points of his critique:

“The notion… of financial booms followed by busts, actually predates the much more common and influential one of the business cycle …. But for most of the postwar period it fell out of favour. It featured, more or less prominently, only in the accounts of economists outside the mainstream (eg, Minsky (1982) and Kindleberger (2000)). Indeed, financial factors in general progressively disappeared from macroeconomists’ radar screen. Finance came to be seen effectively as a veil – a factor that, as a first approximation, could be ignored when seeking to understand business fluctuations … And when included at all, it would at most enhance the persistence of the impact of economic shocks that buffet the economy, delaying slightly its natural return to the steady state …”

“Economists are now trying hard to incorporate financial factors into standard macroeconomic models. However, the prevailing, in fact almost exclusive, strategy is a conservative one. It is to graft additional so-called financial “frictions” on otherwise fully well behaved equilibrium macroeconomic models, built on real-business-cycle foundations and augmented with nominal rigidities. The approach is firmly anchored in the New Keynesian Dynamic Stochastic General Equilibrium (DSGE) paradigm.”

“The purpose of this essay is to summarise what we think we have learnt about the financial cycle over the last ten years or so in order to identify the most promising way forward…. The main thesis is that …it is simply not possible to understand business fluctuations and their policy challenges without understanding the financial cycle”

There is an interesting discussion of the public policy (i.e. prudential, fiscal, monetary) associated with recognising the role of the financial cycle but I will focus on what implications this may have for bank management in general and stress testing in particular.

Insights and questions we can derive from the paper

The observation that financial crises are based on simultaneous positive deviations of the ratio of private sector credit-to-GDP and asset prices, especially property prices, from historical norms covers much the same ground as the Basel Committee’s Countercyclical Capital Buffer (CCyB) and is something banks would already monitor as part of the ICAAP. The interesting question the paper poses for me is the extent to which stress testing (and ICAAP) should focus on a “financial cycle” style disruption as opposed to a business cycle event. Even more interesting is the question of whether the higher severity of the financial cycle is simply an exogenous random variable or an endogenous factor that can be attributed to excessive credit growth. 

I think this matters because it has implications for how banks calibrate their overall risk appetite. The severity of the downturns employed in stress testing has in my experience gradually increased over successive iterations. My recollection is that this has partly been a response to prudential stress tests which were more severe in some respects than might have been determined internally. In the absence of any objective absolute measure of what was severe, it probably made sense to turn up the dial on severity in places to align as far as possible the internal benchmark scenarios with prudential benchmarks such as the “Common Scenario” APRA employs.

At the risk of a gross oversimplification, I think that banks started the stress testing process looking at both moderate downturns (e.g. 7-10 year frequency and relatively short duration) and severe recessions (say a 25 year cycle though still relatively short duration downturn). Bank supervisors, in contrast, have tended to focus more on severe recession and financial cycle style severity scenarios with more extended durations. Banks have progressively shifted their attention to scenarios that are more closely aligned to the severe recession assumed by supervisors, in part because moderate recessions tend to be fairly manageable from a capital management perspective.

Why does the distinction between the business cycle and the financial cycle matter?

Business cycle fluctuations (in stress testing terms a “moderate recession”) are arguably an inherent feature of the economy that occur largely independently of the business strategy and risk appetite choices that banks make. However, Borio’s analysis suggests that the decisions that banks make (in particular the rate of growth in credit relative to growth in GDP and the extent to which the extension of bank credit contributes to inflated asset values) do contribute to the risk (i.e. probability, severity and duration) of a severe financial cycle style recession. 

Borio’s analysis also offers a way of thinking about the nature of the recovery from a recession. A moderate business cycle style recession is typically assumed to be short with a relatively quick recovery whereas financial cycle style recessions typically persist for some time. The more drawn out recovery from a financial cycle style recession can be explained by the need for borrowers to deleverage and repair their balance sheets as part of the process of addressing the structural imbalances that caused the downturn.

If the observations above are true, then they suggest a few things to consider:

  • should banks explore a more dynamic approach to risk appetite limits that incorporates the metrics identified by Borio (and also used in the calibration of the CCyB), so that the level of risk they are willing to take adjusts for where they believe they are in the cycle (and which kind of cycle they are in)?
  • how should banks think about these more severe financial cycle losses? Their measure of Expected Loss should clearly incorporate the losses expected from business cycle style moderate recessions occurring once every 7-10 years but it is less clear that the kinds of more severe and drawn out losses expected under a Severe Recession or Financial Cycle downturn should be part of Expected Loss.

A more dynamic approach to risk appetite gets us into some interesting game theory puzzles because a decision by one bank to pull back on risk appetite potentially allows competitors to benefit by writing more business, and potentially to benefit doubly to the extent that the decision to pull back makes it safer for competitors to write the business without fear of a severe recession (in technical economist speak we have a “collective action” problem). This was similar to the problem APRA faced when it decided to impose “speed limits” on certain types of lending in 2017. The Royal Commission was not especially sympathetic to the strategic bind banks face but I suspect that APRA understands the problem.

How do shareholders think about these business and financial cycle losses? Some investors will adopt a “risk on-risk off” approach in which they attempt to predict the downturn and trade in and out based on that view, while other “buy and hold” investors (especially retail) may be unable or unwilling to adopt a trading approach.

The dependence of the financial cycle on the fiscal and monetary policy regimes in place and changes in the real-economy also has potential implications for how banks think about the risk of adverse scenarios playing out. Many of the factors that Borio argues have contributed to the financial cycle (i.e. financial liberalisation under Basel II, monetary policy focussed primarily on inflation targeting and globalisation in the real economy) are reversing (regulation of banks is much more restrictive, monetary policy appears to have recognised the limitations of a narrow inflation target focus and the pace of globalisation appears to be slowing in response to a growing concern that its benefits are not shared equitably). I am not sure exactly what these changes mean other than to recognise that they should in principle have some impact. At a minimum it seems that the pace of credit expansion might be slower in the coming decades than it has in the past 30 years.

All in all, I find myself regularly revisiting this paper, referring to it or employing the distinction between the business and financial cycle. I would recommend it to anyone interested in bank capital management. 

The rise of the normal distribution

“We were all Gaussians now”

This post focuses on a joint paper written in 2012 by Andrew Haldane and Benjamin Nelson titled “Tails of the unexpected”. The topic is the normal distribution which is obviously a bit technical but the paper is still readable even if you are not deeply versed in statistics and financial modelling. The condensed quote below captures the central idea I took away from the paper.

“For almost a century, the world of economics and finance has been dominated by randomness … But as Nassim Taleb reminded us, it is possible to be Fooled by Randomness (Taleb (2001)). For Taleb, the origin of this mistake was the ubiquity in economics and finance of a particular way of describing the distribution of possible real world outcomes. For non-nerds, this distribution is often called the bell-curve. For nerds, it is the normal distribution. For nerds who like to show-off, the distribution is Gaussian.”

The idea that the normal distribution should be used with care, and sometimes not at all, when seeking to analyse economic and financial systems is not news. The paper’s discussion of why this is so is useful if you have not considered the issues before but probably does not offer much new insight if you have.

What I found most interesting was the back story behind the development of the normal distribution. In particular, the factors that Haldane and Nelson believe help explain why it came to be so widely used and misused. Reading the history reminds us of what a cool idea it must have been when it was first discovered and developed.

“By simply taking repeat samplings, the workings of an uncertain and mysterious world could seemingly be uncovered.”

“To scientists seeking to explain the world, the attraction of the normal curve was obvious. It provided a statistical map of a physical world which otherwise appeared un-navigable. It suggested regularities in random real-world data. Moreover, these patterns could be fully described by two simple metrics – mean and variance. A statistical window on the world had been opened.”

Haldane and Nelson highlight a semantic shift in the 1870s when the term “normal” began to be independently applied to this statistical distribution. They argue that adopting this label helped embed the idea that the “normal distribution” was the “usual” outcome that one should expect to observe.

“In the 18th century, normality had been formalised. In the 19th century, it was socialised.”

“Up until the late 19th century, no statistical tests of normality had been developed. Having become an article of faith, it was deemed inappropriate to question the faith. As Hacking put it, “thanks to superstition, laziness, equivocation, befuddlement with tables of numbers, dreams of social control, and propaganda from utilitarians, the law of large numbers became a synthetic a priori truth. We were all Gaussians now.””

Notwithstanding its widespread use today, in Haldane and Nelson’s account, economics and finance were not early adopters of the statistical approach to analysis but eventually became enthusiastic converts. The influence of physics on the analytical approaches employed in economics is widely recognised and Haldane cites the rise of probability based quantum physics over old school deterministic Newtonian physics as one of the factors that prompted economists to embrace probability and the normal distribution as a key tool.

“… in the early part of the 20th century, physics was in the throes of its own intellectual revolution. The emergence of quantum physics suggested that even simple systems had an irreducible random element. In physical systems, Classical determinism was steadily replaced by statistical laws. The natural world was suddenly ruled by randomness.”

“Economics followed in these footsteps, shifting from models of Classical determinism to statistical laws.”

“Whether by accident or design, finance theorists and practitioners had by the end of the 20th century evolved into fully paid-up members of the Gaussian sect.”

Assessing the Evidence

Having outlined the story behind its development and increasingly widespread use, Haldane and Nelson then turn to the weight of evidence suggesting that normality is not a good statistical description of real-world behaviour. In its place, natural and social scientists have often unearthed behaviour consistent with an alternative distribution, the so-called power law distribution.

“In consequence, Laplace’s central limit theorem may not apply to power law-distributed variables. There can be no “regression to the mean” if the mean is ill-defined and the variance unbounded. Indeed, means and variances may then tell us rather little about the statistical future. As a window on the world, they are broken.”

This section of the paper probably does not introduce anything new to people who have spent any time looking at financial models. It does however raise some interesting questions. For example, to what extent are bank loan losses better described by a power law and, if so, what does this mean for the measures of expected loss that are employed in banking and prudential capital requirements; i.e. how should banks and regulators respond if “…the means and variances … tell us rather little about the statistical future”? This is particularly relevant as banks transition to Expected Loss accounting for loan losses.

We can of course estimate the mean loss under the benign part of the credit cycle but it is much harder to estimate a “through the cycle” average (or “expected” loss) because the frequency, duration and severity of the cycle downturn is hard to pin down with any precision. We can use historical evidence to get a sense of the problem; we can, for example, talk about moderate downturns every 7-10 years, more severe recessions every 25-30 years and a 75 year cycle for financial crises. However, the data is obviously sparse so it does not allow the kind of precision that is part and parcel of normally distributed events.
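To make the contrast concrete, the short sketch below (my own illustration, not taken from the paper) compares the probability of a “six standard deviation” outcome under a normal distribution with the equivalent probability under a Pareto (power law) distribution scaled to a comparable mean and standard deviation; scipy is assumed and the tail exponent of 3 is arbitrary.

```python
from scipy import stats

# Probability of an outcome more than 6 standard deviations above the mean
p_normal = stats.norm.sf(6)                # ~1e-9: "never" in practice

# A Pareto (power law) variable with tail exponent alpha = 3 still has a finite
# mean and variance, yet its "6 sigma" tail probability is millions of times
# larger. For alpha <= 2 the variance is unbounded and, as the quote above puts
# it, mean and variance "tell us rather little about the statistical future".
alpha = 3.0
pareto = stats.pareto(b=alpha, scale=1.0)
threshold = pareto.mean() + 6 * pareto.std()
p_power_law = pareto.sf(threshold)

print(f"normal: {p_normal:.1e}, power law: {p_power_law:.1e}")
```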

Explaining Fat Tails

The paper identifies the following drivers behind non-normal outcomes:

  • Non-linear dynamics
  • Self-organised criticality
  • Preferential attachment
  • Highly optimised tolerance

The account of why systems do not conform to the normal distribution does not offer much new but I found reading it useful for reflecting on the practical implications. One of the items they called out is competition, which is typically assumed by economists to be a wholly benign force. This is generally true but Haldane and Nelson note the capacity for competition to contribute to self-organised criticality.

Competition in finance and banking can of course lead to beneficial innovation and efficiency gains but it can also contribute to progressively increased risk taking (e.g. more lax lending standards, lower margins for tail risk), thereby setting the system up to be prone to a self-organised critical state. Risk based capital requirements can also contribute to self-organised criticality to the extent they facilitate increased leverage and create incentives to take on tail risk.

Where Next?

Haldane and Nelson add their voice to the idea that Knight’s distinction between risk and uncertainty is a good foundation for developing better ways of dealing with a world that does not conform to the normal distribution, and note the distinguished company that has also chosen to emphasise the importance of uncertainty and the limitations of risk.
“Many of the biggest intellectual figures in 20th century economics took this distinction seriously. Indeed, they placed uncertainty centre-stage in their policy prescriptions. Keynes in the 1930s, Hayek in the 1950s and Friedman in the 1960s all emphasised the role of uncertainty, as distinct from risk, when it came to understanding economic systems. Hayek criticised economics in general, and economic policymakers in particular, for labouring under a “pretence of knowledge.””

Assuming that the uncertainty paradigm was embraced, Haldane and Nelson consider what the practical implications would be. They have a number of proposals but I will focus on these:

  • agent based modelling
  • simple rather than complex
  • don’t aim to smooth out all volatility

Agent based modelling

Haldane and Nelson note that …

“In response to the crisis, there has been a groundswell of recent interest in modelling economic and financial systems as complex, adaptive networks. For many years, work on agent-based modelling and complex systems has been a niche part of the economics and finance profession. The crisis has given these models a new lease of life in helping explain the discontinuities evident over recent years (for example, Kirman (2011), Haldane and May (2011)).”

In these frameworks, many of the core features of existing models need to be abandoned:

  • The “representative agents” conforming to simple economic laws are replaced by more complex interactions among a larger range of agents
  • The single, stationary equilibrium gives way to Lorenz-like multiple, non-stationary equilibria
  • Linear deterministic models are usurped by non-linear tipping points and phase shifts

Haldane and Nelson note that these types of systems are already being employed by physicists, sociologists, ecologists and the like. Since the paper was written (2012) we have seen some evidence that economists are experimenting with “agent based modelling”. A paper by Richard Bookstaber offers a useful outline of his efforts to apply these models and he has also written a book (“The End of Theory”) promoting this path. There is also a Bank of England paper on ABM worth looking at.

I think there is a lot of value in agent based modelling but a few things impede its wider use. One is that the models don’t offer the kinds of precision that make the DSGE and VaR models so attractive. The other is that they require a large investment of time to build and most practitioners are fully committed to just keeping the existing models going. Finding the budget to pioneer an alternative path is not easy. These are not great arguments in defence of the status quo but they do reflect certain realities of the world in which people work.
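For readers who want a feel for what an agent based model looks like in practice, here is a deliberately crude sketch in the spirit of Kirman’s herding model (it is not Haldane’s, Kirman’s or Bookstaber’s actual code, and all parameter values are arbitrary). The point is simply that a shifting mix of trading strategies, with no external shocks, is enough to generate the clustered volatility and fat tails that equilibrium models struggle to produce.

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_herding_market(n_steps=5000, n_agents=100, eps=0.002, conv=0.2,
                       fundamental=0.0, noise=0.005):
    """A toy market populated by fundamentalists and chartists.

    Each step, one randomly chosen agent either switches strategy spontaneously
    (probability eps) or copies the strategy of a randomly met agent
    (probability conv) - the herding mechanism in Kirman's model. Returns then
    depend on the current mix: chartists chase the last price move while
    fundamentalists pull the price back towards fundamental value."""
    n_chartists = n_agents // 2
    price, last_ret = fundamental, 0.0
    rets = np.empty(n_steps)
    for t in range(n_steps):
        share = n_chartists / n_agents
        i_chartist = rng.random() < share             # the agent who may switch
        j_chartist = rng.random() < share             # the agent they meet
        if rng.random() < eps:
            n_chartists += -1 if i_chartist else 1    # spontaneous switch
        elif i_chartist != j_chartist and rng.random() < conv:
            n_chartists += 1 if j_chartist else -1    # i copies j
        n_chartists = min(max(n_chartists, 0), n_agents)
        w = n_chartists / n_agents
        ret = (0.95 * w * last_ret                        # chartists: momentum
               + (1 - w) * 0.2 * (fundamental - price)    # fundamentalists: pull back
               + noise * rng.standard_normal())
        price, last_ret, rets[t] = price + ret, ret, ret
    return rets

r = toy_herding_market()
excess_kurtosis = ((r - r.mean()) ** 4).mean() / r.var() ** 2 - 3
print(f"excess kurtosis of returns: {excess_kurtosis:.1f} (0 for a normal distribution)")
```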

Simple can be more robust than complex

Haldane and Nelson also advocate simplicity in lieu of complexity as a general rule of thumb for dealing with an uncertain world.

“The reason less can be more is that complex rules are less robust to mistakes in specification. They are inherently fragile. Harry Markowitz’s mean-variance optimal portfolio model has informed millions of investment decisions over the past 50 years – but not, interestingly, his own. In retirement, Markowitz instead used a much simpler equally-weighted asset approach. This, Markowitz believed, was a more robust way of navigating the fat-tailed uncertainties of investment returns (Benartzi and Thaler (2001)).”

I am not a big fan of the Leverage Ratio, which they cite as one example of regulators beginning to adopt simpler approaches, but the broader principle that simple is more robust than complex does ring true.

“The mainstay of regulation for the past 30 years has been more complex estimates of banks’ capital ratios. These are prone to problems of highly-optimised tolerance. In part reflecting that, regulators will in future require banks to abide by a far simpler backstop measure of the leverage ratio. Like Markowitz’s retirement portfolio, this equally-weights the assets in a bank’s portfolio. Like that portfolio, it too will hopefully be more robust to fat-tailed uncertainties.”
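The Markowitz anecdote is easy to illustrate. The toy example below (mine, not the paper’s) fits a mean-variance “optimal” portfolio to a short, noisy sample of returns and compares its true risk-adjusted performance with a simple equally-weighted portfolio; the asset and sample-size assumptions are arbitrary, and the point is only that estimation error makes the optimised weights the more fragile ones.

```python
import numpy as np

rng = np.random.default_rng(1)

# True world: 5 assets with equal expected returns and modest correlation,
# so the equally-weighted portfolio is in fact the efficient one.
n_assets, n_obs = 5, 60
true_mu = np.full(n_assets, 0.05)
true_cov = 0.04 * (0.3 * np.ones((n_assets, n_assets)) + 0.7 * np.eye(n_assets))

sample = rng.multivariate_normal(true_mu, true_cov, size=n_obs)
mu_hat, cov_hat = sample.mean(axis=0), np.cov(sample, rowvar=False)

w_mv = np.linalg.solve(cov_hat, mu_hat)     # "optimal" weights from noisy estimates
w_mv /= w_mv.sum()
w_eq = np.full(n_assets, 1 / n_assets)      # Markowitz-in-retirement

for name, w in [("mean-variance", w_mv), ("equal-weight", w_eq)]:
    ret, risk = w @ true_mu, np.sqrt(w @ true_cov @ w)
    print(f"{name:13s} true return per unit of risk = {ret / risk:.2f}")
```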
Structural separation is another simple approach to the problem of making the system more resilient …

“A second type of simple, yet robust, regulatory rule is to impose structural safeguards on worst-case outcomes. Technically, this goes by the name of a “minimax” strategy (Hansen and Sargent (2011)). The firebreaks introduced into some physical systems can be thought to be playing just this role. They provide a fail-safe against the risk of critical states emerging in complex systems, either in a self-organised manner or because of man-made intervention. These firebreak-type approaches are beginning to find their way into the language and practice of regulation.”

And a reminder about the dangers of over engineering …

“Finally, in an uncertain world, fine-tuned policy responses can sometimes come at a potentially considerable cost. Complex intervention rules may simply add to existing uncertainties in the system. This is in many ways an old Hayekian lesson about the pretence of knowledge, combined with an old Friedman lesson about the avoidance of policy harm. It has relevance to the (complex, fine-tuned) regulatory environment which has emerged over the past few years.”

While we can debate the precise way to achieve simplicity, the basic idea does in my view have a lot of potential to improve the management of risk in general and bank capital in particular. Complex intervention rules may simply add to existing uncertainties in the system and the current formulation of how the Capital Conservation Ratio interacts with the Capital Conservation Buffer is a case in point. These two elements of the capital adequacy framework define what percentage of a bank’s earnings must be retained if the capital adequacy ratio is under stress.

In theory the calculation should be simple and intuitive but anyone who has had to model how these rules work under a stress scenario will know how complex and unintuitive the calculation actually is. The reasons why this is so are probably a bit too much detail for today but I will try to pick this topic up in a future post.
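As a small taste of the mechanics, the sketch below implements just the headline Basel III capital conservation table (the minimum share of earnings that must be retained depending on which quartile of the buffer the CET1 ratio sits in). The function is mine and deliberately ignores the D-SIB and countercyclical add-ons, the definition of distributable profits and the timing questions that make the real calculation so unintuitive.

```python
def conservation_ratio(cet1_ratio, minimum=0.045, buffer=0.025):
    """Minimum share of earnings a bank must conserve (i.e. not distribute)
    given where its CET1 ratio sits within the capital conservation buffer,
    following the headline Basel III quartile table."""
    excess = cet1_ratio - minimum
    quartile = buffer / 4
    if excess <= quartile:
        return 1.00          # bottom quartile of the buffer (or below it)
    if excess <= 2 * quartile:
        return 0.80
    if excess <= 3 * quartile:
        return 0.60
    if excess <= buffer:
        return 0.40
    return 0.00              # buffer fully intact, no constraint

for ratio in (0.050, 0.056, 0.062, 0.068, 0.075):
    print(f"CET1 {ratio:.1%}: conserve at least {conservation_ratio(ratio):.0%} of earnings")
```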

Don’t aim to eliminate volatility

Systems which are adapted to volatility will tend to be stronger than systems that are sheltered from it, or in the words of Haldane and Nelson …

“And the argument can be taken one step further. Attempts to fine-tune risk control may add to the probability of fat-tailed catastrophes. Constraining small bumps in the road may make a system, in particular a social system, more prone to systemic collapse. Why? Because if instead of being released in small bursts pressures are constrained and accumulate beneath the surface, they risk an eventual volcanic eruption.”

I am a big fan of this idea. Nassim Taleb makes a similar argument in his book “Antifragile”, as does Greg Ip in “Foolproof”. It also reflects Nietzsche’s somewhat more poetic dictum “that which does not kill us makes us stronger”.

In conclusion

If you have read this far then thank you. I hope you found it useful and interesting. If you want to delve deeper then you can find my more detailed summary and comments on the paper here. If you think I have any of the above wrong then please let me know.

Swiss money experiment

Last month I posted a review of Mervyn King’s book “The end of Alchemy”. One of the central ideas in King’s book was that all deposits must be backed 100% by liquid, safe assets. It appears that the Swiss are being asked to vote on a proposal labeled “Sovereign Money Initiative” that may not be exactly the same as King’s idea but comes from the same school of money philosophy.

It is not clear that there is any popular support for the proposal but it would be a fascinating money experiment if it did get support. Thanks to Brian Reid for flagging this one to me.

Tony

Looking under the hood – The IRB formula

This post is irredeemably technical so stop here if that is not your interest. If you need to understand some of the mechanics of the formula used to calculate credit risk weighted assets under the advanced Internal Ratings Based (IRB) approach then the BCBS published a paper in 2005 which offers an explanation:

  • describing the economic foundations
  • as well as the underlying mathematical model and its input parameters.

While a lot has changed as a result of Basel III, the models underlying the calculation of Internal Ratings Based (IRB) capital requirements are still based on the core principles agreed under Basel II that are explained in this BCBS paper.

The notes in the linked page below mostly summarise the July 2005 paper with some emphasis (bolded text) and comments (in italics) that I have added. The paper is a bit technical but worth reading if you want to understand the original thinking behind the Basel II risk weights for credit risk.

I initially found the paper useful for revisiting the foundation assumptions of the IRB framework as background to considering the regulatory treatment of Expected Loss as banks transition to IFRS9. The background on how the RW was initially intended to cover both Expected and Unexpected Loss, but was revised such that capital was only required to cover Unexpected Loss, is especially useful when considering the interaction of loan loss provisioning with capital requirements.
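For readers who want to see the formula itself, here is a minimal transcription of the advanced IRB risk weight for corporate exposures as explained in the BCBS note (my own sketch; it omits the SME firm-size adjustment and any supervisory scaling factors). The subtraction of PD × LGD is the point made above: capital is calibrated to cover unexpected loss only.

```python
from math import exp, log, sqrt
from statistics import NormalDist

N = NormalDist().cdf          # standard normal CDF
G = NormalDist().inv_cdf      # its inverse

def irb_corporate_rw(pd, lgd, maturity=2.5):
    """Advanced IRB risk weight per unit of EAD for corporate exposures (ASRF
    model): conditional expected loss at the 99.9th percentile of the single
    systematic factor, less expected loss, with a maturity adjustment."""
    pd = max(pd, 0.0003)                                    # regulatory PD floor
    r = (0.12 * (1 - exp(-50 * pd)) / (1 - exp(-50))        # asset correlation
         + 0.24 * (1 - (1 - exp(-50 * pd)) / (1 - exp(-50))))
    b = (0.11852 - 0.05478 * log(pd)) ** 2                  # maturity slope
    k = lgd * N((G(pd) + sqrt(r) * G(0.999)) / sqrt(1 - r)) - pd * lgd
    k *= (1 + (maturity - 2.5) * b) / (1 - 1.5 * b)         # maturity adjustment
    return 12.5 * k                                         # RWA per unit of EAD

print(irb_corporate_rw(pd=0.01, lgd=0.45))   # ~0.92, i.e. a risk weight of roughly 90%
```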

Reading the BCBS paper has also been useful for thinking through a range of related issues including:

  • The rationale for, and impact of, prudential conservatism in setting the risk parameters used in the IRB formula
  • The cyclicality of a risk sensitive capital requirement (and potential for pro cyclicality) and what might be done to mitigate the risk of pro-cyclical impacts on the economy

If you have read this far then my summary of the BCBS paper and my comments /observations can be found here (and thank you).

I am not a credit risk model expert, so the summary of the paper and my comments must be read with that in mind. I did this to help me think through some of the issues with bank capital adequacy. Hopefully others will find the notes useful. If you see something wrong or something you disagree with then let me know.

“The Great Divide” by Andrew Haldane

This speech by Andrew Haldane (Chief Economist at the Bank of England) was given in 2016 but is still worth reading for anyone interested in the question of what role banks play in society and why their reputation is not what it once was. Some of my long-term correspondents will be familiar with the paper and may have seen an earlier draft of this post.

“The Great Divide” refers to a gap between how banks perceive themselves and how they are perceived by the community. Haldane references a survey the BOE conducted in which the most common word used by banks to describe themselves was “regulated” while “corrupt” was the community choice closely followed by “manipulated”, “self-serving”, “destructive” and “greedy”. There is an interesting “word cloud” chart in the paper representing this gap in perception.

While the focus is on banks, Haldane makes the point that the gap in perceptions reflects a broader tension between the “elites” and the common people. He does not make this explicit connection but it seemed to me that the “great divide” he was referencing could also be argued to be manifesting itself in the increasing support for populist political figures purporting to represent the interests of the common people against career politicians. This broader “great divide” idea seemed to me to offer a useful framework for thinking about the challenges the banking industry is facing in rebuilding trust.

Haldane uses this “great divide” as a reference for discussing

  • The crucial role finance plays in society
  • The progress made so far in restoring trust in finance
  • What more needs to be done

The crucial role finance plays in society

Haldane argues that closing the trust deficit between banks and society matters for two reasons

  • because a well functioning financial system is an essential foundation for a growing and well functioning economy – to quote Haldane “that is not an ideological assertion from the financial elite; it is an empirical fact”
  • but also because the downside of a poorly functioning financial system is so large

Haldane uses the GFC to illustrate the downside in terms of the destruction of the value of financial capital and physical capital but he introduces a third form of capital, “social capital” that he argues may matter every bit as much to the wealth and well being of society. He defines social capital as the “relationships, trust and co-operation forged between different groups of people over time. It is the sociological glue that binds diverse societies into a cohesive whole”. The concept of “trust” is at the heart of Haldane’s definition of social capital.

Haldane cites evidence that trust plays an important role at both the micro and macro level in value creation and growth and concludes that “… a lack of trust jeopardises one of finance’s key societal functions – higher growth”.

In discussing these trends, Haldane distinguishes “personalised trust” and “generalised trust”. The former refers to mutual co-operation built up through repeated personal interactions (Haldane cites examples like visits to the doctor or hairdresser) while the latter is attached to an identifiable but anonymous group (Haldane cites trust in the rule of law, or government or Father Christmas).

He uses this distinction to explore why banks have lost the trust of the community:

He notes that banking was for most of its history a relationship based business. The business model was not perfect but it did deliver repeated interactions with customers that imbued banking with personalised trust. At the same time its “mystique” (Haldane’s term) meant that banking maintained a high degree of generalised trust as well.

He cites the reduction in local branches, a common strategy pre-GFC, as one of the changes that delivered lower costs but reduced personal connections, thereby reducing personalised trust. For a while, the banking system could reap the efficiency gains while still relying on generalised trust but the GFC subsequently undermined the generalised trust in the banking system. This generalised trust has been further eroded by the continued run of banking scandals that convey the sense that banks do not care about their customers.

What can be done to restore trust in finance

He notes the role that higher capital and liquidity have played but that this is not enough in his view. He proposes three paths

  1. Enhanced public education
  2. Creating “Purpose” in banking
  3. Communicating “Purpose” in banking

Regarding public education, there is a telling personal anecdote he offers on his experience with pensions. He describes himself as “moderately financially literate” but follows with “Yet I confess to not being able to make the remotest sense of pensions. Conversations with countless experts and independent financial advisors have confirmed for me only one thing – that they have no clue either”. This may be dismissed as hyperbole but it does highlight that most people will be less financially literate than Haldane and are probably poorly equipped to deal with the financial choices they are required to make in modern society. I am not sure that education is the whole solution.

Regarding “purpose”, Haldane’s main point seems to be that there is too much emphasis on shareholder value maximisation and not enough balance. This is an issue that seems to be amplified by the UK Companies Act, which requires directors to treat shareholder interests as their primary objective. To the best of my knowledge, Australian law does not have an equivalent explicit requirement to put shareholders first but we do grapple with the same underlying problem. Two of my recent posts (“The World’s Dumbest Idea” and “The Moral Economy”) touch on this issue.

Regarding communicating purpose, Haldane cites some interesting evidence that the volume of information provided by companies is working at cross purposes with actual communication with stakeholders. Haldane does not make the explicit link but Pillar 3 clearly increases the volume of information provided by banks. The points raised by Haldane imply (to me at least) that Pillar 3 might actually be getting in the way of communicating clearly with stakeholders.

This is a longish post but I think there is quite a lot of useful content in the speech so I would recommend it.

“Between Debt and the Devil: Money, Credit and Fixing Global Finance” by Adair Turner (2015)

This book is worth reading, if only because it challenges a number of preconceptions that bankers may have about the value of what they do. The book also benefits from the fact that the author was the head of the UK Financial Services Authority during the GFC and thus had a unique inside perspective from which to observe what was wrong with the system. Since leaving the FSA, Turner has reflected deeply on the relationship between money, credit and the real economy and argues that, notwithstanding the scale of change flowing from Basel III, more fundamental change is required to avoid a repeat of the cycle of financial crises.

Overview of the book’s main arguments and conclusions

Turner’s core argument is that increasing financial intensity, represented by credit growing faster than nominal GDP, is a recipe for recurring bouts of financial instability.
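The arithmetic behind “financial intensity” is worth making explicit (the growth rates below are purely illustrative, not Turner’s): whenever credit grows faster than nominal GDP, the credit-to-GDP ratio compounds upwards.

```python
# Credit growing at 10% a year against nominal GDP growth of 5% a year roughly
# doubles the credit-to-GDP ratio in 15 years: (1.10 / 1.05) ** 15 ≈ 2.0
credit_to_gdp = 1.0
for year in range(15):
    credit_to_gdp *= 1.10 / 1.05
print(round(credit_to_gdp, 2))   # ≈ 2.01
```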

Turner builds his argument by first considering the conventional wisdom guiding much of bank prudential regulation prior to GFC, which he summarises as follows:

  • Increasing financial activity, innovation and “financial deepening” were beneficial forces to be encouraged
  • More complete and liquid markets were believed to ensure a more efficient allocation of capital, thereby fostering higher productivity
  • Financial innovations made it easier to provide credit to households and companies thereby enabling more rapid economic growth
  • More sophisticated risk measurement and control meanwhile ensured that the increased complexity of the financial system was not achieved at the expense of stability
  • New systems of originating and distributing credit, rather than holding it on bank balance sheets, were believed to disperse risk into the hands of those best placed to price and manage it

Some elements of Turner’s account of why this conventional wisdom was wrong do not add much to previous analysis of the GFC. He notes, for example, the conflation of the concepts of risk and uncertainty that weakened the risk measurement models the system relied on and concludes that risk based capital requirements should be foregone in favour of a very high leverage ratio requirement. However, in contrast to other commentators who attribute much of the blame to the moral failings of bankers, Turner argues that this is a distraction. While problems with the way that bankers are paid need to be addressed, Turner argues that the fundamental problem is that:

  • modern financial systems left to themselves inevitably create debt in excessive quantities,
  • in particular, the system tends to create debt that does not fund new capital investment but rather the purchase of already existing assets, above all real estate.

Turner argues that the expansion of debt funding the purchase or trading of existing assets drives financial booms and busts, while the debt overhang left by the boom explains why the recovery from a financial crisis is typically anaemic and protracted. Much of this analysis seems to be similar to ideas developed by Hyman Minsky, while the slow pace of recovery in the aftermath of the GFC reflects a theme that Reinhart and Rogoff observed in their book “This Time is Different”, which analyses financial crises over many centuries.

The answer, Turner argues, is to build a less credit intensive growth model. In pursuing this goal, Turner argues that we also need to understand and respond to the implications of three underlying drivers of increasing credit intensity;

  1. the increasing importance of real estate in modern economies,
  2. increasing inequality, and
  3. global current account imbalances.

Turner covers a lot of ground, and I do not necessarily agree with everything in his book, but I do believe his analysis of what is wrong with the system is worth reading.

Let me start with an argument I do not find compelling; i.e. that risk based capital requirements are unreliable because they are based on a fundamental misunderstanding of the difference between risk (which can be measured) and uncertainty (which cannot):

  • Distinguishing between risk and uncertainty is clearly a fundamental part of understanding risk and Turner is not alone in emphasising its importance
  • I believe that means that we should treat risk based capital requirements with a healthy degree of scepticism and a clear sense of their limitations but that does not render them entirely unreliable especially when we are using them to understand relative differences in risk and to calibrate capital buffers
  • The obvious problem with non-risk based capital requirements is that they create incentives for banks to take higher risk that may eventually offset the supposed increase in soundness attached to the higher capital
  • It may be that Turner discounts this concern because he envisages a lower credit growth/intensity economy delivering less overall systemic risk, or because he envisages a more active role for the public sector in determining what kinds of assets banks lend against; i.e. his support for higher capital may stem mostly from the fact that it reduces the capacity of private banks to generate credit growth

While advocating much higher capital, Turner does seem to part company with M&M purists by expressing doubt that equity investors will be willing to accept deleveraged returns. His reasoning is that returns to equity investments need a certain threshold return to be “equity like” while massively deleveraged ROE still contains downside risks that are unacceptable to debt investors.

Turning to the arguments which I think raise very valid concerns and deserve serious attention.

Notwithstanding my skepticism regarding a leverage ratio as the solution, the arguments he makes about the dangers of excessive credit growth resonate very strongly with what I learned during my banking career. Turner is particularly focussed on the downsides of applying excessive debt to the financing of existing assets, real estate in particular. The argument seems to be similar to (if not based on) the work of Hyman Minsky.

Turner’s description of the amount of money that banks can create as being “infinitely elastic” seems an overstatement to me (especially in the Australian context with the Net Stable Funding Ratio (NSFR) weighing on the capacity to grow the balance sheet) but the general point he is making about the way that credit fuelled demand for a relatively inelastic supply of desirable residential property tends to result in inflated property values with no real social value rings true.

What banks can do about this remains an open question given that resolving the problem with inelastic supply of property is outside their direct control but it is obviously important to understand the dynamics of the market underpinning their largest asset class and it may help them engage more constructively with public policy debates that seek to address the problem.

Turner’s analysis of the downsides of easy monetary policy (the standard response to economic instability) also rings true. He identifies the fact that lower interest rates tend to result in inflated asset values (residential property in particular given its perceived value as a safe asset) which do not address the fundamental problem of over-indebtedness and may serve to increase economic inequality. His discussion of the impact of monetary policy and easy credit on economic inequality is also interesting. The banks providing the credit in the easy money environment may not necessarily be taking undue risk and prudential supervisors have tools to ensure sound lending standards are maintained if they do believe there is a problem with asset quality. What may happen however is that the wealthier segments of society benefit the most under easy money because they have the surplus cash flow to buy property at inflated values while first home buyers are squeezed out of the market. Again, the banks’ capacity to address the problem may be limited but Turner’s analysis prompted me to reflect on what increasing economic inequality might mean for bank business models.

In addition to much higher bank capital requirements, Turner’s specific recommendations for moving towards a less credit intensive economy include:

  • Government policies related to urban development and the taxation of real estate
  • Changing tax regimes to reduce the current bias in favour of debt over equity financing (note that Australia is one of the few countries with a dividend imputation system that does reduce the bias to debt over equity)
  • Broader macro prudential powers for central banks, including the power to impose much larger countercyclical capital requirements
  • Tough constraints on the ability of the shadow banking system to create credit and money equivalents
  • Using public policy to produce different allocations of capital than would result from purely market based decisions; in particular, deliberately leaning against the bias towards real estate embedded in market signals and instead favouring other “potentially more socially valuable forms of credit allocation”
  • Recognising that the traditional easy monetary policy response to an economic downturn (or the ultra-easy response to a financial crisis such as the GFC) is better than doing nothing, but comes at the cost of reigniting the growth in private credit that generated the initial problem, creating incentives for risky financial engineering and exacerbating economic inequality by inflating asset prices.

For those who want to dig deeper, I have gone into a bit more detail here on what Turner has to say about the following topics:

  • The way in which inefficient and irrational markets leave the financial system prone to booms and busts
  • The dangers of debt contracts sets out how certain features of these contracts increase the risk of instability and hamper the recovery
  • Too much of the wrong sort of debt describes features of the real estate market that make it different from other asset classes
  • Liberalisation, innovation and the credit cycle on steroids recaps the philosophy that drove the deregulation of financial markets and what Turner believes to be the fundamental flaws in that approach; in particular, his conclusion that the amount of credit created and its allocation is “… too important to be left to bankers…”
  • Private credit and money creation offers an outline of how bank deposits evolved to play an increasing role (the key point being that it was a process of evolution rather than overt public policy design choices)
  • Credit financed speculation discusses the ways in which credit in modern economies tends to be used to finance the purchase of existing assets, in particular real estate, and the issues that flow from this.
  • Inequality, credit and more inequality sets out some ways in which the extension of credit can contribute to increasing economic inequality
  • Capital requirements sets out why Turner believes capital requirements should be significantly increased and why capital requirements (i.e. risk weights) for some asset classes (e.g. real estate) should be calibrated to reflect the social risk of the activity and not just the private risks captured by bank risk models
  • Turner’s defence against the argument that his proposals are anti-market and anti-growth.

“The World’s Dumbest Idea” by James Montier of GMO.

Anyone interested in the question of shareholder value will, I think, find this paper by James Montier interesting.

The focus of the paper is to explore problems with elevating Shareholder Value to be the primary objective of a firm. Many companies are trying to achieve a more balanced approach but the paper is still useful background given that some investors appear to believe that shareholder value maximisation is the only valid objective a company should pursue. The paper also touches on the question of how increasing inequality is impacting the environment in which we operate.

While conceding that the right incentives can prompt better performance, JM argues that there is a point where increasing the size of the reward actually leads to worse performance:

“From the collected evidence on the psychology of incentives, it appears that when incentives get too high people tend to obsess about them directly, rather than on the task in hand that leads to the payout. Effectively, high incentives divert attention away from where it should be”

The following extracts will give you a sense of the key points and whether you want to read the paper itself.

  • “Let’s now turn to the broader implications and damage done by the single-minded focus on SVM. In many ways the essence of the economic backdrop we find ourselves facing today can be characterized by three stylized facts: 1) declining and low rates of business investment; 2) rising inequality; and 3) a low labour share of GDP (evidenced by Exhibits 7 through 9).” — Page 7 —
  • “This preference for low investment tragically “makes sense” given the “alignment” of executives and shareholders. We should expect SVM to lead to increased payouts as both the shareholders have increased power (inherent within SVM) and the managers will acquiesce as they are paid in a similar fashion. As Lazonick and Sullivan note, this led to a switch in modus operandi from “retain and reinvest” during the era of managerialism to “downsize and distribute” under SVM.” — Page 9 —
  • “This diversion of cash flows to shareholders has played a role in reducing investment. A little known fact is that almost all investment carried out by firms is financed by internal sources (i.e., retained earnings). Exhibit 13 shows the breakdown of the financing of gross investment by source in five-year blocks since the 1960s. The dominance of internal financing is clear to see (a fact first noted by Corbett and Jenkinson in 1997)” — Page 10 —
  • “The obsession with returning cash to shareholders under the rubric of SVM has led to a squeeze on investment (and hence lower growth), and a potentially dangerous leveraging of the corporate sector” — Page 11 —
  • “The problem with this (apart from being an affront to any sense of fairness) is that the 90% have a much higher propensity to consume than the top 10%. Thus as income (and wealth) is concentrated in the hands of fewer and fewer, growth is likely to slow significantly. A new study by Saez and Zucman (2014) … shows that 90% have a savings rate of effectively 0%, whilst the top 1% have a savings rate of 40%…. ultimately creating a fallacy of composition where they are undermining demand for their own products by destroying income).” — Page 13 —
  • “Only by focusing on being a good business are you likely to end up delivering decent returns to shareholders. Focusing on the latter as an objective can easily undermine the former. Concentrate on the former, and the latter will take care of itself.” — Page 14 —
  • “… management guru Peter Drucker was right back in 1973 when he suggested “The only valid purpose of a firm is to create a customer.”” — Page 14 —

People want money

This post draws on a FT article titled “People want money” which led me to an interesting paper by Gary Gorton and George Pennacchi titled “Financial Intermediaries and Liquidity Creation”.  I took the following points away from the Gorton/Pennacchi paper:

  • The modern financial markets based economy relies on “money” to facilitate the bulk of its economic activity and bank deposits are the dominant form of money
  • There is however a continuous search for ways to expand the domain of what matches the liquidity of “money” while offering a better return
  • History has seen a variety of instruments and commodities operate as money but a critical issue is whether they retain their “moneyness” during adverse economic conditions (something I think the cryptocurrency advocates don’t fully grasp)
  • Gorton/Pennacchi argue that the liquidity of an instrument and hence its capacity to be accepted and used as money depends on the ability of uninformed agents to trade it without fear of loss; i.e. the extent to which the value of the instrument is insulated from any adverse information about the counterparty – This I think is their big idea
  • The role of a bank has traditionally been characterised as one of credit intermediation between savers and borrowers but Gorton/Pennacchi argue that the really critical role of banks is to provide a liquid asset in the form of bank deposits that serves as a form of money
  • Note that other functions offered by banks can be replicated by non-banks (e.g. non-banks are increasingly providing payment functions for customers and offering loans)  but the capacity to issue liabilities that serve as money is unique to banks
  • The challenge is that banks tend to hold risky assets and to be opaque which undermines the liquidity of bank deposits/money (as an aside, Gorton/Pennacchi offer some interesting historical context in which opacity was useful because people trusted banks and the opacity helped shield them from any information which might undermine this trust)
  • There are a variety of ways to make bank deposits liquid in the sense that Gorton/Pennacchi define it (i.e. insensitive to adverse information about the bank) but they argue for solutions where depositors have a sufficiently deep and senior claim on the assets of the bank that any volatility in their value is of no concern to them
  • This of course is what deposit insurance and giving deposits a preferred claim in the bank loss hierarchy do (note that giving insured deposits a preferred claim on a bank’s assets also means the government can underwrite deposit insurance with very little risk of loss)
  • A lot of the regulatory change we have seen to date (more equity, less short term funding) contributes to that outcome without necessarily being expressed in terms of improving the liquidity of bank deposits in the way Gorton/Pennacchi frame the desired outcome

A lot of the above is not necessarily new, but I do see some interesting connections between the role of banks in the money creation process and the debate about what the optimum capital structure for a bank should be:

  • It has been argued that more (and more) equity is a costless solution to the problem of how much is enough because the cost of equity will decline as the percentage of equity in the balance sheet increases
  • This conclusion depends in turn on the Modigliani and Miller (M&M) thesis that the value of a firm is independent of its financing structure
  • The Money Creation analysis however shows that banks are in fact unique (amongst private companies) in that one of the things they produce is money (or bank deposits to be more precise) – Gorton/Pennacchi explicitly call this out as a factor that means that M&M does not apply to banks in the simplistic way proponents of very high capital assert (most other critiques of higher bank capital just focus on the general limitations of M&M)
  • If you accept Gorton/Pennacchi’s argument that bank deposits need to be risk free in the minds of the users if they are to serve as money (the argument makes sense to me) then it follows that the cost of deposits does not change incrementally with changes in the financing structure in the way that M&M assume
  • In practice, bank deposits either are assumed to be risk free or they lose that risk free status – the trade-off is binary, one or the other, not the smooth continuum assumed by M&M
  • That implies that all the real risk in a bank balance sheet has to reside in other parts of the loss hierarchy (i.e. equity, other loss absorbing capital and senior instruments) – the stylised sketch after this list illustrates the mechanics
  • And this will be even more so under Basel III because the government is developing the capacity to impose losses on all these stakeholders without having to resort to a formal bankruptcy and liquidation process (i.e. via bail-in and TLAC)
  •  Critics of bail-in argue that you can’t impose losses on liabilities but here I think they are conflating what you can’t do to depositors (where I would very much agree) with what can and does happen to bondholders relatively frequently
  • Bondholders have faced losses of principal lending to a range of counterparties (including sovereigns) so I don’t see why banks should be special in this regard – what matters is that bondholders understand the risk and price it appropriately (including not lending as much as they might otherwise have done)
  • I would also argue that imposing the risk of bail-in onto bondholders is likely to be a much more effective risk discipline than requiring banks to hold arbitrarily large amounts of equity that mean they struggle to earn an adequate equity-like return
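
To make the loss hierarchy point concrete, here is a minimal sketch using made-up numbers (the funding stack below is illustrative only, not drawn from Gorton/Pennacchi or any actual bank). It simply writes an asset loss down the hierarchy, junior layers first, and shows that deposits are untouched until the whole buffer of equity, other loss absorbing capital and bail-in debt is exhausted.

```python
# Illustrative loss-hierarchy sketch (made-up numbers, not any actual bank's funding stack).
# If deposits are to stay "money" (risk free to the holder), all of the real asset risk
# has to be absorbed by the more junior layers before depositors are ever touched.

def allocate_loss(loss: float, layers: dict) -> dict:
    """Write a loss down the hierarchy, most junior layer first; return the loss per layer."""
    losses = {}
    remaining = loss
    for name, size in layers.items():
        hit = min(remaining, size)
        losses[name] = hit
        remaining -= hit
    return losses

# Stylised funding stack for 100 of assets, ordered most junior to most senior
layers = {"equity": 8, "other loss absorbing capital": 4, "senior bail-in debt": 8, "deposits": 80}

for asset_loss in (10, 18, 25):
    print(f"Asset loss {asset_loss}: {allocate_loss(asset_loss, layers)}")

# Asset loss 10: equity 8, other loss absorbing capital 2, senior bail-in debt 0, deposits 0
# Asset loss 18: equity 8, other loss absorbing capital 4, senior bail-in debt 6, deposits 0
# Asset loss 25: equity 8, other loss absorbing capital 4, senior bail-in debt 8, deposits 5
# Depositors are insulated from news about asset values for as long as the 20 of more junior
# claims holds, which is the property Gorton/Pennacchi argue makes deposits usable as money.
```

As I read it, this is also the arithmetic that bail-in and TLAC formalise: bondholders, not depositors, stand next in line once equity is gone.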

Gorton and Pennacchi’s paper did not explicitly raise this point but I also see an interesting connection with the Basel III Net Stable Funding Ratio (NSFR) requirement that does not get much attention:

  • The NSFR places great value on having a high level of depositor funding, but the greater the share of deposits in the liability stack, the more exposed those deposits are to any volatility in the value of the bank’s assets
  • So holding too many deposits might in fact be counterproductive and less resilient than an alternative structure with slightly more long term wholesale funding and fewer retail deposits (the sketch after this list works through a stylised example)
  • This line of analysis also calls into question the logic underpinning the Open Bank Resolution regime in NZ where deposits can be bailed in pro rata with senior unsecured liabilities
  • The NZ regime allows some de minimis value of deposits to be excluded from bail in but there is no depositor preference such as Australia has under the Banking Act
  • The RBNZ seems to assume that applying market discipline to deposits is desirable on Moral Hazard grounds but Gorton/Pennacchi’s thesis seems to me to imply the exact opposite
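
Extending the same stylised waterfall, here is a second sketch (again with invented numbers, and assuming for simplicity that deposits rank above wholesale funding rather than being bailed in pro rata as under the NZ Open Bank Resolution regime) comparing how the same asset loss lands on depositors under two different funding mixes.

```python
# Illustrative comparison of two funding mixes for the same 100 of assets (invented numbers).
# Assumes deposits rank above equity and wholesale funding, i.e. depositors only lose once
# the buffer below them is exhausted (the depositor-preference case, not the pro rata
# bail-in of the NZ Open Bank Resolution regime).

def depositor_loss(asset_loss: float, equity: float, wholesale: float) -> float:
    """Loss passed through to depositors once equity and wholesale funding are wiped out."""
    return max(0.0, asset_loss - equity - wholesale)

asset_loss = 15.0
funding_mixes = {
    "deposit heavy (92 deposits / 4 wholesale / 4 equity)": (4.0, 4.0),
    "more wholesale (80 deposits / 12 wholesale / 8 equity)": (8.0, 12.0),
}

for label, (equity, wholesale) in funding_mixes.items():
    print(f"{label}: depositor loss = {depositor_loss(asset_loss, equity, wholesale)}")

# deposit heavy:  depositor loss = 7.0  (buffer of 8 is smaller than the 15 asset loss)
# more wholesale: depositor loss = 0.0  (buffer of 20 absorbs the whole loss)
# The deposit-heavy structure scores well on the NSFR but leaves its deposits less
# "money like" in the Gorton/Pennacchi sense, absent deposit insurance or preference.
```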

Tell me what I am missing …