A sceptical look at ESG

Anyone with more than a casual interest in business will be familiar with the increased focus on Environmental, Social and Governance (ESG) issues. There are sound arguments being made on both sides of the debate but I will admit upfront that I approach the topic with a somewhat ESG positive bias. Given my bias, it is all the more important to pay attention to what the sceptics are calling out rather than looking for affirmation amongst the true believers.

A post by Aswath Damodaran titled “Sounding good or Doing good? A Skeptical Look at ESG” is one of the better contributions to the ESG debate that I have encountered. I discussed one of his earlier contributions to the debate here and it is clear that he is not a fan of ESG. I am still working through his arguments but I like the analytical framework he employs and the way in which he supports his arguments with evidence.

I intend to do a couple of posts digging into the ESG debate using Damodaran’s post and a few other sources but want to start by laying out his arguments with some very limited comments.

Damodaran starts by framing ESG as part of a tradition of business ideas that have tended to prove to be more noise than substance, describing the ESG “sales pitch” as follows

“Companies that improve their social goodness standing will not only become more profitable and valuable over time, we are told, but they will also advance society’s best interests, thus resolving one of the fundamental conflicts of private enterprise, while also enriching investors”

There is no doubt that ESG, like many other business ideas, is prone to being over-hyped. There is also room to take issue with whether this is a fair description of the ESG movement as a whole. My gut feel is that the “sales pitch” version is not representative of ESG advocates who genuinely believe that ESG can address problems in the way markets currently operate, but it will be more productive to focus on the specific weaknesses that Damodaran discusses.

Damodaran starts with the problem of measurement

“Any attempts to measure environment and social goodness face two challenges. 

– The first is that much of social impact is qualitative, and developing a numerical value for that impact is difficult to do. 

– The second is even trickier, which is that there is little consensus on what social impacts to measure, and the weights to assign to them.”  

Assuming the measurement issues can be resolved, the second problem is identifying exactly how incorporating ESG factors into the business model or strategy contributes to improving the value of a company. Damodaran uses the following generic model of value drivers to explore this question

Figure 1: The Drivers of Value

Using this framework, Damodaran identifies two ways in which a company can derive benefits from incorporating ESG principles into its business strategy

  1. Goodness is rewarded – i.e. companies behave in a socially responsible way because it creates positive outcomes for their business
  2. Badness is punished – i.e. companies behave in a socially responsible way because bad behaviour is punished

Damodaran also identifies a third scenario in which “The bad guys win”

“In this scenario, bad companies mouth platitudes about social responsibility and environmental consciousness without taking any real action, but customers buy their products and services, either because they are cheaper or because of convenience, employees continue to work for them because they can earn more at these companies or have no options, and investors buy their shares because they deliver higher profits. As a result, bad companies may score low on corporate responsibility scales, but they will score high on profitability and stock price performance.”

Damodaran argues that the evidence supports the following conclusions:

  1. A weak link to profitability

“There are meta studies (summaries of all other studies) that  summarize hundreds of ESG research papers, and find a small positive link between ESG and profitability, but one that is very sensitive to how profits are measured and over what period, leading one of these studies to conclude that “citizens looking for solutions from any quarter to cure society’s pressing ills ought not appeal to financial returns alone to mobilize corporate involvement”. Breaking down ESG into its component parts, some studies find that environment (E) offered the strongest positive link to performance and social (S) the weakest, with governance (G) falling in the middle.”

  2. A stronger link to funding costs

“Studies of ‘sin’ stocks, i.e., companies involved in businesses such as producing alcohol, tobacco, and gaming, find that these stocks are less commonly held by institutions, and that they face higher costs for funding, from equity and debt. The evidence for this is strongest in sectors like tobacco (starting in the 1990s) and fossil fuels (especially in the last decade), but these findings come with a troubling catch. While these companies face higher costs, and have lower value, investors in these companies will generate higher returns from holding these stocks.”

  3. Some evidence that ESG focussed companies do reduce their risk of failure or exposure to disaster risk

“An alternate reason why companies would want to be “good” is that “bad” companies are exposed to disaster risks, where a combination of missteps by the company, luck, and a failure to build in enough protective controls (because they cost too much) can cause a disaster, either in human or financial terms. That disaster can not only cause substantial losses for the company, but the collateral reputation damage created can have long term consequences. One study created a value-weighted portfolio of controversial firms that had a history of violating ESG rules, and reported negative excess returns of 3.5% on this portfolio, even after controlling for risk, industry, and company characteristics. The conclusion in this study was that these lower excess returns are evidence that being socially irresponsible is costly for firms, and that markets do not fully incorporate the consequences of bad corporate behavior. The push back from skeptics is that not all firms that behave badly get embroiled in controversy, and it is possible that looking at just firms that are controversial creates a selection bias that explains the negative returns.”

Damodaran sums up his argument

“There is a weak link between ESG and operating performance (growth and profitability), and while some firms benefit from being good, many do not. Telling firms that being socially responsible will deliver higher growth, profits and value is false advertising. The evidence is stronger that bad firms get punished, either with higher funding costs or with a greater incidence of disasters and shocks. ESG advocates are on much stronger ground telling companies not to be bad, than telling companies to be good. In short, expensive gestures by publicly traded companies to make themselves look “good” are futile, both in terms of improving performance and delivering returns.”

There is a lot more to say on this topic. The evidence that certain types of companies do get punished for failing to be socially responsible is especially interesting. I see a fair degree of cynicism applied to the ESG stance adopted by the Australian banks but I suspect they are a good example of the type of company that will in fact benefit from making real investments in socially responsible business strategies.

Tony – From the Outside

What does the “economic perspective” add to an ICAAP?

… the question I reflected on as I read the ECB Report on Banks’ ICAAP Practices (August 2020).

That I should be asking the question at all is curious given the years I spent working with economic capital, but there was something in the ECB position that I was not comfortable with. There is nothing particularly wrong in the ways that the ECB envisages that an economic perspective can add value to a bank’s ICAAP. The problem (for me), I came to realise, is more the lack of emphasis on recognising the fundamental limitations of economic models. In short, my concern is that the detailed focus on risk potentially comes at the expense of an equally useful consideration of the ways in which a bank is subject to radical uncertainty.

The rest of this post offers an overview of what the ECB survey observed and some thoughts on the value of explicitly incorporating radical uncertainty into an ICAAP.

The ECB report sample set

The ECB report, based on a survey of 37 significant institutions it supervises, assesses the extent to which these organisations were complying (as at April 2019) with ECB expectations for how the ICAAP should be constructed and executed. The selected sample focuses on the larger (and presumably more sophisticated) banks, including all global systemically important banks supervised by the ECB. I am straying outside my area of expertise (Australian bank capital management) in this post but there is always something to learn from considering another perspective.

The ECB assessment on ICAAP practices

The ECB notes that progress has been made in some areas of the ICAAP. In particular: all banks in the survey have risk identification processes in place; they produce summary documents (“Capital Adequacy Statements” in ECB parlance) that enable bank management (not just the technical specialists) to engage with and take responsibility for the capital strength of their bank; and they incorporate stress testing into their capital planning process.

The ECB believes however that there is still a lot of room for improvement. The general area of concern is that the banks it supervises are still not paying sufficient attention to the question of business continuity. The ECB cites three key areas as being particularly in need of improvement if the ICAAPs are to play their assigned role in effectively contributing to a bank’s continuity:

  1. Data quality
  2. The application of the “Economic Perspective” in the ICAAP
  3. Stress testing

The value of building the ICAAP on sound data and testing the outcomes of the process under a variety of severe stress scenarios is, I think, uncontentious.

The value the economic perspective contributes is less black and white. Like many things in life, the challenge is to get the balance right. My perspective is that economic models are quite useful but they are far from a complete answer and dangerous when they create an illusion of knowledge, certainty and control.

The economic internal perspective

The ECB’s guide to the ICAAP defines the term “economic internal perspective” as follows:

“Under this perspective, the institution’s assessment is expected to cover the full universe of risks that may have a material impact on its capital position from an economic perspective. In order to capture the undisguised economic situation, this perspective is not based on accounting or regulatory provisions. Rather, it should take into account economic value considerations for all economically relevant aspects, including assets, liabilities and risks. …. The institution is expected to manage economic risks and assess them as part of its stress-testing framework and its monitoring and management of capital adequacy”

ECB Guide to the internal capital adequacy assessment process (ICAAP) – Principles, November 2018 (Paragraph 49 / pages 18-19)

So far so good – the key points seem (to me) to be quite fair as statements of principle.

The ECB sees value in looking beyond the accounting and regulatory measures that drive the reported capital ratios (the “normative perspective” in ECB terminology) and wants banks to consider “the full universe of risks that may have a material impact on its capital position”. The ECB Report also emphasises the importance of thinking about capital from a “business continuity” perspective and cites the “… unjustified inclusions of certain capital components (e.g. minority interests, Additional Tier 1 … or Tier 2 … instruments) … which can inflate the internal capital figures” as evidence of banks failing to meet this expectation. Again a fair point in my view.

These are all worthy objectives but I wonder

  • firstly about the capacity of economic capital models to reliably deliver the kinds of insights the ECB expects and
  • secondly whether there are more cost effective ways to achieve similar outcomes.

The value of a different perspective

As a statement of principle, bringing a different perspective to bear clearly has value. The examples that the ECB cites for ways in which the economic perspective can inform and enhance the normative perspective are all perfectly valid and potentially useful. My concern is that the ECB seems to be pursuing an ideal state in which an ICAAP can, with sufficient commitment and resources, achieve a degree of knowledge that enables a bank to control its future.

Business continuity is ultimately founded on a recognition that there are limits to what we can know about the future and I side with the risk philosophy that no amount of analysis will fundamentally change this.
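To make that concern concrete, the sketch below is a deliberately simple illustration of my own (it is not drawn from the ECB report or any particular bank’s model) of an economic-capital-style calculation. Both candidate loss models are calibrated to the same mean and standard deviation of annual loss, yet the 99.9th percentile, the kind of number an economic perspective typically produces, differs materially depending on the tail assumption the modeller happens to choose.

```python
import math
import random

random.seed(42)

MEAN_LOSS = 100.0   # expected annual loss (illustrative units)
SD_LOSS = 40.0      # standard deviation of annual loss
N = 200_000         # simulation trials
CONF = 0.999        # confidence level typical of economic capital models

def normal_losses(n):
    """Thin-tailed candidate: normally distributed losses (floored at zero)."""
    return [max(0.0, random.gauss(MEAN_LOSS, SD_LOSS)) for _ in range(n)]

def lognormal_losses(n):
    """Heavier-tailed candidate: lognormal losses matched to the same mean and sd."""
    sigma2 = math.log(1.0 + (SD_LOSS / MEAN_LOSS) ** 2)
    mu = math.log(MEAN_LOSS) - sigma2 / 2.0
    return [random.lognormvariate(mu, math.sqrt(sigma2)) for _ in range(n)]

def economic_capital(losses, conf=CONF):
    """Economic capital proxy: the loss quantile at `conf` minus the mean loss."""
    ordered = sorted(losses)
    quantile = ordered[int(conf * len(ordered)) - 1]
    return quantile - sum(losses) / len(losses)

for label, sample in [("normal tail", normal_losses(N)),
                      ("lognormal tail", lognormal_losses(N))]:
    print(f"{label:15s} economic capital: {economic_capital(sample):8.1f}")
```

Same first two moments, materially different capital number. The calculation is precise, but the precision is conditional on a tail assumption that the data will rarely be able to settle, which is the illusion of knowledge referred to above.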

The ECB’s economic perspective does not necessarily capture radical uncertainty

I have touched on the general topic of uncertainty and what it means for the ICAAP a couple of times in this blog. The ECB report mentions “uncertainty” twice; once in the context of assessing climate change risk

Given the uncertainty surrounding the timing of climate change and its negative consequences, as well as the potentially far-reaching impact in breadth and magnitude along several transmission channels via which climate-related risks may impact banks’ capital adequacy, it is rather concerning that almost one-third of the banks has not even considered these risks in their risk identification processes at all.

Page 39

… and then in the context of making allowances for data quality

However, … in an internal deep dive on risk quantification in 2019, half of the risk quantifications showed material deficiencies. This finding is exacerbated by the data quality issues generally observed and moreover by the fact that one-half of the banks does not systematically ensure that the uncertainty surrounding the accuracy of risk quantifications (model risk) is appropriately addressed by an increased level of conservatism. 

Page 54

There is no question that banks should be expected to demonstrate that they are thinking about climate change and making allowances for model risk, along with a host of other plausible sources of adverse outcomes. It is a surprise that any relatively large and sophisticated bank might be found wanting in the way these risks are being assessed, and the ECB is right to call them out.

However, it is equally surprising (for me at least) that the ECB did not seem to see value in systematically exploring the extent to which the ICAAPs of the banks it supervises deal with the potential for radical uncertainty.

Business continuity is far more likely if banks can also demonstrate that they recognise the limits of what they can know about the future and actively plan to deal with being surprised by the unexpected. In short one of the key ICAAP practices I would be looking for is evidence that banks have explicitly made allowances for the potential for their capital plan to have to navigate and absorb “unknown unknowns”.

For what it is worth, my template for how a bank might make explicit allowances in the ICAAP for unknown unknowns is included in this post on the construction and calibration of cyclical capital buffers. My posts on the broader issue of risk versus uncertainty can be found on the following links:

Feel free to let me know what I am missing …

Tony – From the Outside

Worth reading – “Radical Uncertainty: Decision-Making for an Unknowable Future” by John Kay and Mervyn King

I have covered some of the ideas in the book in previous posts (here and here) but have now had the chance to read the book in full and can recommend it. I have included more detailed notes on the book here but this post offers a short introduction to some of the key ideas.

Kay and King cover a lot of ground but, simply put, their book is about

“… how real people make choices in a radically uncertain world, in which probabilities cannot meaningfully be attached to alternative futures.” 

One of the things that makes the book interesting is that they were once true believers in decision making models based on rational economic agents seeking to maximise or optimise expected value.

As students and academics we pursued the traditional approach of trying to understand economic behaviour through the assumption that households, businesses, and indeed governments take actions in order to optimise outcomes. We learnt to approach economic problems by asking what rational individuals were maximising. Businesses were maximising shareholder value, policy-makers were trying to maximise social welfare, and households were maximising their happiness or ‘utility’. And if businesses were not maximising shareholder value, we inferred that they must be maximising something else – their growth, or the remuneration of their senior executives.

The limits on their ability to optimise were represented by constraints: the relationship between inputs and outputs in the case of businesses, the feasibility of different policies in the case of governments, and budget constraints in the case of households. This ‘optimising’ description of behaviour was well suited to the growing use of mathematical techniques in the social sciences. If the problems facing businesses, governments and families could be expressed in terms of well-defined models, then behaviour could be predicted by evaluating the ‘optimal’ solution to those problems.

Kay and King are not saying that these models are useless. They continue to see some value in the utility maximisation model but have come to believe that it is not the complete answer that many economists, finance academics and politicians took it to be.

Although much can be learnt by thinking in this way, our own practical experience was that none of these economic actors were trying to maximise anything at all. This was not because they were stupid, although sometimes they were, nor because they were irrational, although sometimes they were. It was because an injunction to maximise shareholder value, or social welfare, or household utility, is not a coherent guide to action.

They argue that the approach works up to a point but fails to deal with decisions that are in the domain of radical uncertainty

But we show in this book that the axiomatic approach to the definition of rationality comprehensively fails when applied to decisions made by businesses, governments or households about an uncertain future. And this failure is not because these economic actors are irrational, but because they are rational, and – mostly – do not pretend to knowledge they do not and could not have. Frequently they do not know what is going to happen and cannot successfully describe the range of things that might happen, far less know the relative likelihood of a variety of different possible events.

There are many factors that explain the current state of affairs but a key inflexion point in Kay and King’s account can be found in what they label “A Forgotten Dispute” (Chapter 5) between Frank Knight and John Maynard Keynes on one side and Frank Ramsey and Bruno de Finetti on the other, regarding the distinction between risk and uncertainty. Knight and Keynes argued that probability is an objective concept confined to problems with a defined and knowable frequency distribution. Ramsey argued that “subjective probability” is equally valid and that the mathematics developed for the analysis of frequency-based probabilities could be applied to these subjective probabilities.

“Economists (used to) distinguish risk, by which they meant unknowns which could be described with probabilities, from uncertainty, which could not….. over the last century economists have attempted to elide that historic distinction between risk and uncertainty, and to apply probabilities to every instance of our imperfect knowledge of the future.”

Keynes and Knight lost the debate

Ramsey and de Finetti won, and Keynes and Knight lost, that historic battle of ideas over the nature of uncertainty. The result was that the concept of radical uncertainty virtually disappeared from the mainstream of economics for more than half a century. The use of subjective probabilities, and the associated mathematics, seemed to turn the mysteries of radical uncertainty into puzzles with calculable solutions. 

Ramsey and de Finetti laid the foundations for economists to expand the application of probability based thinking and decision making. Milton Friedman picked up the baton and ran with it.

There is a lot more to the book than interesting historical anecdotes on the history of economic ideas. The subject matter is rich and it crosses over topics covered previously in this blog including:

There are also overlaps with a book by Richard Bookstaber titled “The End of Theory: Financial Crises, the Failure of Economics, and the Sweep of Human Interaction”. I am yet to review this book but have some detailed notes here.

One quibble with the book is that I think their critique of the Bayesian method is a bit harsh. I understand their concern to push back on the idea that Bayes solves the problem of using probability to understand uncertainty. At times however it reads like Bayes has no value at all. Read “The Theory that Would Not Die: How Bayes’ Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant from Two Centuries of Controversy” by Sharon Bertsch McGrayne for an alternative perspective.
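For readers who have not met the rule being debated, it can be stated in one line; the worked numbers below are my own illustrative example of the kind of well-defined “puzzle” where it earns its keep.

```latex
% Bayes' rule: updating a prior belief P(H) on observing evidence E
P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)}

% Illustrative (hypothetical) numbers: P(H)=0.01, P(E \mid H)=0.9, P(E \mid \neg H)=0.05
P(H \mid E) = \frac{0.9 \times 0.01}{0.9 \times 0.01 + 0.05 \times 0.99} \approx 0.15
```

The calculation works because every term is well defined. Kay and King’s complaint is about settings where the likelihoods, and even the list of hypotheses, cannot be meaningfully specified.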

Bayes may not help with mysteries but its application in puzzles should not be undervalued. I don’t entirely agree with their perspective on behavioural finance either.

I want to come back to the topics of risk and uncertainty in a future post but it will take time to process all of the overlapping pieces. In the interim, I hope you found the overview above useful.

Tony (From the Outside)

Probabilities disguising uncertainty

In this situation, what you started getting was probabilities that disguised uncertainty as opposed to actually providing you with more useful information.

Barack Obama commenting on making the decision whether to attack a target which evidence suggested could be Osama Bin Laden

This quote is drawn from an article that John Kay published on his website under the title “The point of probabilities”. The point he is making is similar to one touched on in a Bank Underground post that I discussed in a recent post on my blog. Kay’s article is short and worth reading.

Tony

Distinguishing luck and skill

Quantifying Luck’s Role in the Success Equation

“… we vastly underestimate the role of luck in what we see happening around us”

This post is inspired by a recent read of Michael Mauboussin’s book “The Success Equation: Untangling Skill and Luck in Business, Sports and Investing”. Mauboussin focuses on the fact that much of what we experience is a combination of skill and luck but we tend to be quite bad at distinguishing the two. It may not unlock the secret to success but, if you want to get better at untangling the contributions that skill and luck play in predicting or managing future outcomes, then this book still has much to offer.

“The argument here is not that you can precisely measure the contributions of skill and luck to any success or failure. But if you take concrete steps toward attempting to measure those relative contributions, you will make better decisions than people who think improperly about those issues or who don’t think about them at all.”

Structure wise, Mauboussin:

  • Starts with the conceptual foundations for thinking about the problem of distinguishing skill and luck,
  • Explores the analytical tools we can use to figure out the extent to which luck contributes to our achievements, successes and failures,
  • Finishes with some concrete suggestions about how to put the conceptual foundations and analytical tools to work in dealing with luck in decisions.

Conceptual foundations

It is always good to start by defining your terms; Mauboussin defines luck and skill as follows:

“Luck is a chance occurrence that affects a person or a group.. [and] can be good or bad [it] is out of one’s control and unpredictable”

Skill is defined as the “ability to use one’s knowledge effectively and readily in execution or performance.”

Applying the process that Mauboussin proposes requires that we first roughly distinguish where a specific activity or prediction fits on the continuum bookended by skill and luck. Mauboussin also clarifies that:

  • Luck and randomness are related but not the same: He distinguishes luck as operating at the level of the individual or small group while randomness operates at the level of the system where more persistent and reliable statistical patterns can be observed.
  • Expertise does not necessarily accumulate with experience: It is often assumed that doing something for a long time is sufficient to be an expert but Mauboussin argues that in activities that depend on skill, real expertise only comes about via deliberate practice based on improving performance in response to feedback on the ways in which the input generates the predicted outcome.

Mauboussin is not necessarily introducing anything new in his analysis of why we tend to be bad at distinguishing skill and luck. The fact that people tend to struggle with statistics is well-known. The value for me in this book lies largely in his discussion of the psychological dimension of the problem, which he highlights as exerting the most profound influence. The quote below captures an important insight that I wish I had understood forty years ago.

“The mechanisms that our minds use to make sense of the world are not well suited to accounting for the relative roles that skill and luck play in the events we see taking shape around us.”

The role of ideas, beliefs and narratives is a recurring theme in Mauboussin’s analysis of the problem of distinguishing skill and luck. Mauboussin notes that people seem to be pre-programmed to want to fit events into a narrative based on cause and effect. The fact that things sometimes just happen for no reason is not a satisfying narrative. We are particularly susceptible to attributing successful outcomes to skill, preferably our own, but we seem to be willing to extend the same presumption to other individuals who have been successful in an endeavour. It is a good story and we love stories so we suppress other explanations and come to see what happened as inevitable.

Some of the evidence we use to create these narratives will be drawn from what happened in specific examples of the activity, while we may also have access to data averaged over a larger sample of similar events. Irrespective, we seem to be predisposed to weigh the specific evidence more heavily in our intuitive judgement than we do the base rate averaged over many events (most likely based on statistics we don’t really understand). That said, statistical evidence can still be “useful” if it “proves” something we already believe; we seem to have an intuitive bias to seek evidence that supports what we believe. Not only do we fail to look for evidence that disproves our narrative, we tend to actively suppress any contrary evidence we encounter.

Analytical tools for navigating the skill luck continuum

We need tools and processes to help manage the tendency for our intuitive judgements to lead us astray and to avoid being misled by arguments that fall into the same trap or, worse, deliberately exploit these known weaknesses in our decision-making process.

One process proposed by Mauboussin for distinguishing skill from luck is to:

  • First form a generic judgement on what the expected accuracy of our prediction is likely to be (i.e. make a judgement on where the activity sits on the skill-luck continuum)
  • Next look at the available empirical or anecdotal evidence, distinguishing between the base rate for this type of activity (if it exists) and any specific evidence to hand
  • Then employ the following rule (a minimal sketch of this weighting appears after the list):
    • if the expected accuracy of the prediction is low (i.e. luck is likely to be a significant factor), you should place most of the weight on the base rate
    • if the expected accuracy is high (i.e. there is evidence that skill plays the prime role in determining the outcome of what you are attempting to predict), you can rely more on the specific case.
  • use the data to test if the activity conforms to your original judgement of how skill and luck combine to generate the outcomes
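One common way to operationalise that weighting rule (this is my own sketch, not code from the book) is to shrink the case-specific evidence toward the base rate, with the weight on the specific evidence reflecting your judgement of where the activity sits on the skill-luck continuum.

```python
def blended_estimate(specific_evidence: float,
                     base_rate: float,
                     skill_weight: float) -> float:
    """Blend case-specific evidence with the base rate.

    skill_weight is a judgement between 0 and 1:
      close to 0 -> mostly luck, so lean on the base rate;
      close to 1 -> mostly skill, so lean on the specific case.
    """
    if not 0.0 <= skill_weight <= 1.0:
        raise ValueError("skill_weight must be between 0 and 1")
    return base_rate + skill_weight * (specific_evidence - base_rate)

# Illustrative example: a fund manager beat the market by 6% this year
# (specific evidence) against a long-run base rate of roughly 0% excess
# return. Judging one-year results to be mostly luck (weight 0.2), the
# blended estimate of underlying skill sits much closer to the base rate.
print(blended_estimate(specific_evidence=6.0, base_rate=0.0, skill_weight=0.2))  # 1.2
```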

Figuring out where the activity sits on the skill-luck continuum is the critical first step and Mauboussin offers three methods for undertaking this part of the process: 1) The “Three Question” approach, 2) Simulation and 3) True Score Theory. I will focus here on the first method which involves

  1. First ask if you can easily assign a cause to the effect you are seeking to predict. In some instances the relationship will be relatively stable and linear (and hence relatively easy to predict) whereas the results of other activities are shaped by complex dependencies such as cumulative advantage and social preference. Skill can play a part in both activities but luck is likely to be a more significant factor in the latter group.
  2. Determining the rate of reversion to the mean: Slow reversion is consistent with activities dominated by skill, while rapid reversion comes from luck being the more dominant influence. Note however that complex activities where cumulative advantage and social preference shape the outcome may not have a well-defined mean to revert to. The distribution of outcomes for these activities frequently conform to a power law (i.e. there are lots of small values and relatively few large values).
  3. Is there evidence that expert prediction is useful? When experts have wide disagreement and predict poorly, that is evidence that luck is a prime factor shaping outcomes.

One of the challenges with this process is figuring out how large a sample size you need to determine whether there is a reliable relationship between actions and outcomes that evidences skill. Another problem is that a reliable base rate may not always be available. That may be because the data has simply not been collected, or because a reliable base rate does not exist for the activity in question.

The absence of a reliable base rate to guide decisions is a feature of activities that do not have simple linear relationships between cause and effect. These activities also tend to fall into Nassim Taleb’s “black swan” domain. The fundamental lesson in this domain of decision making is to be aware of the risks associated with naively applying statistical probability based methods to the problem. Paul Wilmott and David Orrell use the idea of a “zone of validity” to make the same point in “The Money Formula”.

The need to understand power laws and the mechanisms that generate them also stands out in Mauboussin’s discussion of untangling skill and luck.

The presence of a power law depends in part on whether events are dependent on, or independent of, one another. In dependent systems, initial conditions matter and come to matter more and more as time goes on. The final outcomes are (sometimes surprisingly) sensitive to both minor variations in the initial conditions and to the path taken over time. Mauboussin notes that a number of mechanisms are responsible for this phenomenon, including preferential attachment, critical points and phase transitions.

“In some realms, independence and bell-shaped distributions of luck can explain much of what we see. But in activities such as the entertainment industry, success depends on social interaction. Whenever people can judge the quality of an item by several different criteria and are allowed to influence one another’s choices, luck will play a huge role in determining success or failure.”

“For example, if one song happens to be slightly more popular than another at just the right time, it will tend to become even more popular as people influence one another. Because of that effect, known as cumulative advantage, two songs of equal quality, or skill, will sell in substantially different numbers. …  skill does play a role in success and failure, but it can be overwhelmed by the influence of luck. In the jar model, the range of numbers in the luck jar is vastly greater than the range of numbers in the skill jar.”

“The process of social influence and cumulative advantage frequently generates a distribution that is best described by a power law.”

“The term power law comes from the fact that an exponent (or power) determines the slope of the line. One of the key features of distributions that follow a power law is that there are very few large values and lots of small values. As a result, the idea of an “average” has no meaning.”
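In symbols, the idea in that last quote is simply that the tail of the distribution falls away as a power of the variable rather than exponentially, so on log-log axes the exponent is the slope of a straight line.

```latex
% Pareto-type power law tail with exponent \alpha > 0 and scale x_{\min}
P(X > x) = \left(\frac{x_{\min}}{x}\right)^{\alpha}, \qquad x \ge x_{\min}

% Taking logs gives a straight line with slope -\alpha
\log P(X > x) = \alpha \log x_{\min} - \alpha \log x
```

When the exponent is 1 or below the mean is undefined, and when it is 2 or below the variance is infinite, which is the formal version of the “average has no meaning” point in the quote.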

Mauboussin’s discussion of power laws does not offer this specific example but the idea that the average is meaningless is also true of loan losses when you are trying to measure expected loss over a full loan loss cycle. What we tend to observe is lots of relatively small values when economic conditions are benign and a few very large losses when the cycle turns down, probably amplified by endogenous factors embedded in bank balance sheets or business models. This has interesting and important implications for the concept of Expected Loss which is a fundamental component of the advanced Internal Rating Based approach to bank capital adequacy measurement.
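A small simulation (my own illustration, with made-up parameters) shows why the benign-period average is a poor guide to the through-the-cycle expected loss.

```python
import random

random.seed(7)

def annual_loss_rate() -> float:
    """One year's loan loss rate (% of the portfolio), illustrative only:
    most years are benign, but roughly one year in ten is a downturn with
    losses an order of magnitude higher."""
    if random.random() < 0.10:            # downturn year
        return random.uniform(1.5, 4.0)
    return random.uniform(0.05, 0.30)     # benign year

history = [annual_loss_rate() for _ in range(10_000)]
benign_only = [x for x in history if x < 1.0]

print(f"average over benign years only : {sum(benign_only) / len(benign_only):.2f}%")
print(f"through-the-cycle average      : {sum(history) / len(history):.2f}%")

# A short observation window is worse still: five-year averages swing widely
# depending on whether a downturn happened to fall inside the window.
five_year_averages = [sum(history[i:i + 5]) / 5 for i in range(0, 50, 5)]
print("five-year averages:", [round(x, 2) for x in five_year_averages])
```

The specific numbers are invented; the point is that the “expected” loss you measure depends heavily on how many downturns your observation window happens to contain.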

Mauboussin concludes with a list of ten suggestions for untangling and navigating the divide between luck and skill:

  1. Understand where you are on the luck skill continuum
  2. Assess sample size, significance and swans
  3. Always consider a null hypothesis – is there some evidence that proves that my base belief is wrong?
  4. Think carefully about feedback and rewards; High quality feedback is key to high performance. Where skill is more important, then deliberate practice is essential to improving performance. Where luck plays a strong role, the focus must be on process
  5. Make use of counterfactuals; To maintain an open mind about the future, it is very useful to keep an open mind about the past. History is a narrative of cause and effect but it is useful to reflect on how outcomes might have been different.
  6. Develop aids to guide and improve your skill; On the luck side of the continuum, skill is still relevant but luck makes the outcomes more probabilistic. So the focus must be on good process – especially one that takes account of behavioural biases. In the middle of the spectrum, the procedural is combined with the novel. Checklists can be useful here – especially when decisions must be made under stress. Where skill matters, the key is deliberate practice and being open to feedback
  7. Have a plan for strategic interactions. Where your opponent is more skilful or just stronger, then try to inject more luck into the interaction
  8. Make reversion to the mean work for you; Understand why reversion to the mean happens, to what degree it happens, what exactly the mean is. Note that extreme events are unlikely to be repeated and most importantly, recognise that the rate of reversion to the mean relates to the coefficient of correlation
  9. Develop useful statistics (i.e. stats that are persistent and predictive)
  10. Know your limitations; we can do better at untangling skill and luck but also must recognise how much we don’t know. We must recognise that the realm may change such that old rules don’t apply and there are places where statistics don’t apply

All in all, I found Mauboussin’s book very rewarding and can recommend it highly. Hopefully the above post does the book justice. I have also made some more detailed notes on the book here.

Tony

The rise of the normal distribution

“We were all Gaussians now”

This post focuses on a joint paper written in 2012 by Andrew Haldane and Benjamin Nelson titled “Tails of the unexpected”. The topic is the normal distribution which is obviously a bit technical but the paper is still readable even if you are not deeply versed in statistics and financial modelling. The condensed quote below captures the central idea I took away from the paper.

“For almost a century, the world of economics and finance has been dominated by randomness … But as Nassim Taleb reminded us, it is possible to be Fooled by Randomness (Taleb (2001)). For Taleb, the origin of this mistake was the ubiquity in economics and finance of a particular way of describing the distribution of possible real world outcomes. For non-nerds, this distribution is often called the bell-curve. For nerds, it is the normal distribution. For nerds who like to show-off, the distribution is Gaussian.”

The idea that the normal distribution should be used with care, and sometimes not at all, when seeking to analyse economic and financial systems is not news. The paper’s discussion of why this is so is useful if you have not considered the issues before but probably does not offer much new insight if you have.

What I found most interesting was the back story behind the development of the normal distribution. In particular, the factors that Haldane and Nelson believe help explain why it came to be so widely used and misused. Reading the history reminds us of what a cool idea it must have been when it was first discovered and developed.

“By simply taking repeat samplings, the workings of an uncertain and mysterious world could seemingly be uncovered.”

“To scientists seeking to explain the world, the attraction of the normal curve was obvious. It provided a statistical map of a physical world which otherwise appeared un-navigable. It suggested regularities in random real-world data. Moreover, these patterns could be fully described by two simple metrics – mean and variance. A statistical window on the world had been opened.”

Haldane and Nelson highlight a semantic shift in the 1870s when the term “normal” began to be independently applied to this statistical distribution. They argue that adopting this label helped embed the idea that the “normal distribution” was the “usual” outcome that one should expect to observe.

“In the 18th century, normality had been formalised. In the 19th century, it was socialised.”

“Up until the late 19th century, no statistical tests of normality had been developed. Having become an article of faith, it was deemed inappropriate to question the faith. As Hacking put it, ‘thanks to superstition, laziness, equivocation, befuddlement with tables of numbers, dreams of social control, and propaganda from utilitarians, the law of large numbers became a synthetic a priori truth. We were all Gaussians now.’”

Notwithstanding its widespread use today, in Haldane and Nelson’s account economics and finance were not early adopters of the statistical approach to analysis but eventually became enthusiastic converts. The influence of physics on the analytical approaches employed in economics is widely recognised and Haldane and Nelson cite the rise of probability-based quantum physics over old-school deterministic Newtonian physics as one of the factors that prompted economists to embrace probability and the normal distribution as a key tool.

“… in the early part of the 20th century, physics was in the throes of its own intellectual revolution. The emergence of quantum physics suggested that even simple systems had an irreducible random element. In physical systems, Classical determinism was steadily replaced by statistical laws. The natural world was suddenly ruled by randomness.”

“Economics followed in these footsteps, shifting from models of Classical determinism to statistical laws.”

“Whether by accident or design, finance theorists and practitioners had by the end of the 20th century evolved into fully paid-up members of the Gaussian sect.”

Assessing the Evidence

Having outlined the story behind its development and increasingly widespread use, Haldane and Nelson then turn to the weight of evidence suggesting that normality is not a good statistical description of real-world behaviour. In its place, natural and social scientists have often unearthed behaviour consistent with an alternative distribution, the so-called power law distribution.

“In consequence, Laplace’s central limit theorem may not apply to power law-distributed variables. There can be no ‘regression to the mean’ if the mean is ill-defined and the variance unbounded. Indeed, means and variances may then tell us rather little about the statistical future. As a window on the world, they are broken.”

This section of the paper probably does not introduce anything new to people who have spent any time looking at financial models. It does however raise some interesting questions. For example, whether bank loan losses are better described by a power law and, if so, what this means for the measures of expected loss that are employed in banking and prudential capital requirements; i.e. how should banks and regulators respond if “… the means and variances … tell us rather little about the statistical future”? This is particularly relevant as banks transition to Expected Loss accounting for loan losses.

We can of course estimate the mean loss under the benign part of the credit cycle but it is much harder to estimate a “through the cycle” average (or “expected” loss) because the frequency, duration and severity of the cycle downturn are hard to pin down with any precision. We can use historical evidence to get a sense of the problem; we can for example talk about moderate downturns every 7-10 years, more severe recessions every 25-30 years and a 75 year cycle for financial crises. However the data is obviously sparse so it does not allow the kind of precision that is part and parcel of normally distributed events.
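To see how differently the two families of distribution treat extremes, the snippet below (my own illustration, not taken from the paper) compares what happens to tail probabilities as you push further out: under the normal curve an event that is twice as extreme becomes astronomically rarer, while under a power law it is only modestly rarer.

```python
import math

def normal_tail(z: float) -> float:
    """P(Z > z) for a standard normal variable."""
    return 0.5 * math.erfc(z / math.sqrt(2))

def pareto_tail(x: float, x_min: float = 1.0, alpha: float = 3.0) -> float:
    """P(X > x) for a Pareto-type power law with exponent alpha."""
    return (x_min / x) ** alpha

# Normal: going from a 3-sigma to a 6-sigma event
print(normal_tail(3.0))     # ~1.3e-03
print(normal_tail(6.0))     # ~1.0e-09, roughly a million times rarer

# Power law (alpha = 3): doubling the size of the event
print(pareto_tail(10.0))    # 1.0e-03
print(pareto_tail(20.0))    # 1.25e-04, only 8 times rarer
```

Under the normal curve very large deviations are effectively impossible; under a power law they are merely uncommon, which is the practical content of the fat tails discussion and of the questions about loan losses above.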

Explaining Fat Tails

The paper identifies the following drivers behind non-normal outcomes:

  • Non-linear dynamics
  • Self organised criticality
  • Preferential attachment
  • Highly optimised tolerance

The account of why systems do not conform to the normal distribution does not offer much new but I found reading it useful for reflecting on the practical implications. One of the items they called out is competition which is typically assumed by economists to be a wholly benign force. This is generally true but Haldane and Nelson note the capacity for competition to contribute to self-organised criticality.

Competition in finance and banking can of course lead to beneficial innovation and efficiency gains but it can also contribute to progressively increased risk taking (e.g. more lax lending standards, lower margins for tail risk) thereby setting the system up to be prone to a self organised critical state. Risk based capital requirements can also contribute to self organised criticality to the extent they facilitate increased leverage and create incentives to take on tail risk.

Where Next?

Haldane and Nelson add their voice to the idea that Knight’s distinction between risk and uncertainty is a good foundation for developing better ways of dealing with a world that does not conform to the normal distribution, and they note the distinguished company of those who have also chosen to emphasise the importance of uncertainty and the limitations of risk.
“Many of the biggest intellectual figures in 20th century economics took this distinction seriously. Indeed, they placed uncertainty centre-stage in their policy prescriptions. Keynes in the 1930s, Hayek in the 1950s and Friedman in the 1960s all emphasised the role of uncertainty, as distinct from risk, when it came to understanding economic systems. Hayek criticised economics in general, and economic policymakers in particular, for labouring under a ‘pretence of knowledge’.”

Assuming that the uncertainty paradigm was embraced, Haldane and Nelson consider what the practical implications would be. They have a number of proposals but I will focus on these three:

  • agent based modelling
  • simple rather than complex
  • don’t aim to smooth out all volatility

Agent based modelling

Haldane and Nelson note that …

“In response to the crisis, there has been a groundswell of recent interest in modelling economic and financial systems as complex, adaptive networks. For many years, work on agent-based modelling and complex systems has been a niche part of the economics and finance profession. The crisis has given these models a new lease of life in helping explain the discontinuities evident over recent years (for example, Kirman (2011), Haldane and May (2011)).”

In these frameworks, many of the core features of existing models need to be abandoned.

  • The “representative agents” conforming to simple economic laws are replaced by more complex interactions among a larger range of agents
  • The single, stationary equilibrium gives way to Lorenz-like multiple, non-stationary equilibria
  • Linear deterministic models are usurped by non-linear tipping points and phase shifts

Haldane and Nelson note that these types of models are already being employed by physicists, sociologists, ecologists and the like. Since the paper was written (2012) we have seen some evidence that economists are experimenting with “agent based modelling”. A paper by Richard Bookstaber offers a useful outline of his efforts to apply these models and he has also written a book (“The End of Theory”) promoting this path. There is also a Bank of England paper on ABM worth looking at.

I think there is a lot of value in agent based modelling but a few things impede its wider use. One is that these models don’t offer the kinds of precision that make the DSGE and VaR models so attractive. The other is that they require a large investment of time to build and most practitioners are fully committed just to keeping the existing models going. Finding the budget to pioneer an alternative path is not easy. These are not great arguments in defence of the status quo but they do reflect certain realities of the world in which people work.
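For readers who have not seen one, the toy below is entirely my own construction and far simpler than any of the models cited, but it gives the flavour of the approach: heterogeneous agents reacting to one another rather than a single representative agent solving for equilibrium, with system-wide cascades emerging, or not, from purely local rules.

```python
import random

random.seed(1)

N_AGENTS = 1_000
N_PERIODS = 15

def run(initial_shock: float) -> list[float]:
    """Fraction of agents selling each period, given an initial shock.

    Each agent sells once the fraction it saw selling last period exceeds
    its personal threshold; thresholds differ across agents (a handful of
    near-zero thresholds act like permanent pessimists).
    """
    thresholds = [max(0.0, random.gauss(0.35, 0.15)) for _ in range(N_AGENTS)]
    fraction = initial_shock
    path = [fraction]
    for _ in range(N_PERIODS):
        fraction = sum(1 for t in thresholds if fraction > t) / N_AGENTS
        path.append(fraction)
    return path

# A small shock is absorbed; a slightly larger one tips the whole system.
print("shock 0.10:", [round(x, 2) for x in run(0.10)])
print("shock 0.30:", [round(x, 2) for x in run(0.30)])
```

Nothing in the rules distinguishes the two scenarios except the size of the initial disturbance, yet one path settles down and the other runs to a system-wide cascade. That tipping-point behaviour is exactly what representative-agent, single-equilibrium models struggle to produce.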

Simple can be more robust than complex

Haldane and Nelson also advocate simplicity in lieu of complexity as a general rule of thumb for dealing with an uncertain world.

“The reason less can be more is that complex rules are less robust to mistakes in specification. They are inherently fragile. Harry Markowitz’s mean-variance optimal portfolio model has informed millions of investment decisions over the past 50 years – but not, interestingly, his own. In retirement, Markowitz instead used a much simpler equally-weighted asset approach. This, Markowitz believed, was a more robust way of navigating the fat-tailed uncertainties of investment returns (Benartzi and Thaler (2001)).”

I am not a big fan of the Leverage Ratio, which they cite as one example of regulators beginning to adopt simpler approaches, but the broader principle that simple is more robust than complex does ring true.

“The mainstay of regulation for the past 30 years has been more complex estimates of banks’ capital ratios. These are prone to problems of highly-optimised tolerance. In part reflecting that, regulators will in future require banks to abide by a far simpler backstop measure of the leverage ratio. Like Markowitz’s retirement portfolio, this equally-weights the assets in a bank’s portfolio. Like that portfolio, it too will hopefully be more robust to fat-tailed uncertainties.”

Structural separation is another simple approach to the problem of making the system more resilient.

“A second type of simple, yet robust, regulatory rule is to impose structural safeguards on worst-case outcomes. Technically, this goes by the name of a ‘minimax’ strategy (Hansen and Sargent (2011)). The firebreaks introduced into some physical systems can be thought to be playing just this role. They provide a fail-safe against the risk of critical states emerging in complex systems, either in a self-organised manner or because of man-made intervention. These firebreak-type approaches are beginning to find their way into the language and practice of regulation.”

And a reminder about the dangers of over-engineering:

“Finally, in an uncertain world, fine-tuned policy responses can sometimes come at a potentially considerable cost. Complex intervention rules may simply add to existing uncertainties in the system. This is in many ways an old Hayekian lesson about the pretence of knowledge, combined with an old Friedman lesson about the avoidance of policy harm. It has relevance to the (complex, fine-tuned) regulatory environment which has emerged over the past few years.”

While we can debate the precise way to achieve simplicity, the basic idea does in my view have a lot of potential to improve the management of risk in general and bank capital in particular. Complex intervention rules may simply add to existing uncertainties in the system and the current formulation of how the Capital Conservation Ratio interacts with the Capital Conservation Buffer is a case in point. These two elements of the capital adequacy framework define what percentage of a bank’s earnings must be retained if the capital adequacy ratio is under stress.

In theory the calculation should be simple and intuitive but anyone who has had to model how these rules work under a stress scenario will know how complex and unintuitive the calculation actually is. The reasons why this is so are probably a bit too much detail for today but I will try to pick this topic up in a future post.
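As a rough guide to the moving parts, the sketch below encodes the headline Basel III schedule as I understand it (actual national implementations, and the interaction with other buffers, add further detail): the minimum share of earnings that must be conserved steps up as the CET1 ratio falls through the quartiles of the 2.5% conservation buffer that sits above the 4.5% minimum.

```python
def minimum_conservation_ratio(cet1_ratio: float) -> float:
    """Headline Basel III capital conservation schedule (illustrative sketch).

    Returns the minimum share of earnings that must be retained, based on
    which quartile of the 2.5% conservation buffer (above the 4.5% CET1
    minimum) the bank's ratio falls into. Real rulebooks layer on D-SIB,
    G-SIB and countercyclical buffers, which shift these thresholds.
    """
    if cet1_ratio > 0.07:
        return 0.00      # buffer intact: no constraint on distributions
    if cet1_ratio > 0.06375:
        return 0.40
    if cet1_ratio > 0.0575:
        return 0.60
    if cet1_ratio > 0.05125:
        return 0.80
    return 1.00          # first quartile (or below the buffer): retain everything

for ratio in (0.080, 0.068, 0.060, 0.052, 0.045):
    print(f"CET1 {ratio:.2%} -> retain at least "
          f"{minimum_conservation_ratio(ratio):.0%} of earnings")
```

The lookup itself is simple; the difficulty referred to above arguably arises when you project it forward under stress, where earnings, distributions, risk weighted assets and the buffer thresholds all move together period by period.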

Don’t aim to eliminate volatility

Systems which are adapted to volatility will tend to be stronger than systems that are sheltered from it, or in the words of Haldane and Nelson …

“And the argument can be taken one step further. Attempts to fine-tune risk control may add to the probability of fat-tailed catastrophes. Constraining small bumps in the road may make a system, in particular a social system, more prone to systemic collapse. Why? Because if instead of being released in small bursts pressures are constrained and accumulate beneath the surface, they risk an eventual volcanic eruption.”

I am a big fan of this idea. Nassim Taleb makes a similar argument in his book “Antifragile” as does Greg Ip in “Foolproof”. It also reflects Nietzsche’s somewhat more poetic dictum “that which does not kill us makes us stronger”.

In conclusion

If you have read this far then thank you. I hope you found it useful and interesting. If you want to delve deeper then you can find my more detailed summary and comments on the paper here. If you think I have any of the above wrong then please let me know.

“The End of Alchemy” by Mervyn King

Anyone interested in the conceptual foundations of money and banking will I think find this book interesting. King argues that the significant enhancements to capital and liquidity requirements implemented since the GFC are not sufficient because of what he deems to be fundamental design flaws in the modern system of money and banking.

King is concerned with the process by which bank lending creates money in the form of bank deposits and with the process of maturity transformation in banking under which long term, illiquid assets are funded to varying degrees by short term liabilities including deposits. King applies the term “alchemy” to these processes to convey the sense that the value created is not real on a risk adjusted basis.

He concedes that there will be a price to pay in foregoing the “efficiency benefits of financial intermediation” but argues that these benefits come at the cost of a system that:

  • is inherently prone to banking crises because, even post Basel III, it is supported by too little equity and too little liquidity, and
  • can only be sustained in the long run by the willingness of the official sector to provide Lender of Last Resort liquidity support.

King’s radical solution is that all deposits must be 100% backed by liquid reserves which would be limited to safe assets such as government securities or reserves held with the central bank. King argues that this removes the risk/incentive for bank runs and, for those with an interest in economic history, he acknowledges that this idea originated with “many of the most distinguished economists of the first half of the twentieth century” who proposed an end to fractional reserve banking under a proposal that was known as the “Chicago Plan”. Since deposits are backed by safe assets, it follows that all other assets (i.e. loans to the private sector) must be financed by equity or long term debt.

The intended result is to separate

  • safe, liquid “narrow” banks issuing deposits and carrying out payment services
  • from risky, illiquid “wide” banks performing all other activities.

At this point, King notes that the government could in theory simply stand back and allow the risk of unexpected events to impact the value of the equity and liabilities of the banks but he does not advocate this. This is partly because volatility of this nature can undermine consumer confidence but also because banks may be forced to reduce their lending in ways that have a negative impact on economic activity. So some form of central bank liquidity support remains necessary.

King’s proposed approach to central bank liquidity support is what he colloquially refers to as a “pawnbroker for all seasons”, under which the central bank agrees up front how much it will lend each bank against the collateral the bank can offer.

King argues that

“almost all existing prudential capital and liquidity regulation, other than a limit on leverage, could be replaced by this one simple rule”.

which “… would act as a form of mandatory insurance so that in the event of a crisis a central bank would be free to lend on terms already agreed and without the necessity of a penalty rate on its loans. The penalty, or price of the insurance, would be encapsulated by the haircuts required by the central bank on different forms of collateral”

leaving banks “… free to decide on the composition of their assets and liabilities… all subject to the constraint that alchemy in the private sector is eliminated”
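A stylised sketch of how such a rule might be checked, using illustrative numbers and simplifications of my own rather than King’s worked example: the central bank agrees haircuts on each asset class in advance, and the resulting lending capacity plus actual reserves must cover the bank’s runnable liabilities.

```python
# Illustrative balance sheet (all figures hypothetical, in $bn)
assets = {
    "central_bank_reserves": 30,    # already liquid: no haircut
    "government_bonds":      80,
    "mortgages":            250,
    "corporate_loans":      140,
}

# Haircuts agreed with the central bank in advance (illustrative values)
haircuts = {
    "central_bank_reserves": 0.00,
    "government_bonds":      0.05,
    "mortgages":             0.30,
    "corporate_loans":       0.45,
}

runnable_liabilities = 300   # deposits and other short-term funding

# "Effective liquid assets": cash plus what the central bank has agreed to
# lend against pre-positioned collateral after haircuts.
effective_liquid_assets = sum(value * (1 - haircuts[name])
                              for name, value in assets.items())

print(f"effective liquid assets : {effective_liquid_assets:.0f}")
print(f"runnable liabilities    : {runnable_liabilities:.0f}")
print("rule satisfied" if effective_liquid_assets >= runnable_liabilities
      else "rule breached: hold more liquid assets or less short-term funding")
```

On this reading, the “penalty” King refers to sits in the haircuts: the less liquid the asset, the less central bank money it can be turned into when trouble arrives.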

Underpinning King’s thesis are four concepts that appear repeatedly

  • Disequilibrium; King explores ways in which economic disequilibrium repeatedly builds up followed by disruptive change as the economy rebalances
  • Radical uncertainty; this is the term he applies to Knight’s concept of uncertainty as distinct from risk. He uses this to argue that any risk based approach to capital adequacy is not built on sound foundations because it will not capture the uncertain dimension of unexpected loss that we should be really concerned with
  • The “prisoner’s dilemma” to illustrate the difficulty of achieving the best outcome when there are obstacles to cooperation
  • Trust; he sees trust as the key ingredient that makes a market economy work but also highlights how fragile that trust can be.

My thoughts on King’s observations and arguments

Given that King headed the Bank of England during the GFC, and was directly involved in the revised capital and liquidity rules (Basel III) that were created in response, his opinions should be taken seriously. It is particularly interesting that, notwithstanding his role in the creation of Basel III, he argues that a much more radical solution is required.

I think King is right in pointing out that the banking system ultimately relies on trust and that this reliance in part explains why the system is fragile. Trust can and does disappear, sometimes for valid reasons but sometimes because fear simply takes over even when there is no real foundation for doubting the solvency of the banking system. I think he is also correct in pointing out that a banking system based on maturity transformation is inherently illiquid and the only way to achieve 100% certainty of liquidity is to have one class of safe, liquid “narrow” banks issuing deposits and another class of risky, illiquid institution he labels “wide” banks, providing funding on a maturity-matched basis. This second class of funding institution would arguably not be a bank if we reserve that term for institutions which have the right to issue “bank deposits”.

King’s explanation of the way bank lending under the fractional reserve banking system creates money covers a very important aspect of how the modern banking and finance system operates. This is a bit technical but I think it is worth understanding because of the way it underpins and shapes so much of the operation of the economy. In particular, it challenges the conventional thinking that banks simply mobilise deposits. King explains how banks do more than just mobilise a fixed pool of deposits; the process of lending in fact creates new deposits which add to the money supply. For those interested in understanding this in more depth, the Bank of England published a short article in its Quarterly Bulletin (Q1 2014) that you can find at the following link
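The mechanics are easiest to see as double entries. The toy below is my own simplification of the mechanism described above: when the bank grants a loan it records the loan as a new asset and credits the borrower with a newly created deposit as the matching liability, so broad money rises without any pre-existing deposit being handed over.

```python
bank = {
    "assets":      {"reserves": 120, "loans": 0},
    "liabilities": {"deposits": 100, "equity": 20},
}

def money_supply(b) -> int:
    """Crude proxy for broad money: deposits held by the public at this bank."""
    return b["liabilities"]["deposits"]

def make_loan(b, amount: int) -> None:
    """Grant a loan by crediting the borrower's deposit account.

    No reserves leave the bank at this step; the loan (asset) and the new
    deposit (liability) are created together, expanding both sides of the
    balance sheet.
    """
    b["assets"]["loans"] += amount
    b["liabilities"]["deposits"] += amount

print("money supply before the loan:", money_supply(bank))   # 100
make_loan(bank, 50)
print("money supply after the loan :", money_supply(bank))   # 150
```

In practice the borrower then spends the deposit, reserves move between banks and the money created can be extinguished when the loan is repaid, but the creation step itself is just this pair of entries.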

He is also correct, I think, in highlighting the limits of what risk based capital can achieve in the face of “radical uncertainty” but I don’t buy his proposal that the leverage ratio is the solution. He claims that his “pawnbroker for all seasons” approach is different from the standardised approach to capital adequacy but I must confess I can’t see that the approaches are that different. So even if you accept his argument that internal models are not a sound basis for regulatory capital, I would still argue that a revised and well calibrated standardised approach will always be better than a leverage ratio.

King’s treatment of the “Prisoner’s Dilemma” in money and banking is particularly interesting because it sets out a conceptual rationale for why markets will not always produce optimal outcomes when there are obstacles to cooperation. This brings to mind Chuck Prince’s infamous statement about being forced to “keep dancing while the music is playing” and offers a rationale for the role of regulation in helping institutions avoid situations in which competition impedes the ability of institutions to avoid taking excessive risk. This challenges the view that market discipline would be sufficient to keep risk taking in check. It also offers a different perspective on the role of competition in banking which is sometimes seen by economists as a panacea for all ills.

I have also attached a link to a review of King’s book by Paul Krugman.