One question asked afterwards was "what can we do as far as regulations to cope with these kinds of instabilities?" As I recall, my answer was pretty lame. I tried to suggest (vaguely) that the answer probably lies not with highly complex regulations but with simpler ones, though I didn't say much more. I've been thinking about that point since, and thought it might be worth writing a few things down.
Essentially, the first lesson I think we should draw, once we acknowledge the existence of pervasive instabilities in finance, is the need to deal with persistent uncertainty. We will never understand the terrain so well that we can reduce the future to a set of known possibilities to which we can assign specific probabilities (as standard economics assumes we can). We can and should work hard to explore the space of what might happen, and so gain some forewarning of dangers, but we will still encounter surprises and we should expect to. So our approach to regulation ought to be centered on that premise -- that we face a world of uncertainty.
I should have read Andrew Haldane's wonderful essay The Dog and the Frisbee before giving my talk, but I only got around to it this morning. He makes some hugely important points on this very topic. The essay is one extended argument for why financial regulation has become too complex, and why our best hope of achieving financial stability in the future probably lies in a vast simplification of the regulatory apparatus and its system of rules.
He begins from the observation that, in many settings where decision making involves weighing up many conflicting factors, simple rules often out-perform more complex ones:
Among physicians diagnosing heart attacks, simple decision trees beat a complex model. Among detectives locating serial criminals, simple locational rules trump complex psychological profiling. Among investors picking stocks, simple passive strategies outperform complex active ones. And among shopkeepers understanding spending patterns, repeat purchase data out-predict complex models.

Haldane goes on at length to consider this in the context of financial regulation, where the legal framework has exploded in complexity over the past few decades. The pursuit of ever more complex risk models, whether by regulators or by banks and financial institutions themselves, has created an overwhelming fog of complexity that can blind us all to obvious risks:
The general message here is that the more complex the environment, the greater the perils of complex control. The optimal response to a complex environment is often not a fully state-contingent rule. Rather, it is to simplify and streamline (Gigerenzer (2010)). In complex environments, decision rules based on one, or a few, good reasons can trump sophisticated alternatives. Less may be more.
In complex environments, tallying strategies have been found to be superior to risk-weighted alternatives. Take avalanche prediction. Avalanches are difficult to predict, as they are drawn from a fat-tailed (Power Law) distribution. Yet simple tallying of a small number of avalanche indicators has been found capable of predicting over 90% of historical accidents. It has also been found to be superior to more complex decision methods (McCammon and Hägeli (2007)).
During the 1990s, the bluntness of the risk judgements embodied in Basel I came increasingly to be questioned – and arbitraged. Basel I was perceived as lacking risk-sensitivity, at least by comparison with the new wave of credit and market risk models emerging at the time. Change came in 1996 with the Market Risk Amendment. This introduced the concept of the regulatory trading book and, for the first time, allowed banks to use internal models to calculate regulatory capital against market risk. ...With hindsight, a regulatory rubicon had been crossed. This was not so much the use of risk models as the blurring of the distinction between commercial and regulatory risk judgements. The acceptance of banks’ own models meant the baton had been passed. The regulatory backstop had been lifted, replaced by a complex, commercial judgement. The Basel regime became, if not self-regulating, then self-calibrating.

Haldane ends the essay with some exploration of how we might reverse this trend and manage to simplify regulations. I won't go into detail other than to second his suggestion that reducing the complexity of the financial system itself ought to be a principal target of such simplified regulations. But even before that, the first task is to change the mindset of financial economics. That mindset remains fixated on the endless pursuit of optimal strategies through long calculations over risk-weighted alternatives, when in reality we rarely know the risks, or even the alternatives, with much accuracy.
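To make Haldane's contrast between tallying and risk-weighting a bit more concrete, here is a toy sketch of my own. The indicator names, weights and thresholds below are invented purely for illustration; the point is only that the tally requires no estimated parameters at all, which is precisely what makes it robust when past data are scarce or unrepresentative.

```python
# Hypothetical illustration of a "tallying" rule versus a risk-weighted score.
# None of the indicator names or numbers below come from Haldane's essay.

def tally_warning(indicators, threshold=3):
    """Flag danger if at least `threshold` of the binary warning signs are present."""
    return sum(bool(v) for v in indicators.values()) >= threshold

def weighted_warning(indicators, weights, cutoff):
    """The risk-weighted alternative: needs coefficients estimated from past data."""
    score = sum(weights[name] * bool(value) for name, value in indicators.items())
    return score >= cutoff

observation = {
    "rapid_credit_growth": True,
    "rising_leverage": True,
    "compressed_risk_spreads": True,
    "funding_mismatch": False,
    "asset_price_boom": False,
}

weights = {"rapid_credit_growth": 0.9, "rising_leverage": 0.7,
           "compressed_risk_spreads": 0.4, "funding_mismatch": 1.1,
           "asset_price_boom": 0.6}

print(tally_warning(observation))                          # True: 3 of 5 signs present
print(weighted_warning(observation, weights, cutoff=2.5))  # False: verdict hinges on the fitted weights
```

The weighted version can certainly do better when its coefficients are estimated from plentiful, stable data; the tally wins when they are not, which is exactly Haldane's point. His account of how the regulatory rulebook has grown in the meantime shows just how far we are from that kind of simplicity: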
The ink was barely dry on Basel II when the financial crisis struck. This exposed gaping holes in the agreement. In the period since, the response has been to fill the largest of these gaps, with large upwards revisions to the calibration of the Basel framework. Agreement on this revised framework, Basel III, was reached in 2010. In line with historical trends the documents making up Basel III added up to 616 pages, almost double Basel II. ... The length of the Basel rulebook, if anything, understates its complexity. The move to internal models, and from broad asset classes to individual loan exposures, has resulted in a ballooning in the number of estimated risk weights. For a large, complex bank, this has meant a rise in the number of calculations required from single figures a generation ago to several million today (Haldane (2011)).
Taking all of this together, the parameter space of a large bank’s banking and trading books could easily run to several millions. These parameters are typically estimated from limited past samples. For example, a typical credit risk model might comprise 20-30 years of sample data – barely a crisis cycle. A market risk model might comprise less than five years of data – far less than a crisis cycle.
Viewed over an historical sweep, this pattern is even more striking. Contrast the legislative responses in the US to the two largest financial crises of the past century – the Great Depression and the Great Recession. The single most important legislative response to the Great Depression was the Glass-Steagall Act of 1933. Indeed, this may have been the single most influential piece of financial legislation of the 20th century. Yet it ran to a mere 37 pages. The legislative response to this time’s crisis, culminating in the Dodd-Frank Act of 2010, could not have been more different. On its own, the Act runs to 848 pages – more than 20 Glass-Steagalls. That is just the starting point. For implementation, Dodd-Frank requires an additional almost 400 pieces of detailed rule-making by a variety of US regulatory agencies.
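The remark above about market risk models being fitted to less than five years of data deserves a pause. A rough simulation of my own (with an arbitrary fat-tailed distribution and arbitrary sample sizes, purely to illustrate the scale of the problem) shows how noisy a tail-risk estimate becomes on samples that short:

```python
# Rough illustration (not from the essay): how much a 99% value-at-risk estimate
# wobbles when fitted to ~5 years versus ~30 years of fat-tailed daily returns.
import numpy as np

rng = np.random.default_rng(0)

def var_99(returns):
    """Empirical 99% value-at-risk: the loss exceeded on 1% of days."""
    return -np.quantile(returns, 0.01)

def estimate_spread(n_days, trials=1000, df=3):
    """Std. deviation of the VaR estimate across independent Student-t samples."""
    estimates = [var_99(rng.standard_t(df, size=n_days)) for _ in range(trials)]
    return float(np.std(estimates))

print("~5 years of daily data :", round(estimate_spread(5 * 250), 3))
print("~30 years of daily data:", round(estimate_spread(30 * 250), 3))
```

Multiply that kind of estimation noise across the several million risk weights Haldane mentions and the apparent precision of the framework starts to look illusory.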
An important consequence of thinking in this world of known risks and optimal solutions is that we end up accepting dubious arguments that the right pricing mechanism can deliver the best of all possible worlds. Starting from bad assumptions, in other words, we fall into the trap of believing the economists' standard models, when they really have little to do with the real world. If we break free of this illusion, and face up to living in a world of genuine uncertainty, we may return to an era in which old-style regulations and prohibitions against certain activities make perfect sense, and dreams of perfect pricing mechanisms are seen for the fantasies they are:
Over the past 30 years or so, the regulatory direction of travel has been towards pricing risk in the financial system, rather than prohibiting or restricting it. In the language of Weitzman, regulators have pursued price over quantity-based regulation (Weitzman (1974)). That makes sense when optimising in a risky world.
It may make less sense when optimising in an uncertain world. Quantity-based restrictions may be more robust to mis-calibration. Simple, quantity-based restrictions are the equivalent of a regulatory commandment: “Thou shalt not”. These are likely to be less fallible than: “Thou shalt provided the internal model is correct”. That is one reason why Glass-Steagall lasted for 60 years longer than Basel II. Quantity-based regulatory solutions have gained currency during the course of the crisis. In the US, the Volcker rule is a quantity-based regulatory commandment: “Thou shalt not engage in proprietary trading”. In the UK, the Independent (“Vickers”) Commission on Banking has also proposed structural, quantity-based reforms: “Thou shalt not co-mingle retail deposit-taking and investment banking”.
Yet even these notionally simple, structural proposals run some risk of backdoor complexity. For example, the consultation document accompanying Volcker already runs to 298 pages. Were these proposals to become mired in detail, they risk sinking, like the Tower of Basel, into the swamp. This is not because these proposals go too far but because they may not go far enough. These reform efforts have too many commas, semi-colons and sub-clauses. They would benefit from a few more full stops.
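For anyone curious about the Weitzman reference, the usual statement of the prices-versus-quantities result (sketched here from memory, so consult Weitzman (1974) for the careful version) compares the welfare loss of a price rule with that of a quantity rule when marginal costs are uncertain. With B(q) the benefits and C(q) the costs of the regulated activity, and σ² the variance of the shock to marginal costs, the comparative advantage of prices over quantities is roughly

$$\Delta \;\approx\; \frac{\sigma^{2}}{2\,(C'')^{2}}\,\bigl(B'' + C''\bigr).$$

With diminishing marginal benefits (B'' < 0) and rising marginal costs (C'' > 0), quantity rules win whenever the benefit schedule is the more sharply curved of the two. The more catastrophic the consequences of letting the quantity drift too far (the steeper that schedule), the stronger the case for a blunt quantity rule, which fits the spirit of Haldane's "Thou shalt not".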