Speech at Institute of Operational Risk annual dinner 2010

Stationers’ Hall, London, 15 April 2010

Speech proposing the health of the guests by John Thirlwell

Some reflections on operational risk and the crisis

In proposing the health of our guests, I thought I’d take the opportunity to give you some reflections on fourteen years’ involvement with operational risk. I thought that might make me one of the oldest in the game but, as I’ve talked to others in the room, I find that a few of you got there before me.

Not that fourteen years is a long time in the history of operational risk. If you go back to 1345, you will find Thomas Aquinas, then a philosopher, but later a saint (which is an interesting kind of bonus to be offered), writing that ‘The world has never been more full of risk’. And of course the risks he was talking about were the same as those which caused many mediaeval bankers to go bust. They suffered from lending to defaulting sovereigns and states – sovereign risk is nothing new – but the real threats were war, the Black Death and other plagues, and famine, all of which wiped out communities and, with them, the source of the banks’ capital. You will see that they were all operational risks.

And even in my time we’ve had major operational risk events such as 9/11 and other terrorist attacks, threatened pandemics such as SARS and natural disasters like Hurricane Katrina.

And of course – the crisis. The banking crisis, because that’s what it was. I’m impressed by the way in which bankers have managed to get people to talk about the global economic crisis or the global financial crisis. No, it was a banking crisis. And banking crises will always be with us. Christopher Fildes, in one of his delightful columns in the ‘Spectator’, once wrote, ‘Bankers are remarkably adept at finding ever more ingenious ways to lose money – preferably other people’s’. The skill is for the authorities – and bankers – to make that less likely.

So, as a good operational risk manager, let me look at the causes. Because the essence of good operational risk management is about cause – not effect or event, but cause.

Of course, as you well know, the cause of the crisis was China, building up huge surpluses, investing them in Uncle Sam’s Treasury Bonds, driving down yields of government bonds and everything else. So where was a banker to make a crust?

A better answer to the question of what caused the banking crisis might be, simply, bankers. But that’s a bit like saying that burglaries are caused by burglars or pickpocketing by pickpockets. The fact is that they need the opportunity. As John Gapper has written in today’s FT, ‘too many people were encouraged by politicians and left to it by regulators’. To which list I would add the central banks, who don’t seem to have been fingered at all in the search for culprits. Perhaps somebody should be writing a PhD thesis on ‘The economic effects of Alan Greenspan’. Perhaps they are.

So the authorities must take a share of the blame. But what about the banks themselves, and the causes there? Credit risk. Certainly there was a property and asset bubble. If you go back over the last 200 years, you will see that all banking crises are preceded by an asset bubble, almost always a property bubble. Not that all property bubbles lead to banking crises, but it’s a pretty good KRI (key risk indicator), to use an operational risk term.

People say the instruments were too complex. I’m not so sure that’s true – if people had bothered to read them. When I was Chief Risk Officer at Hill Samuel and we were trading bonds, I insisted that the traders read the bonds. And it was interesting how often they decided not to trade when they realised what they were really dealing with. After the crisis had got under way, Gerry Corrigan, former President of the New York Fed and now a leading light of the influential Group of Thirty, took some of the alphabet soup-type instruments home one weekend to read. His conclusion was that the documentation might be thick and tedious, but if you read it you could clearly see where the risks lay. They weren’t that complex – if you bothered to read them. Credit risk? Or operational risk?

But of course banks outsourced their credit risk assessment to the rating agencies. Apart from the issues with the rating agencies, which I won’t go into, that was to me a clear dereliction of duty.

And liquidity risk. There was no money. Perhaps the only thing I can say in banks’ favour is that it was not unreasonable, in stress tests, to look at three days, three weeks, even three months, without money. But test for the possibility of no money for the thick end of six months? Probably not. But it did reflect a breakdown of trust between banks. Again, in my view, as much an operational risk as a market or credit risk.

Overall, the crisis was rightly termed a failure of risk management. But I believe it was fundamentally a failure of operational risk management. Which is why, apart from anything else, we need an Institute of Operational Risk to promote good practice. Good operational risk management is about instilling proper risk behaviours. And people risk issues and behaviours – which lie at the heart of operational risk – were at the heart of the crisis.

I have mentioned failure to understand risk and breakdown of trust, but let’s take governance. Where was the challenge? Where was risk management sitting? At the top table – or some tables below?

I have sympathy with the IMF’s reported statement yesterday that, despite all the initiatives – on capital, corporate governance, cross-border supervision, crisis management, remuneration, derivatives, hedge funds, credit rating agencies and so on – it does not think they will prevent another financial crisis. I agree with that, because so much of this and other crises is about people risk and human behaviours.

Greed? Perhaps. But I prefer to cite the abandonment of common sense and herd instinct. Herd instinct lies at the bottom of all bubbles and banking crises. John Maynard Keynes, in the 1930s, offered a reason: ‘In the face of uncertainty, how do we behave in a manner which saves our faces as rational economic men?’ It’s a bit like ‘nobody got fired for hiring IBM’. The problem is that although uncertainty is everywhere, we have a strong impulse to believe otherwise and so we seek refuge in behaving as our peer group does. As Aristotle put it in his ‘Poetics’, ‘Imitation comes naturally to human beings from childhood’. Chuck Prince was an unwitting disciple of Aristotle, given his comment in July 2007, ‘When the music stops, in terms of liquidity, things will be complicated. But as long as the music is playing, you’ve got to get up and dance. We’re still dancing.’ What an utterly irresponsible thing to say. Not that herd instinct is inevitably wrong, but lemmings might tell you different.

The fact is that collective beliefs usually prevail over rational, considered judgements. ‘The madness of crowds’, as John Gapper put it in his article today. Bankers may not have understood, or bothered to understand, the transactions they were undertaking, but they took reassurance that everybody else was doing it.

And of course humans are fundamentally optimistic animals. Not an irrational sentiment, but certainly a non-rational one and an important component of people risk.

I’m not going to go into a long disquisition on heuristics and behavioural science, but one story may be worth telling to illustrate how individual people’s risk assessments can be. It appears in Peter Bernstein’s excellent book about risk, ‘Against the Gods’. One night, during the German bombardment of Moscow in the Second World War, an eminent professor of statistics arrived at an air-raid shelter. Those inside asked why he was there. After all, hadn’t he often said, ‘There are seven million people in Moscow. Why should I believe they’ll get me?’ to explain why he didn’t bother using the shelters. So why tonight? ‘There are seven million people in Moscow – and one elephant’, he explained. ‘Tonight, they got the elephant.’

I have drifted from the crisis to people risk and behaviours because, for me, that is largely what operational risk is about and one of its biggest challenges. Human beings are one of the greatest problems if you believe that operational risk is about measurement and capital calibration. I argued long ago at the British Bankers’ Association that operational risk should have been in Pillar 2 of Basel II, but that was a battle which was not going to be won, so we quickly switched to other things and operational risk found itself subject to the calibrations of Pillar 1.

That treats operational risk as a science. If it’s a science at all, then it’s a social science, like economics, not a physical or mathematical science. I have long thought that the main reason why economics fails in many of its theories is because economists so often ignore those economically irrational things called human beings.

In his excellent, slim book ‘Money’, which came out late last year, Eric Lonergan draws attention to Friedrich von Hayek’s speech accepting the Nobel Prize for economics in 1974. He says it should be required reading for all risk managers – and I agree. In it Hayek said, ‘Unlike the position that exists in the physical sciences, in economics and other disciplines that deal with essentially complex phenomena [which for me include operational risk], the aspects of the events to be accounted for about which we can get quantitative data are necessarily limited and may not include the important ones.’

And it’s not just people who can make our lives difficult in operational risk. In correspondence in 1703 between the philosopher Gottfried Leibniz and Jacob Bernoulli, a Swiss mathematician, Bernoulli states that there are three conditions which have to be met if we are to be able to make satisfactory mathematical evaluations: full information, independent trials and relevance of the information we are dealing with. Further on in the correspondence, however, Leibniz observes that ‘A finite number of experiments will always be too small a sample for an exact calculation of nature’s intentions’. Nature’s intentions are frequently the subject of operational risk.

And you can go further back, to sixteenth-century Spain, where the schoolmen, when talking about commodity pricing – I told you there was nothing new under the sun – said that ‘the mathematical price depended on so many particular circumstances that it could never be known to man but was known only to God’. Perhaps that’s what Lloyd Blankfein had in mind last November when he said that Goldman was doing God’s work.

This general view about the limitations of the information we have with which to make accurate evaluations was also shared by one of the fathers of mathematical economics, Vilfredo Pareto, he of 80/20 fame. He’s one of my heroes because I’ve always thought that 80/20 was a more realistic mathematical ratio in operational risk than the 99.9% confidence level some people are asked to deal with. Especially when it comes to cause, the real thing we should be attempting to evaluate and assess in operational risk.

And one other mantra, which I’ve heard continually in operational risk over the years: ‘We can’t manage what we can’t measure’. Hayek again: ‘Often that is treated as important which happens to be accessible to measurement … such a demand quite arbitrarily limits the facts which are to be admitted as possible causes of the events which occur in the real world.’ As Eric Lonergan adds, ‘So much of what interests us is, in fact, immeasurable.’ Especially when it comes to cause, which is the driver of operational risk management.

It could be said that Hayek’s speech, remarkably given the occasion on which it was delivered, comes close to saying that economics is all smoke and mirrors. His big message, though, is that we must be very much aware of the limitations of calculation when it comes to the social sciences. We must approach the task with real humility.

That applies hugely to operational risk which, as I’ve said, is a social science at best and is about business risk in its widest sense, with all the myriad variables that entails. It is about managing people and their behaviours and managing unexpected things. And it’s about evaluating opportunities as well as risk.

The crisis was, in my view, about human behaviours and as such was a crisis of operational risk. It was about culture and governance, governance being the institutionalisation of culture. On the subject of culture, I’m delighted to be leading the challenging task of drafting the Institute’s good practice guidance on governance and culture. I am myself guided by the words of Mervyn King. No, not the one just down the road, but the Professor who chairs South Africa’s King Committee on corporate governance. He once said, ‘If you get buy-in you can achieve extraordinary things. But if you don’t get buy-in, you won’t even achieve the ordinary. It’s alright to talk about the tone from the top, but I like to think about the tune in the middle.’ Getting that tune right is what operational risk managers should be seeking to do.

So, guests of the Institute, this is our world. In his book ‘Organized Uncertainty’, my good friend Mike Power, at the London School of Economics, states that operational risk is ‘an attempt to frame the unframeable, to assuage fears about the uncontrollable, “rogue others” and to tame the man-made monsters [of the financial system]’. How very true. It makes it sound like a rather glorious computer game which, dear guests, is what we spend our time playing!

Operational risk is a slippery beast. And as I could hardly leave without a word from Shakespeare, I shall just say that, for me, it is like Cleopatra, whom Enobarbus celebrated for her ‘infinite variety’. That’s why, after all these years, I continue to find it so endlessly fascinating.

So, guests, welcome to our world. Welcome to our tables. Members of the Institute, would you please join me as I propose a toast to the health of our guests.

References:

Aristotle, Poetics, chapter IV.

Peter L. Bernstein, Against the Gods: the remarkable story of risk (New York: John Wiley & Sons Inc, 1996, 1998).

John Gapper, A short story of a star hedge fund, Financial Times, 15 April 2010.

Friedrich von Hayek, The Pretence of Knowledge, Nobel Prize acceptance speech, 1974.

Eric Lonergan, Money (Durham: Acumen, 2009).

Michael Power, Organized Uncertainty: designing a world of risk management (Oxford: Oxford University Press, 2007).


© John Thirlwell 2010. All rights reserved.
Any reuse in whole or part requires our consent