Another look at managing risk
“Any sufficiently complex, tightly coupled system will fail sooner or later,” argues Charles Perrow, emeritus professor at Yale. Complexity makes it likely that some essential feature will be overlooked; tight coupling means that the failure of one part will drag down the rest. In other words, according to Perrow, accidents are “normal.”
This combination characterizes both financial systems and nuclear power plants, as Tim Harford recently pointed out in the Financial Times. To better understand the problem, and what can be done about it, he visited the Hinkley Point B nuclear power plant in the UK and talked with the engineers. With their help, he offered a graphic account of the meltdown at Three Mile Island.
“More than 100 alarms were filling the control room with an unholy noise. The control panels were baffling: they displayed almost 750 lights, each with letter codes, some near the relevant flip switch and some far. Red lights indicated open valves or active equipment; green indicated closed valves or inactive equipment. But since some of the lights were typically green and others were normally red, it was impossible even for highly trained operators to scan the winking mass of lights and immediately spot trouble.”
Harford asked the head of nuclear installation safety at the International Atomic Energy Agency what was learned from the Three Mile Island meltdown. “When you look at the way the accident happened,” he replied, “the people who were operating the plant, they were absolutely, completely lost.”
Similarly, in the 2008 financial meltdown, all the signals and alarms went off at once: complex derivatives that many did not understand collapsed in value as mortgage defaults increased; the insurance taken out against losses was over-extended; the shallow assurances of the ratings agencies turned out to be unreliable; economists’ assumptions about self-correcting markets proved false; and financial models took into account neither the interconnectedness of events nor the vast extent of the shadow banking system. (See “What We Can Learn From a Nuclear Reactor.”)
These were all failures of understanding, human failings rather than mechanical ones. As at Three Mile Island, there were indicators and models that risk managers checked, systems they set up and believed in. But in many cases the confidence was misplaced and the complexity of the system poorly understood. They had constructed a Rube Goldberg machine, too busy cranking out the profits or the power, too willing to take for granted that the checks in the system worked. They were not mindful of the whole.
Harford points out the need to simplify the system, decouple it, or reduce the consequences of failure: information has to be organized so that it is useful, or the elements of the system have to be isolated and made smaller. If accidents are normal, we have to be more aware of what is happening, or else minimize the danger when they occur.
The point is that we have to think about failure, the inevitability of accidents. If we could get used to the idea that accidents are “normal,” we might be able to take a hard look at how they are going to occur, and not keep fooling ourselves into thinking they won’t.