In February 2011 I wrote a post titled “… from ‘Piper Alpha’ to ‘Deepwater Horizon’, do we really learn?”. It was part of an ongoing series relating to Culture and Resilience. This is also a theme in my recent article “A Fork in the Road”.
The sad part seems to be that we do not learn, and worse, continue to blindly embrace and build on the thinking that caused the problem in the first place.
Let me briefly review the journey covered in that earlier post:
- Piper Alpha – UK, 1988, North Sea Oil Platform. The Inquiry noted that:
- senior management were too easily satisfied and relied on “the absence of any problems as indicating that all was well”.
- the operators tolerated “known problems” and management demonstrated only a “superficial response when issues of safety were raised by others”.
- Longford – Australia, 1998, Gas Plant. The safety regime promoted following Piper Alpha was in place at Longford. The company blamed the operator on duty, but the Inquiry report:
- blamed the company (Esso) for not ensuring its staff knew the risks they faced and the correct procedures for dealing with them.
- found that, as a result of ‘efficiency’ measures and cost cutting, there was minimal staffing in the control room; this, coupled with reductions in maintenance, meant that it became ‘normal’ for alerts to go unattended for some time. Safety was secondary to throughput and profit.
- Texas City – USA, 2005, Oil Refinery. Operated by BP, the same company as Deepwater Horizon, and another case of not learning from experience. The Federal inquiry found:
- The operator had not maintained the plant well and had ignored prior incidents and reports about safety.
- BP was more concerned with occupational safety than with process safety.
The management process (and incentives) encouraged a focus on measuring falls and driving accidents – while ignoring risks and hazards around the core processes and equipment of the refinery.
They embraced a ‘management systems’ approach, in which process rather than effective capability is the focus. I know I have seen that somewhere else – have you?
How well did that work for them?
Deepwater Horizon – Gulf of Mexico, 2010, Oil Drilling Rig.
The US National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling report found that BP, Halliburton, and Transocean had attempted to work more cheaply and thus helped to trigger the explosion and ensuing leakage.
“whether purposeful or not, many of the decisions that BP, Halliburton, and Transocean made that increased the risk of the Macondo blowout clearly saved those companies significant time (and money).”
The inquiry concluded that the incident was avoidable and that “it resulted from clear mistakes made in the first instance by BP, Halliburton and Transocean, and by government officials who, relying too much on industry’s assertions of the safety of their operations, failed to create and apply a program of regulatory oversight that would have properly minimized the risk of deepwater drilling.” They also found that the government regulators did not have sufficient knowledge or authority to notice these cost-cutting decisions.
Did they not have the knowledge and authority, or did they just not want to see and blow the whistle? When I am coaching sports officials, this is as much about courage as it is about knowledge.
A few weeks after I wrote that post, an earthquake and tsunami hit Japan. The report into that incident should be generating a lot more discussion than it has.
Fukushima – Japan, 2011, Nuclear Power Plant
The investigating committee concluded that the accident was clearly “manmade.”
“We believe that the root causes were the organizational and regulatory systems that supported faulty rationales for decisions and actions, rather than issues relating to the competency of any specific individual.”
More importantly, it highlighted the need to address changes in organisational culture and identified “regulatory capture” that effectively meant that compliance with the regulatory rules was voluntary.
“Its fundamental causes are to be found in the ingrained conventions of Japanese culture: our reflexive obedience; our reluctance to question authority; our devotion to ‘sticking with the program’; our groupism; and our insularity.”
Unfortunately, these attributes are not unique to Japanese life; they are increasingly common in Western corporate and government organisations.
In my recently published paper, A Fork in the Road, I talk about the need to consider “Double Loop Learning” – which I first blogged about in March 2011.
To really learn, we need to have the capability, and the courage, to recognise and challenge our norms and biases. To overcome the Wilful Blindness that seems to afflict too many, at both the individual and organisational level.
What would have to happen in your world for the core assumptions to be challenged, rather than learning being defined as just fine-tuning the strategy?
Do Risk and BC practitioners need to learn more about what organisational culture actually is before thinking they can influence it?