It is a tragic irony that the day before the 2010 Deepwater Horizon oil well disaster, executives from BP and the rig’s operator, Transocean, visited the platform on a “management tour” that had several specific safety-related objectives.
During the tour, there were already signs that all was not right on the ocean floor. The drilling of the well had been completed and the rig was getting ready to move. Staff were cementing inside the pipe to stabilise the well head. The VIPs were told that personnel were having trouble getting the pipes lined up to test the effectiveness of the seal, but that it “was no big deal”. Accepting that the situation was under control, the visiting executives did not query how the problem was being solved or how personnel were monitoring the flows. Had the right questions been asked, or had the VIPs (all of whom had worked on oil rigs and had detailed knowledge of drilling operations) intervened at this stage to ensure the right monitoring procedures were in place, the disaster could have been averted. They didn’t, and the next day an explosion destroyed the platform, killing 11 crewmen and spilling millions of barrels of crude oil into the Gulf of Mexico.
The management tour was a regular event with a well-planned agenda, and on this occasion all the boxes had been ticked; the visiting executives had followed procedure to the letter of the law. However, by not following up on the initial signs of trouble, they failed to meet the spirit of the safety procedures they were there to assess. Had they shown better judgment and delved further when told there was a problem, the explosion might never have occurred.
This is a typical case of procedure not going far enough.
Procedures – rigid regulations or room to manoeuvre?
Procedures are there to link employees’ actions to a company’s strategic vision. They allow management to guide operations without constant supervision and provide staff with a clear plan of action to implement a policy. In general, procedures are carefully thought out directions based on in-depth knowledge of a company’s functions, operational experience and an understanding of the inter-connections between different business units.
They help employees learn. However, if procedures are treated as tick-box requirements (as was the case on the Deepwater Horizon rig), if they stop employees from thinking, or are followed blindly without an understanding of why they are in place, then there’s a problem.
While policies are cast in stone, procedures are rigid guidelines for which exemptions may be made – although obtaining an exemption often requires going through another set of strict procedures.
This raises the question: should employees be required to follow all procedures indiscriminately, or are there times when judgment should allow procedures to be overruled without prior approval?
Take another energy-related example, Piper Alpha.
When procedure conflicts with policy compliance
Piper Alpha was a North Sea oil and gas production platform operated by Occidental Petroleum. Late on the evening of 6 July 1988, a communication breakdown led to a pump being switched on through an unsealed pipe; the resulting gas leak caused an explosion. Following procedure, the custodian pressed the emergency stop button to shut down oil and gas extraction. The fire from the explosion would have burnt itself out had there not been pipelines feeding gas into the rig from neighbouring platforms, the nearest 1.5km away. Company procedure insisted that closing the valves to stop the flow from those platforms required approval from Occidental’s onshore control centre. Attempts to contact the centre failed and, unwilling to risk the ramifications of being caught doing the wrong thing, the platform manager kept the pipelines open. As gas continued to flow in, the fire raged, setting off further explosions that ultimately destroyed the platform and killed 167 people.
The lesson from Piper Alpha is that sometimes, to meet the spirit of a company’s policies, procedure should be ignored. The trick is knowing when.
In most cases, steps are written into procedures to address emergency situations. However, in exceptional situations, when the letter of a procedure works against the spirit of the policy behind it, personnel should feel able to override it – to make a judgment call, knowing that while any override of procedure will be scrutinised after the event, blind adherence could pose a serious safety risk.
This introduces another dilemma. When you act outside the regulations, you are often blind to the unexpected impact further down the chain. Actions that seem correct from where one person stands may have disastrous repercussions later on.
This was the case with American Airlines Flight 191, which crashed moments after taking off from O’Hare International Airport in 1979, killing all 271 passengers and crew. Investigators traced the crash to a detached engine, which had been damaged during a maintenance overhaul eight weeks earlier. The recommended maintenance procedure required that the engine be removed from the pylon attaching it to the wing before the pylon itself was detached. However, American Airlines (along with Continental Airlines and United Airlines) had overridden this procedure with its own guidelines, removing the engine and pylon as a single unit and saving around 200 man-hours per aircraft. Crash investigators found that a number of other DC-10s had also been damaged by the same faulty maintenance procedure.
“You can’t replace procedure with good judgment”
The case of Flight 191 is evidence that flouting procedures should not be done lightly: while you can never replace good judgment with procedure, nor can you systematically replace procedure with good judgment.
Anyone looking to override a procedure needs to understand the reasoning and purpose behind it, and the consequences of any change. Generally, procedural exemptions should only be granted after rigorous assessment and due process. Often the information that prompted the request can be used to update the existing regulations.
What is obvious, though, is that no matter how up-to-date the procedure, you cannot train or engineer out every eventuality. There will always be some element of risk, and there will always be times when people find they have to override the letter of a procedure to act in the spirit of policy compliance. It is up to organisations to ensure that personnel have a comprehensive understanding of the workings of the company when they make these decisions, and that, when violations of procedure are investigated, those involved are well placed to defend their decisions in a fair hearing. Companies should also examine their own structure and culture when it comes to identifying flawed procedures, giving employees the opportunity to flag a problem at a high level before disaster strikes.
Loïc Sadoulet is an Affiliate Professor of Economics at INSEAD.
Thomas Hinterseer is Managing Director at CEDEP.