
Oil spills, safety and system design and management

Interdisciplinary thesis explores complexity of safety in organizations with a culture of blame and ways to address this
Caption:
Adama Soro, a student in MIT's System Design and Management Program who will graduate in June 2010, explored an emerging concept in system safety, the "just culture."
Credits:
©Kathy Tarantola Photography

As oil continues to gush into the Gulf of Mexico from the recent catastrophic BP blowout, MIT System Design and Management student Adama Soro ’09 can only shake his head.

“I’m sure it could have been prevented,” he says. “Safety technologies must advance with production technologies. Usually, they don’t. In this case, they didn’t. Engineers were relying on containment techniques for shallow waters, hoping they would work in deep waters without any serious test at such depths.”

Indeed, the containment dome BP deployed to cap the mile-deep leak was constructed after the blowout. “It’s like building the fire truck when your house is on fire,” oil spill consultant Dr. Rick Steiner told The Boston Globe.

The problem, Soro says, is that safety is often not integrated into systems during design because it is deemed too costly; instead, it is sometimes appended afterward. In the case of Deepwater Horizon, none of the safety systems in place worked: not the rubber seals, not the blowout preventer, and not its backup system, known as the Deadman. These technologies were not adapted for deep water, a real challenge for oil companies’ research and development organizations.

System safety is the field that Soro chose to investigate further at SDM, which resides within the MIT Engineering Systems Division. He contends that information flow is instrumental for improving safety in organizations. However, the existing culture in most organizations that systematically assigns blame to people involved in accidents is a serious impediment to building an effective safety reporting system. This dysfunctional culture is the subject of Soro’s SDM thesis, “Assessment of an Emerging Concept in System Safety: The Just Culture.”

The “Who messed up?” investigative approach derives from the traditional “culture of blame,” Soro says. Instead, he contends that the emerging concept in system safety called the “just culture,” in which individuals are not punished for their mistakes (unless they intentionally violate the rules or are guilty of gross negligence), can help create the atmosphere of trust needed for people to willingly report their mistakes and near misses. In short, a just culture improves both the quality and quantity of information flowing to management and thereby provides it with the tools to improve both systems and safety. As Soro’s thesis advisor, Nancy Leveson, professor of aeronautics and astronautics and engineering systems, says, “Blame is the enemy of safety.”
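The just-culture rule described above, in which discipline is reserved for intentional violations or gross negligence while honest mistakes are routed into system improvement, can be sketched as a toy triage function. This is a minimal illustration only; the class, field names, and response strings below are hypothetical and do not come from Soro's thesis:

```python
from dataclasses import dataclass

@dataclass
class IncidentReport:
    """Hypothetical fields a safety-reporting system might capture."""
    description: str
    intentional_violation: bool  # did the reporter knowingly break the rules?
    gross_negligence: bool       # was there reckless disregard for safety?

def just_culture_response(report: IncidentReport) -> str:
    """Sketch of the just-culture triage rule: individual accountability
    only for willful violations or gross negligence; otherwise the report
    is blame-free and feeds system-improvement analysis."""
    if report.intentional_violation or report.gross_negligence:
        return "individual accountability review"
    return "no blame: route report into system-improvement analysis"

# An honest mistake under production pressure is treated as system feedback.
print(just_culture_response(
    IncidentReport("seal installed under time pressure", False, False)))
```

The point of the rule is the incentive it creates: because ordinary errors never trigger discipline, workers have no reason to hide near misses, so management sees more and better information.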

“It’s easier and less expensive to change people than to change systems,” says Soro. “Mistakes sometimes happen because of production pressure: people are asked to do things too fast, and then they stop following rules. There is a trade-off between safety and production. Also, changing people without improving systems only increases the chance for error.”

Soro has seen this dismal dynamic firsthand. After graduating from the Polytechnic Institute of Cote d’Ivoire with an SM in electrical engineering, he was hired by the Ivorian Electric Company and placed in charge of quality management. There, Soro saw that when accidents occurred, management did not want to hear about flaws in its systems. After all, they had been working well up to the moment of the accident (as had BP’s oil drilling platform). The problem, obviously, could not be with the system; it had to be with the operator, so management wanted to know whom to discipline. Naturally, workers in the field were loath to report near-accidents, fearful (with good reason) that doing so would jeopardize their jobs. Consequently, the information that management received about incidents was incomplete and often untruthful, making it impossible to fix the real problem.

“You need to know about a problem before you can fix it,” Soro says. “If people don’t talk, accidents will happen.”

At the Ivorian Electric Company, Soro had neither the experience nor the authority to change the “blame culture.” That experience, and his subsequent work managing the construction of electrical substations for the UN delegation sent to Cote d’Ivoire during its political crisis, led Soro to MIT’s System Design and Management program to improve his project management and system design skills.

Moreover at the UN, says Soro, “I was always over budget and missed deadlines. Now, with the tools and understanding I’ve acquired at SDM, I’m better prepared to go back to the workplace.

“I believe change is possible,” Soro continues. “We can improve systems and their safety by improving our mindset and our workplaces.

“And next week,” he concludes, “I have a job interview with an oil company and I will offer some guidelines on how to improve safety in high-risk environments.”