Based on their reading and research from the resources, students continue in their role as the Chief Technology & Risk Manager of an oil company such as BP that has just experienced the worst explosion and loss of life and material in oil-drilling history. They should prepare for a press conference with a group of bloggers, completing the outline notes below, and adding more if they like, to help prepare their responses.
1. Key issues of potential technological failure in this explosion
Key issues include: 1) the formula of the cement used; 2) the deviation from standard performance-testing practice, which allowed drillers to use a version of the cement formula that was never tested to confirm it could form a stable plug; 3) insufficient training to detect warning signs of a defective element; and 4) a faulty device known as a blowout preventer. Summary: bad cement, a bad test, and bad oversight by the bosses and workers in charge of performance testing.
2. The main areas of human error
Poor management and communication; deviation from standard performance testing procedures; inattention to warning signs; lack of a comprehensive disaster-mitigation plan; lax quality-assurance and quality-control practices.
3. Engineers design to prevent system failures. In this case, several technological safety features were designed in to protect life and property. They include:
One feature was the blowout preventer, a stack of valves installed at the wellhead and designed to seal the well in an emergency. In this case it appeared to be faulty and failed to seal the well once the blowout began.
4. What is your response to this quote that sums up the situation: “William Reilly, the other co-chairman of the commission, concluded that at heart it wasn't a technical problem, it was a human failing—an assumption that the problems involved in sinking this well through a mile of water and two-and-a-half miles of rock were all routine and manageable.”
The phrase “technological hubris” is often used in these situations to describe the mistaken belief that humans, through the technology they have made, have overcome their tendency to make errors. In fact, because technology is a human creation, it is subject to the same human weaknesses: inattention, wrong assumptions, misunderstanding, erroneous calculations, and garbled communication. People can be “defective” parts in a system.

One way technologists and scientists overcome error and perfect systems is through coordinated group work following strict rules called protocols, and through constant testing of the performance of both the protocols and all devices, along with the people trained to interact with the technology. Think of your chemistry lab. The teacher doesn’t just send you off with a beaker full of solution and colored salts and say, “Mix it up any old way!” No, protocol rules the practice of science, technology, engineering, and medicine. Do you want a surgeon just cutting you open and poking around at whatever looks interesting? Too bad if you do: protocol forbids it. And just as important, the surgeon’s peers prevent it.

In science, technology, medicine, and engineering, the potential for life-threatening error is so high that these fields have evolved a strict culture of responsibility for all practitioners of the craft. It’s one reason their schooling is so long. If, in the operating room, a surgeon broke from protocol and started randomly poking about, every other surgeon present would immediately halt the operation as life-endangering to the patient on grounds of deviation from protocol. That is performance testing and protocol at work.
5. Performance testing is a standard operating procedure of quality assurance and quality control. In this case, questions were raised about the adequacy of the performance testing of the cement. Briefly describe why the cement is an issue.
The cement was important because it was meant to seal explosive natural gas out of the doomed oil well. It failed to work as intended because Halliburton Co. never fully tested the cement formula. Indeed, investigators found that Halliburton, the world’s biggest oil-well cementing contractor, went ahead with the cement job despite tests indicating the formula might fail.