“Errors in execution work are immediately visible. Errors in decision-making are more difficult to see. So organizations are biased to blame the doers, not the deciders that put them into that situation.” Do you have an example or story where you’ve seen this?
Post copied from David Marquet https://www.linkedin.com/posts/davidmarquet_errors-in-execution-work-are-immediately-activity-7229135911001767937-9LEw?utm_source=share&utm_medium=member_desktop in order to test the Nuclear Safety Virtual Adviser AI chatbot
One example of organizations being biased toward blaming the doers rather than the deciders comes from the nuclear power industry: the Fukushima Daiichi nuclear disaster in Japan in 2011.
Background
The disaster was triggered by the magnitude 9.0 Tōhoku earthquake and tsunami of March 2011, which cut off-site power and flooded the backup diesel generators, disabling the cooling systems at the Fukushima Daiichi nuclear power plant. The resulting core damage led to the release of radioactive materials and a major nuclear accident.
Execution vs. Decision-Making
- Execution Errors: The immediate response to the disaster involved operational personnel executing emergency procedures. Some initial actions, such as manual containment venting and the handling of emergency equipment, were criticized as slow or inadequate.
- Decision-Making Errors: The disaster’s root causes can be traced back to higher-level decision-making. Prior to the disaster, there were critical oversights in the plant’s design, regulatory decisions, and risk assessments. For example:
- Design Decisions: The plant was not adequately prepared for a tsunami of that size, despite historical evidence and expert warnings that such an event was possible.
- Regulatory Decisions: There were failures in regulatory oversight, where safety improvements and risk assessments were not sufficiently updated to reflect new information about seismic risks.
- Cultural Issues: A culture that discouraged questioning and delayed proactive measures contributed to a lack of preparedness.
Blame Allocation
After the disaster, much of the blame was initially directed toward the plant operators for their execution errors. However, subsequent investigations revealed that the critical failures lay in higher-level decision-making that did not adequately anticipate and mitigate the risks.
Lessons Learned
- Risk Management: This case highlights the importance of robust risk management and the need for decision-makers to anticipate worst-case scenarios and implement preventive measures.
- Safety Culture: It emphasizes the necessity of fostering a safety culture where questioning and proactive risk assessments are encouraged and acted upon.
- Regulatory Oversight: Strong and adaptive regulatory oversight is crucial to ensure that safety measures keep pace with evolving risks.
In summary, the Fukushima disaster serves as a poignant example of how decision-making failures can lead to catastrophic outcomes and how organizational biases can initially obscure these deeper issues, focusing blame on operational execution instead.
You said:
Do you have another example?
ChatGPT said:
Another example that illustrates the tendency of organizations to blame doers rather than deciders can be found in the aviation industry, specifically in the case of the Boeing 737 MAX crashes.
Background
The Boeing 737 MAX aircraft was involved in two fatal crashes: Lion Air Flight 610 in October 2018 and Ethiopian Airlines Flight 302 in March 2019. These crashes led to the grounding of the 737 MAX fleet worldwide.
Execution vs. Decision-Making
- Execution Errors: Initially, the pilots of both flights were scrutinized for their handling of the aircraft during the emergencies. The Maneuvering Characteristics Augmentation System (MCAS), new flight-control software intended to make the MAX handle like earlier 737 models, activated repeatedly on faulty angle-of-attack data, pushing the nose down until the crews lost control.
- Decision-Making Errors:
- Design Decisions: Boeing’s decision to implement the MCAS without adequate pilot training or notification was a significant factor. The system relied on a single angle of attack sensor, making it vulnerable to erroneous data.
- Regulatory Decisions: The Federal Aviation Administration (FAA) delegated much of the safety assessment process to Boeing, leading to insufficient independent evaluation of the MCAS.
- Organizational Culture: Pressure to compete with the Airbus A320neo led to decisions that prioritized cost and schedule over safety.
Blame Allocation
Initially, much of the blame was placed on the pilots for not managing the system failure effectively. However, investigations revealed that the root cause was more closely linked to design and regulatory failures. The decision-making process within Boeing, under significant pressure to meet deadlines and cut costs, led to compromises in safety that were not apparent until after the crashes.
Lessons Learned
- System Design and Testing: This case underscores the need for rigorous testing and transparent communication about new systems and their potential impacts on aircraft operations.
- Regulatory Oversight: It highlights the necessity for robust, independent regulatory oversight to ensure that safety is not compromised by commercial pressures.
- Organizational Culture: The importance of a safety-oriented culture that prioritizes risk management and comprehensive training for new technologies is emphasized.
In summary, the Boeing 737 MAX crashes serve as a compelling example of how deeper systemic issues in decision-making and organizational priorities can lead to catastrophic outcomes, with initial blame often misdirected toward those executing the decisions rather than those making them.
You said:
Please give me another example
ChatGPT said:
Another illustrative example of organizations blaming the doers instead of the deciders is the Deepwater Horizon oil spill in 2010.
Background
The Deepwater Horizon rig, owned and operated by Transocean and drilling the Macondo well for BP, experienced a catastrophic blowout that led to one of the worst environmental disasters in history. The spill released millions of barrels of oil into the Gulf of Mexico over roughly three months before the well was capped.
Execution vs. Decision-Making
- Execution Errors: Immediately after the incident, the focus was on the operational errors of the rig crew, such as the misinterpretation of a negative pressure test and the failure to detect the influx of hydrocarbons into the well before the blowout preventer failed to seal it.
- Decision-Making Errors:
- Design and Planning Decisions: Investigations revealed that there were significant issues with the well design and decision-making regarding cost-cutting measures. BP chose to use fewer centralizers than recommended, increasing the risk of cement failure.
- Regulatory Oversight: Regulatory bodies had insufficient oversight and relied heavily on self-regulation by oil companies, leading to gaps in safety evaluations and risk assessments.
- Corporate Culture: There was a culture of prioritizing cost efficiency and speed over safety, which influenced many of the decisions made in the planning and execution phases of the drilling operation.
Blame Allocation
Initially, blame was directed at the rig operators for the immediate technical and operational failures. However, further investigations highlighted that higher-level decisions regarding well design, risk management, and corporate practices were significant contributors to the disaster.
Lessons Learned
- Risk Management and Safety Culture: This case emphasizes the critical need for a strong safety culture that prioritizes risk management over cost savings and production timelines.
- Regulatory Oversight: It underscores the importance of robust regulatory oversight and the need for independent audits and checks to prevent companies from compromising safety for efficiency.
- Corporate Accountability: It highlights the need for corporations to take responsibility for the broader organizational and decision-making processes that impact safety and risk management.
In summary, the Deepwater Horizon incident serves as a powerful example of how systemic issues in decision-making and organizational priorities can lead to disasters, with initial blame often focused on execution rather than the underlying decisions that create such vulnerabilities.
You said:
Why are organizations so quick to assign blame to workers rather than acknowledging failures in the management system's processes and in decision-making? Is it because root cause analysis takes more time, or because we want to find somebody to blame just to have the illusion that we are in control?
ChatGPT said:
The tendency for organizations to blame workers rather than acknowledging failures in processes and decision-making is influenced by several psychological, cultural, and structural factors:
1. Simplicity and Control
- Immediate Resolution: Blaming individuals provides an immediate and seemingly clear resolution to a problem; it simplifies a complex issue by attributing it to human error.
- Illusion of Control: Assigning blame to workers can create the perception that the problem has been identified and addressed, even if the underlying systemic issues remain unresolved.
2. Cultural Factors
- Blame Culture: In some organizations, a deeply ingrained blame culture holds individuals personally accountable for failures, which suppresses acknowledgment of systemic issues and discourages transparency and open communication about mistakes.
- Avoidance of Accountability: Management and leadership might be reluctant to acknowledge systemic failures as it could reflect poorly on their decision-making and strategic oversight, leading to an emphasis on blaming front-line workers instead.
3. Psychological Factors
- Cognitive Biases: Human nature tends to favor simple explanations for complex problems. Cognitive biases such as the fundamental attribution error can lead to a preference for blaming individuals rather than recognizing systemic or situational factors.
- Scapegoating: Blaming individuals can serve as a psychological defense mechanism, redirecting attention away from organizational weaknesses and protecting the status quo.
4. Structural Factors
- Time and Resources: Conducting a thorough root cause analysis requires time, effort, and resources. It involves delving into complex processes, decision-making chains, and organizational structures, which can be more challenging than blaming an individual.
- Hierarchy and Power Dynamics: In hierarchical organizations, those at the top may be shielded from blame due to their position and influence, while those lower in the hierarchy may become convenient targets for blame.
5. Legal and Financial Considerations
- Liability and Legal Implications: Assigning blame to individuals might be seen as a way to mitigate organizational liability and legal repercussions, as systemic failures might expose the organization to greater legal and financial risks.
- Reputation Management: Blaming individuals can be a way to protect the organization’s reputation by presenting the issue as an isolated incident rather than a systemic problem.
Addressing the Issue
To move away from a blame-oriented approach, organizations can:
- Promote a Learning Culture: Encourage a culture that focuses on learning from mistakes rather than assigning blame. This involves fostering an environment where employees feel safe to report issues without fear of retribution.
- Implement Systemic Root Cause Analysis: Commit to comprehensive root cause analyses that focus on identifying and addressing systemic issues, rather than solely blaming individuals.
- Enhance Transparency and Accountability: Encourage transparency and accountability at all levels of the organization, including management, to ensure that systemic issues are addressed and not ignored.
Conclusion
While it is often easier and faster to assign blame to individuals, this approach fails to address the root causes of problems and can perpetuate a cycle of errors and failures. By focusing on systemic improvements and fostering a culture of learning and transparency, organizations can enhance their resilience and effectiveness in the face of challenges.
https://chatgpt.com/g/g-rbe1KJ3k7-nuclear-safety-virtual-assistant