Tuesday, November 4, 2014

After an Accident, You Can Either Learn or You Can Blame – You Can’t Do Both


NASA had another mishap, this time without any human casualties, but with an estimated $200 million in damage after the Antares rocket, whose mission was to carry supplies to the International Space Station, exploded on takeoff. Obviously, the investigation is ongoing. As Neil deGrasse Tyson, of Cosmos fame, noted on his Twitter feed on the day of the incident:

[Embedded tweet from Neil deGrasse Tyson]

However, we were a bit dismayed when we noticed this headline from the Huffington Post UK about the incident:

[Screenshot of the Huffington Post UK headline]

The title of our blog post comes from a presentation by Todd Conklin, who makes the case that our two needs following an accident, to blame and to learn, are in conflict. The more you do of one, the less you do of the other. He joins a chorus of other safety researchers and practitioners who make similar assertions, such as Sidney Dekker, Erik Hollnagel, David Woods, Nancy Leveson, David Borys, Michael Behm, Daniel Hummerdal, Richard Cook, and many, many others.

We’ve also talked about it many times in the past, both in our training and consulting activities and in previous blogs (here, here, here, here, and here). However, many people still feel that it is possible, and some even feel it is necessary, to both learn and blame during an accident investigation. Since this idea is so prevalent in our culture, and, we believe, so caustic to safety, we thought it would be appropriate to explain directly why we believe that learning and blame are negatively correlated.

First off, we must be clear: our argument does not consider any legal or philosophical issues (although we feel it could stand up to scrutiny in both of these areas). All we are saying right now is that, from a purely pragmatic perspective, if we want to improve safety then we must improve our ability to learn, which means we must also remove blame from our investigation processes.

Here’s why:
  1. As we mentioned in our last blog post, Erik Hollnagel has two (rather unfortunate) acronyms – WYLFIWYF and WYFIWYF, or What You Look For Is What You Find and What You Find Is What You Fix. Essentially, what he’s saying is that our current understanding of the world has a significant influence over what we look for and what we see in an accident investigation. So if we are looking for blame we will find it, and if we find blame that’s what we fix – we blame the individual and “fix” them either by getting rid of them or by disciplining them. But if our current understanding heavily influences what we see, doesn’t that mean we are inherently biased? We expect to find blame, therefore we find it, and once we find it our minds are motivated to stop looking further because we’ve found what we were looking for. So a focus on blame inherently puts blinders on an investigation and limits its results to identifying and fixing problems in individuals.
  2. Of course, one could argue that just because blame tends to limit investigations doesn’t mean that people can’t move beyond those biases and still achieve learning. However, the assumptions that underlie our tendency to blame inherently conflict with a systems viewpoint of safety. To use a very basic definition, a system is a set of interrelated components. We tend to focus on the components, but the most important part of any system is the relationships between those components. To give an example, if you want to change a football team, the least effective way is to change any one player on the team. You might change the team a little bit, but a far more effective way is to change the rules of the game, which changes how the team members relate to one another. A focus on blame inherently focuses on components (the person who messed up) without considering the relationships that influenced their behavior. With a blame focus, we are setting ourselves up to have the least effect on the behavior of the system, and therefore the least effect on system safety.
  3. Blame inhibits the investigation process. When we go into an investigation with the goal of finding fault (blame), the people we are investigating inherently know this. Therefore, what do they do? They withhold information that they feel might be used against them. This means that investigators can never get the full story of what happened. They can piece together hypotheses through reviews of evidence and other witness statements, but that information will be inferred, and they will never know the full perspective of the people involved. So, with a blame focus our investigations will always be incomplete, and therefore our ability to learn from the event will be crippled.
  4. Finally, blame inhibits future learning. People learn that if something goes wrong they can be held liable. Therefore, if something goes wrong and they think they can get away with not reporting it, they won’t report it. The gap between how we imagine things are going in our organizations and how our organizations are actually operating will widen. When we blame following an accident, we are actively separating ourselves from the messy details of the normal operations within our organizations…that is, until the next major accident happens.

Don’t get us wrong – there are potentially legitimate reasons to blame following an incident. If someone violated a rule, it makes sense to punish them to send a message to others. However, we have to weigh the need to be consistent in our disciplinary policies against the need to build a learning culture. On balance, we agree with Neil deGrasse Tyson – learning must take precedence. As Ivan Pupulidy says, before an accident the normal accountability structures that exist within the organization apply; after an accident, the organization is accountable to learn.

2 comments:

  1. Have a look at Reason's Decision Tree for Culpability of Safety Incidents.
    “Managing the Risks of Organizational Accidents” – James Reason

  2. Thanks Robert. Reason's decision tree has been used in many industries, such as aviation, to facilitate the creation of a “just culture”. Our concern with it is that the process is often applied in unjust ways. When we allow for the idea of blame-finding in an investigation, we limit people's willingness to be open, leading to problems with organizational learning at both the micro and macro levels.
