Tuesday, October 6, 2015

More is Different

On January 27, 1986, contractors and NASA employees met in a last-minute conference call to discuss concerns about the launch of the space shuttle Challenger the following morning. Some engineers had expressed concern over the record cold temperatures expected before launch and the effect those temperatures would have on the solid rocket boosters. However, the contractor and NASA team looked at the data and concluded that the shuttle had flown in cold temperatures before and had returned every time without serious incident. So what harm could a little more cold do?

On January 16, 2003, the space shuttle Columbia lifted off. When NASA personnel reviewed footage of the launch, they noticed a large piece of foam falling off the external tank and striking the left wing of the shuttle. It was one of the largest pieces of foam ever to strike the shuttle during launch, but foam coming off during launch was a relatively normal occurrence; it had previously undergone a risk assessment process and been determined not to be a safety issue. So what harm could a bigger piece of the same foam do?

Obviously, in both cases a little more was just enough to cause disaster, resulting in the total loss of the crews and the shuttles. Both cases have had very public investigations, sometimes with conflicting factors identified (some say amoral managers, some say organizational causes, some say normalization of deviance). One common theme, though, is the inability to see how sometimes doing the “same” thing can lead to different results.

The title for this post comes from the title of a 1972 article by the physicist Philip Anderson, who makes the case that systems are often completely different from the sum of their parts. This makes the typical approach of science, i.e., reductionism, invalid. You can’t explain the behavior of the whole by taking it apart like a clock and figuring out how each piece works individually. When you put the pieces of a complex system together you get behavior that is only understandable and explainable by understanding how the pieces work in concert. You can understand how one part works, but when you add another piece you have changed things. More is different.

Another way to explain this is through the concept of phase transitions. Phase transitions are how matter changes state, such as from solid to liquid to gas to plasma and back again. So let’s take the example of a piece of ice kept at a temperature of -1˚ C (30˚ F). It is a solid material. Now add one degree. The temperature is 0˚ C (32˚ F). Not much has changed. We still have ice. Let’s add one more degree. After all, we added one before and nothing changed, so adding one more shouldn’t have any significant effect, right? Now we’re at 1˚ C (34˚ F), and what do we have? Water. We’ve significantly changed what we’re looking at simply by adding more of the same. We added more and we got something different.

Obviously this is a bit of a simplistic example, but it illustrates the power of adding more. Sometimes this is called drift or normalization of deviance. These processes should be very troubling to us as safety professionals. If more is different, we must contend with the fact that the seeds of the next disaster in our organization may not be found in even the most robust hazard and risk assessment (in the cases above, NASA conducted rigorous risk assessments for both shuttle launches and identified the risks as acceptable in both cases). Even management of change processes often won’t catch this, because adding more is frequently just adding more of the same, so there is no perceivable change.

The causes of the next disaster could be found in the normal, boring, messy details of everyday work. It could be in us applying the same methods we’ve always applied to get jobs done, using the same old tools, working with the same people. The same things that have been successful in the past could cause failure this time. Safety professionals often don’t have the tools necessary to identify these things, because we are often focused only on finding negatives, such as deviations, hazards, and risks. But applying more of the same can’t be seen by only focusing on negatives… with one exception. Unfortunately, the only tool in the traditional safety person’s arsenal that might identify this problem is the accident investigation (and even that is a bit of a stretch if we’re honest). But do we really want to wait until after the next major event happens to find it?


We obviously need a better way forward if we really want to make progress in preventing the serious injuries and fatalities that have proven so difficult to reduce for so long. Interestingly, the one tool in the traditional safety professional’s toolbox that can help us identify the tinderbox hiding beneath the surface of more of the same (i.e., normal work), the investigation, hints at a potential way forward: learning. Some things to consider for your organization:
  • How is safety defined in your organization? If it’s defined as the absence of accidents, you may have some work to do. Under that definition, every job completed without an accident was a safe job, regardless of how it was done. This limits your opportunity to learn, because there’s nothing to learn if everything worked out. Instead, start asking questions about what it takes to do that job safely tomorrow and the next day.
  • When do we learn? If safety is not just the absence of accidents, this opens up all sorts of opportunities to learn that weren’t there before. It means that jobs can be done successfully (by some definition) but still be “unsafe”. Essentially, every job is a learning opportunity, regardless of the outcome. Do you have structures in place that allow you and your leadership to learn about how work is actually performed? Or do you just assume that everything happens according to plan? Spoiler alert: nothing happens according to plan. So start finding ways to learn from everyday work.
  • How are people who bring up conflicting views treated? It is remarkably difficult for insiders to identify a problem of adding more of the same. For this reason, when people in your organization bring up conflicting views (we have them in every organization), they must not be treated like the proverbial “boy who cried wolf”. Yes, these people can be frustrating, but if you chastise people for bringing up conflicting viewpoints, you can’t be surprised when conflicting viewpoints stop being brought to your attention. That will kill your ability to identify the risk of more of the same.
