Wednesday, February 26, 2014

What an Idiot! – An Introduction to Systems Thinking

There’s been a lot of talk in the safety world about systems – whether it be management systems, sociotechnical systems, or system safety. However, it is our experience that most safety professionals, while understanding the surface level of systems thinking, don’t really get the implications. This makes sense: the word “systems” sounds cool, but the concept is abstract. And even if we understand what systems thinking is about, explaining it to people at any level in the organization (except systems engineers, of course) is difficult.

Here’s why this matters: the models that systems thinking gives us more accurately reflect reality, particularly when we want to understand the behavior of people and of organizations. So if you want to have a significant impact on safety in an organization, systems thinking is our current best bet. The only other viable options we have for understanding behavior are models that focus on individual action and accountability, or on some other single element of the picture.

Most of us know intuitively that any model based on only one piece of the puzzle won’t give us the whole picture. For example, most safety professionals know and understand that simply blaming workers for problems is a bad idea. But in our experience, that’s where the understanding stops, and we often see well-meaning professionals fail to put this thinking into appropriate action.

This disconnect between thought and action is especially prevalent when someone is dealing with a hard case – one where it’s really easy to see where the individual screwed up and difficult to see where the system could have done any better. To illustrate this, let’s use a case study of an injury accident involving a crane lift.

The basic details of the incident: during a crane lift of a heavy piece of equipment, the rigger put his hand on the load to keep it from hitting a nearby wall. The load shifted, pinning his hand between the load and the wall and crushing it. Now, this wasn’t an ordinary rigger – he was a rigging instructor for the company, who taught all of his students never to do what he did. So we can’t say that the organization didn’t train him, or that he didn’t know better. What an idiot, right?

Let’s pause for a second now and talk about systems thinking. A system is an interrelated set of elements organized to achieve a purpose. So, essentially, a system has at least three pieces – elements, interrelations, and a purpose. An interesting point about human behavior as it relates to systems is that our tendency is to focus on the elements, because they are typically the easiest to see. However, the individual elements tend to have the least influence on system behavior. Rather, it’s the interrelations and the purpose of the system that most dramatically affect how the system behaves.

Back to the story: our tendency is to focus on the elements of the system – in this case, the individual rigger. To see why the element is often less important than the rest of the system, consider this: when the company’s safety officer was asked whether another employee who had been similarly trained would have made the same mistake, she responded “yes.” If we swap in a different element and get the same result, that’s not a problem with the individual – that’s a problem with the system.
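For readers who like to see the idea made concrete, here is a toy sketch in Python. It’s purely illustrative – the names, probabilities, and “schedule pressure” variable are our own invention, not data from the incident – but it shows a system whose outcome is driven by the interrelations rather than by which element fills the role:

```python
# Toy model (hypothetical numbers) of why swapping the element changes
# little when the interrelations stay the same.
import random
from dataclasses import dataclass

@dataclass
class Rigger:
    name: str
    trained_to_keep_hands_off: bool  # deliberately unused below

def lift_outcome(rigger: Rigger, schedule_pressure: float, rng: random.Random) -> bool:
    """One simulated lift. The load sometimes drifts toward the wall (an
    interrelation); under schedule pressure, whoever is on the hook tends
    to steady it by hand - training never enters the calculation."""
    load_drifts = rng.random() < 0.3
    steadies_by_hand = load_drifts and rng.random() < schedule_pressure
    return steadies_by_hand  # True = hand in the pinch point

rng = random.Random(42)
for rigger in (Rigger("instructor", True), Rigger("trainee", False)):
    exposures = sum(lift_outcome(rigger, schedule_pressure=0.9, rng=rng)
                    for _ in range(10_000))
    print(f"{rigger.name}: pinch-point exposure rate {exposures / 10_000:.1%}")
```

Both riggers come out essentially the same, while lowering schedule_pressure drops the rate for everyone. That’s the safety officer’s “yes” in miniature: change the element, same result; change the interrelation, different result.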


What the safety officer captured in her answer is that blaming and disciplining the individual for a clear violation would not, by itself, fix the problem. Something deeper in the system led to this accident, and if we really want to prevent the next one, we need to put on our Sherlock hats and find out what it is. And here’s the thing: because the system is organized to achieve a purpose, what we find in the system that contributed to the accident may not be inherently bad. It may be normal organizational decisions and processes that on any other day wouldn’t have caused a problem, but that on this day worked together in just the wrong way. If we can understand those interrelations, we may be able to find a way for the system to still achieve its purpose while keeping the conditions that create unacceptable risks from coming together in just the wrong way – for this accident, and for any other accident that might have happened.
