An employee at a client’s site was injured recently when his hand was struck while sand blasting (using high-pressure beads, called “sand,” to remove paint or other coatings from a surface). The injury was relatively minor compared to what could have happened: he only had to have sand and dead tissue surgically removed, so the most likely outcome is that he’ll get a cool scar and a nice story out of the deal.
That’s about as much as we know about the incident, though, because at this point the story starts to get fuzzy. The injured employee says he was hit by another employee’s blasting when he put his hand through a hole in the wall, while the other employee says he wasn’t even in the area at the time, so it couldn’t have been him.
The automatic response to this is so normal that no one questions it anymore – get the employee drug tested, consider retraining, consider disciplinary action for the employee who lied (depending on which one we identify in the investigation as the liar). This should allow us to get things back to normal by fixing the problem. Just another day at the office, right?
Sometimes so much learning can be gained by simply asking a stupid question. For example, why would an employee not tell the truth about what caused an accident? Because they don’t want to get into trouble, obviously. Look at the list of actions following the incident – they all involve finding problems in the person and fixing those problems.
What we spend our time on is an indication of what’s important to us, and in the case of an accident, what we spend our time on is an indication of where we think the source of the problem is. As Erik Hollnagel says – What You Look For is What You Find, which is followed closely by What You Find is What You Fix.
What’s the big deal here, though? There’s clearly some level of misconduct in the case above, so someone needs to be punished, right? That’s where the problem comes in – so many people, when they read that story, automatically jump into find-and-fix-the-person mode. This is an indication of our habitual blame response, particularly following an accident. But let’s take a step back and ask another stupid question – why would we blame someone following an accident? Two reasons come to mind:
- To prevent the accident from happening again; and,
- To hold people accountable to the rules and procedures we have in place.
But here’s the thing – for number 1 to occur, we first have to identify what happened to begin with. To put it another way – number 1 only happens if we create an environment that facilitates learning. And this creates a problem when you realize that the second reason to blame can conflict with the first. Put simply – our blame reaction gets in the way of our ability to learn. Why? Because when we start the investigation by looking for what went wrong with the people involved, should we be surprised when they become defensive and hide information? So we’re left with a choice – we can either learn or we can blame following an accident, but it’s really hard (perhaps impossible) to do both.
And this is another example of how, in the safety profession, our reactions – our solutions to problems – confound us. Examples abound:
- We spend most of our time at our desks (nowhere near the worksite) unilaterally coming up with policies and procedures that dictate how employees should do work we’re only partly familiar with, and then we’re surprised and frustrated when they don’t do it that way.
- We focus on preparing for the next major audit from corporate or the regulator by getting all of our paperwork in order, and then we’re shocked that others in the organization don’t appreciate our sacrifices in the name of safety.
- We develop training classes that are designed to meet regulatory requirements to the letter, but have little relevance to the actual work employees do, and then we get angry that employees look for excuses to get out of the class and those that do show up fall asleep.
Our solutions are the problem. We’ve separated ourselves, both physically and mentally, from those we are duty-bound to protect, and then we are surprised when we do not understand why they do what they do. We forget that the people we work for are people. They are not dumb animals and they are not machines. They have goals (safety being one of many), they have feelings, and they have creativity and ideas that may solve our problems.
But the tools safety professionals use on a daily basis blind them to these facts. We look at the world in reductionist, linear, and bimodal ways, and these lenses keep us from seeing the complexity that our workers at all levels of the organization have to deal with. And when there is this separation between how we imagine our organizations work and how they actually work, is it any wonder that our efforts to reduce the number of people getting hurt and killed at work aren’t having the benefit they used to? The rates of people getting killed at work in most countries have become asymptotic (LINK), and asymptotes point to dying strategies. We need new strategies to deal with the new world our workers are facing.
Here are some ideas that can get you started:
- Take a hard look at all the things you do every day and ask what that says about your priorities. Sometimes our conscious priorities and our unconscious ones don’t line up. How can you bridge any gaps you find?
- If you want to up your “worker engagement” then get out from behind your desk and start engaging your workers. It starts with walking around, observing normal work, and having real conversations with people. And we mean real conversations, not teaching moments. Your workers have things they can teach you about the job and about themselves. Go out there and learn.
- Seriously consider removing blame from your investigation process. What if it was impossible to blame individuals after an accident – what would the result of your investigation be?
- We need to start conversations within the safety profession about developing new tools that allow practitioners to identify drift within their organizations. We have tools for hazard and risk assessment, but at the organizational and decision-making level we leave it to the workers – until after the accident, when we blame them for not knowing better. Right now the only tools we’re aware of are either pretty complicated or proprietary (which means the only evidence we have that they work is marketing materials). This isn’t good enough. We’ve got a lot of smart people in our profession, but we need to start the conversation.