There’s a common story told in the safety profession to illustrate a measure of the culture of safety within an organization. A lone mechanic finishes a job on a piece of equipment, but something doesn’t feel right to him. Unfortunately, the equipment is going into a passenger aircraft, and if the mechanic is right the plane could crash, killing all aboard. Of course, there’s also a chance the mechanic is wrong and there is nothing wrong with the aircraft. And if he were to go back and check his work, it would mean admitting that he might have made a mistake and also halting takeoff, which is very upsetting and could cost the organization money and reputation.
In the story, the mechanic tells his boss about the problem, halts the takeoff, and they go and check the piece of equipment. It turns out the mechanic is right: the equipment was not fixed correctly, and it is highly likely that the aircraft would have crashed had it taken off. The mechanic is praised as a hero, and usually this is held up as an example of an organization with a good culture of safety. (Note: there are different versions of this story told in different environments and industries, some true, some not.)
That’s all well and good, but we think that if you really want to test what a culture values, you should see what happens when the mechanic is wrong. That is, what do you think would happen in the story if the mechanic halts the takeoff and it turns out there was nothing wrong with the equipment, nothing to worry about? Wouldn’t that be a better test of how the organization responds to people bringing up concerns? After all, if they only praise someone when everything works out, that’s not very interesting. If they only praise correct concerns, they are reinforcing being correct, not the raising of concerns.
It turns out that the news in the US recently gave us a great example of this scenario. A terrorist threat was made against schools in major US cities, prompting the Los Angeles Unified School District to shut down its schools. This was no small task, given that the school district is the second largest in the nation. The shutdown came at great cost to Los Angeles and significantly disrupted the lives of hundreds of thousands, if not millions, of people.
Unfortunately, the threat was later identified as a hoax. The schools did not need to shut down, and the next day kids were back at school.
Do you see the parallels with the story of the mechanic?
It will be very interesting to see the response to this situation. What will happen to those who ordered the shutdown? And what about officials in New York, who received the same threat but didn’t close New York schools? What happens to them?
Obviously, all we can do is speculate at this point. But we can say a few things about what we hope happens, because, after all, there may be a few things our organizations can learn from this incident. When people bring up safety concerns and halt production as a result, that is obviously something we want to happen when there is a legitimate issue. However, we also don’t want people doing this so often that they make it impossible for the organization to do business, and we don’t want people using safety concerns as an excuse to stop a job for some other reason. So how can we design a system that allows people to bring up concerns, but not excessively?
First, stop rewarding outcomes. Whether the person bringing up the concern is right or wrong should not be relevant. The mayors of LA and New York should not reward or punish those overseeing their respective school systems based on the fact that the threat was a hoax. In every situation there are aspects within our control and aspects we have no control over. This means there’s an element of probability in every situation: you could do everything right and still end up with the wrong result. Punishing someone for bringing up a concern is the fastest way to stop getting concerns brought to your attention.
Keep in mind as well that doing nothing is a form of punishment. This isn’t because of anything you did, but because of the situation itself. Remember the mechanic? It took courage to admit a potential mistake and bring his concern to his boss’s attention. To be wrong on top of that doesn’t feel good. He feels stupid. He is being punished by the situation. So if we do not reward the raising of concerns in some way, we allow the situation to punish those who raise them.
Now, this doesn’t mean we have to throw a big party every time someone brings up a concern, or give them money or something like that. That may be appropriate sometimes, but you definitely don’t want to do it every time. Even a simple, genuine “thank you so much for bringing this up!” followed by action to investigate the concern is enough to show that you appreciate it. You could even shield them from any flak they might get from others by acknowledging your appreciation of them publicly.
Second, learn from the process. The first recommendation will get you more concerns brought to your attention, but more is not always better. This recommendation is about getting better concerns. When someone brings up a concern, if possible, take some time to walk through their thought process to identify why they thought it was a concern and what you and the organization can learn from that. The goal is to make the organization smarter and better at identifying concerns. Perhaps there are unrecognized risks out there. Perhaps your employees have concerns about a risk that you know to be minor, identifying a need for a conversation about why there is a gap in risk perception in your organization. Maybe better training or more effective two-way communication is necessary. Maybe your employees need more resources to help them deal with uncertain situations.
Note that this process should not be based on outcomes (see the first point). Whether or not the concern was justified (in your opinion), learning from the process will still be valuable. Also, this learning process should not be about blame at all. Do not turn it into an interrogation designed to find a cause. What you really want is information about how to improve your organization’s and your workers’ ability to identify issues in the future, so the discussion should be future-oriented. Ask questions like “what can we do to help you better identify issues in the future?” or “what would have made you feel comfortable handling this situation on your own or with your work crew?” Again, the idea is to learn and improve, not to figure out what went wrong.
If, when workers bring up concerns and stop work, we focus on outcomes and blame, the next time we hear about concerns will be during our next accident investigation, when we ask the question “why didn’t the worker stop the job?” Instead, if we proactively avoid discussion of outcomes and learn from the process workers used to decide what counted as a concern and what merited stopping the job (or shutting down a school district), we will see not only an increase in concerns brought up, but also an increase in risk competence, engagement, and trust in our organizations. Interestingly, this will not only make it easier to create safety in organizations, it will likely improve productivity as well. Not a bad deal: change how you react to workers stopping jobs and you can increase both safety and productivity.
PS. Happy holidays from all of us at SCM!