Thursday, July 16, 2015

Moving Beyond Our Need To Do Something

At a recent meeting of contractor safety representatives at a large construction site, each of the contractors was reporting on the incidents they had experienced in the past month. (This practice is common but comes with its fair share of problems. If not handled delicately, it can quickly become a shaming exercise where the contractors with the most incidents are seen as the worst offenders, which leads to underreporting.) One contractor told the story of an employee who made a mistake, and the result was an incident where a forklift tipped forward so that it was balancing on its front wheels. Luckily, no one was injured, and the forklift was safely lowered back onto its rear wheels and put back in service. The contractor then reported that the employee was sent home for a few days without pay and retrained, as the cause was deemed a “behavioral” one. Everyone agreed with the response (except us) and the meeting moved on without incident.

What’s the point of investigating incidents? Most would say that it’s to make sure the incident never happens again. So in this case the employee allegedly made a mistake. He didn’t intend to do what he did, and he certainly didn’t choose to tip the forklift over. The incident was not intentional. The employee did not weigh all potential actions and their consequences before acting. He just did what he always did, except this time he screwed up.

So how does punishing him fix this? Now, some of you will quickly respond that it will give him something to think about next time. Next time he’ll remember to be more careful, right?

Not so fast. First, we have to remember that the employee wasn’t actively thinking about his decisions. He was working like we all do – using what we could essentially call muscle memory to get through a job he does day in and day out. Think about your own work – do you actively think through every step of a job you do every day and know you’re good at? Of course you don’t. And before you say that this was a high risk job, we would challenge you to think about all the times you’ve done high risk things without thinking (e.g., pretty much anything while using your mobile phone). Let those who are without sin cast the first stone.

Of course you probably learned not to do those things, but that leads to the next point. Think about the last time you made a big mistake. Didn’t you feel bad and start thinking about all the things you could do differently to ensure it didn’t happen again? You essentially punished yourself and looked for ways to learn from your mistake. Why would this employee be any different? If we buy into the idea that punishment has an effect on behavior (which it does), why is our punishment necessary when the employee is already punishing himself?

The thing is, the punishment really doesn’t do much, if anything at all, to decrease the likelihood that this incident will happen again. In fact, it probably makes things less safe, because employees are now incentivized not to tell management about mistakes (because they get punished when they do). That loss of reporting might be worth it if we could prove that the punishment changed the behavior, but we have no real evidence that it does. But won’t the punishment send a message to other employees? Yes – it will send the message that mistakes are punished, so employees should hide their mistakes. It won’t lead to fewer mistakes though. So we have real evidence that punishment can make things worse and no real evidence that it makes things better.

Why do it then?

Because if the contractor who recounted the incident had been asked what they were doing in response and had said “nothing, the employee has learned his lesson already,” they would, at best, get some strange looks. At worst they would be chastised for not caring about “safety”. But the issue isn’t about caring about safety; it’s about caring about looking like we care about safety. If we look like we are doing nothing, then we look like we don’t care. That’s bad.

But the reality of the situation is that the reason we had an incident is that we had a system that relied on human reliability – i.e., on people not making mistakes. Now keep in mind that any such system will, on average, be very reliable, because people don’t screw up that often. But they do screw up occasionally, so eventually you will have a failure. The same is true for any system in your organization that relies on human reliability. If you have people who climb ladders, drive cars, put widgets together, drive forklifts, use tools – in short, do work – eventually you’ll have an incident where a person makes a mistake.

That’s not surprising to anyone. What should shock us is that we build systems which, if we think about it, we know will eventually fail because of some version of “human error” (or what some have called “performance variability”), and then we act shocked at how careless the person could be and try to punish the mistake out of them. We aren’t really making anything better, because we put people back into the same system so they can make a mistake again later (and then we can punish them again). But at least we can say we did something, right?


The time has come for the safety profession to break this vicious and unjust cycle. We need to stop putting people into situations where they are highly likely to fail eventually and then punishing them when they do, simply because we don’t want to be seen as doing nothing. The fact is that there will be times when we need to rely on human reliability. When an incident happens in these systems, we need to let go of the need to do something just for the sake of doing something. Rather than trying to fix the problem, we should help the people we’re relying on learn from the event. Don’t focus on what they did wrong; focus on how you can help them do better next time. Punishing them doesn’t accomplish this. We trust them to reliably create safety day in and day out; we should start trusting them to learn from incidents as well.

Wednesday, July 8, 2015

A Day Like Any Other

It was a day like any other. A maintenance crew received their work assignment at a chemical plant: fix a leaking valve in a pipeline that carried a corrosive chemical. These sorts of leaks were pretty common because, like so many other plants, this plant was pretty old. The chemical it manufactured was a specialty material, and demand had never really been that high…until recently, when the chemical became a very necessary raw material in high tech products. As a result, the plant had not invested much in capital improvements for most of its history. But with the demand increase, the company was asking significantly more from the degrading infrastructure, which was leading to more and more problems. In recent years the plant had a revolving door of plant managers (nearly one per year), which created an environment where long term investment seemed more and more unlikely.

But still, the maintenance crews had to make do with what they had, and they had done a pretty great job. At this plant the maintenance department had gone many years without a serious injury. So they got the needed work permits and began working with the operations department to isolate the valve.

This was a bit of a touchy point for the maintenance folks, because a recent drive to improve efficiency at the plant had led to personnel cutbacks in the form of early retirements. In one day alone, a couple of years earlier, 500 years of experience walked out the door. So the operations department was young and, on average, inexperienced. Furthermore, the plant did not have a good program for training new staff on the process, so operations trainees had to learn on the job. This didn’t sit well with the maintenance crew, as they had to rely on operations to tell them where to put their locks to isolate the valve. Nor did the plant have updated piping and instrumentation drawings that the maintenance and operations crews could reliably use to get the information they needed to do the job safely.

So the crews did the best they could. The job was completed, the valve was replaced, and everyone got to go home that day without injury.

Kind of a boring ending to an interesting story, right? Often when we tell stories like this in the safety profession it ends with explosions, mayhem, disfigured bodies, and broken lives. It makes sense that we would tell these sorts of disaster stories, because they are so interesting and because we believe they give us an opportunity to see what causes failure in our organizations. Accidents, after all, are the things we want to prevent, so if we study those accidents in depth we should be able to find ways to prevent similar accidents from happening in the future, right? So, by looking at accidents we can learn about safety.

However, the practice of looking at accidents alone to learn about safety is problematic in many ways. First, it assumes that to understand failure you only need to look at failure: to understand accidents, we only have to investigate what failed. Furthermore, since we lean so heavily on accidents to tell us what to do in the safety profession, it assumes that to understand safety we only need to look at accidents.

This is simply wrong. How can you understand how something failed without understanding how it normally works? Now, before you answer that question, read it carefully. We aren’t saying that you need to understand how it is supposed to work. We are saying that you cannot understand failure without understanding how it actually works. The difference between how things are supposed to work (sometimes called “work as imagined”) and how things actually work (sometimes called “work as performed”) is huge, and we have yet to see an organization that does not have a significant gap between the two. In almost every case, though, management (including the safety professionals) is oblivious to this fact.

Second, only looking at accidents to know how to improve safety is problematic because it is so reactive! In the story above, nearly every detail we mentioned (all of which is true, by the way) is already present in the organization, and these are common details we see in accident reports. This means, as Todd Conklin likes to say, that the seeds of the next accident are already sown in the organization. Organizations can either wait until the accident happens to go out and find those seeds, or they can go out now and find them before the accident happens.

One of the biggest gaps in the safety profession today (and in every profession involved in management) is a significant misunderstanding of everyday work. Organizations consistently misunderstand how they achieve success day in and day out, which leads to a lot of wasted resources and often a lot of blame. The Law of Fluency predicts that people’s adaptations will hide the very things they are adapting to. This means people will fill the gaps in our organizations, which hides the real problems and leaves only the actions of the people visible (so when they screw up, it is easy to see and blame them). This is where organizations tend to see a lot of drift emerge.

As safety professionals we need to shift our focus away from only what has gone and can go wrong, toward understanding what goes right. What is actually making it so that most days your organization has no accidents? What is causing that? Shouldn’t that interest us? If we figure that out, we can begin to find ways to shore up whatever that defense is, while also facilitating performance (thus reducing, or even eliminating, the safety-production conflict).

Not sure how to do this? Here’s some ideas to get you started:
  • Get out and talk to your workers. Ask them what makes getting the work done hard. Ask them what they have to overcome to do their jobs. Do this without teaching them. Just listen, and ask yourself what this says about what is making your organization successful.
  • Perform success investigations. Take a job, project, design, etc., and investigate it like you would an accident. Where did the work deviate from the plan and why? Where did the workers have to adapt the work plan to get the job done and how can we help plan better in the future?
  • To understand accidents, first understand how the work normally gets done. Don’t simply look at what went wrong in an accident investigation. Try to figure out how what normally goes right went wrong in this case. This means you need to understand how things normally go right.


Wednesday, July 1, 2015

It’s Time For An Anthropological Approach to Safety Culture


One of the holy grails of safety management appears to be safety culture. Many organizations speak of the need to build a safety culture in their organizations, or of how the overarching goal of any safety initiative is to build a culture of safety. There are LinkedIn groups devoted to building a safety culture and to improving safety culture. It seems that almost everyone agrees that building a strong safety culture is the goal of our profession.

Now, when you ask people what safety culture is, this is where it starts to get fuzzy. Sometimes you hear people say that it’s how people in the organization behave in terms of safety, i.e., the decisions they make, the risks they take, etc. Some people refer to culture as the sum of people’s habits. Others use that classic definition of “the way things are done around here”. Some believe safety culture is something you have to create, whereas others think that every organization already has a safety culture that just has to be influenced.

Fair enough. So how do we build a safety culture, or influence one, to achieve the Valhalla of safety that culture promises to be? This is where we get even more confusion and disagreement. Sometimes people say it’s through engagement of workers, or putting up motivational posters (or even posters of workers’ kids, to motivate them not to get themselves killed). Sometimes it’s engagement of leadership. Maybe it’s training.

Honestly though, when you look at the approach the safety profession takes toward culture, only one thing seems clear – there is no clarity.

In the safety profession, if one of our employees faces a work situation where they lack clarity, where they are uncertain, or where the way forward is unclear, almost without question we would advise them to get clarity by consulting an expert on that task. It’s funny that most people reading this agree with that strategy, yet the safety profession fails to take its own advice! If we really want to know what culture is, why haven’t we talked with the scientific discipline that coined the term and whose whole purpose is to study culture – cultural anthropology?

Now, don’t get us wrong, any anthropologist will be the first to tell us that they haven’t figured out culture, nor do they have one universal definition that they all agree on (one definition we like for its brevity is “shared patterns of learned behavior”). But they can at least tell us where culture comes from, how it affects individuals, and how it changes. Isn’t that exactly what we’re looking for in the safety profession?

For example, looking through an anthropological lens, culture is an adaptation that groups of humans developed to solve problems in their environments. Our human ancestors gained an advantage by working together, which gave them access to food, safety, and opportunities to reproduce that those who went it alone did not have. Culture helped keep these groups together and influenced the behavior of individuals within the group in ways that helped secure that advantage.

If this is true, there are some general implications. First, culture is context specific. Culture developed because it helped people solve the problems they faced in their environments. This means a cultural adaptation is specific to that environment, and the culture may not work in a different one. This leads to the next implication.

Culture is neither good nor bad. There is no universally good or bad culture. Cultures may be “better” (more precisely, more adaptive) because they help their members adapt to their environment more efficiently. But that same culture may not work in another environment. So you cannot compare one culture to another without considering their contexts as well.

Now this may sound like weird academic speak, but if the above is true, it has some important implications for the safety profession. Here are some we’ve thought of, although there may be more:

  1. All organizations have a culture. It naturally emerges as groups within the organization adapt to solve the problems they face. So it is impossible to “create” a culture without considering what culture already exists.
  2. Perception surveys alone are poor measures of culture. Surveys measure the perceptions of individuals at a given time. They strip those perceptions away from their context, which means you’ve completely lost the ability to understand culture. Certainly those perceptions are a part of the culture, but if you stop there, it’s like studying hand clapping by looking at only one hand.
  3. Comparing cultures using some normative scale makes no sense. Culture is context specific. So saying that one culture is better than another only works if both cultures are operating in exactly the same environment. If not then you’re comparing apples and oranges. Any organization that compares your culture to someone else’s to tell you which is better is simply not measuring culture. (This doesn’t necessarily mean that the comparison is useless, but it does mean we should figure out what we’re measuring before we change anything.)
  4. To understand the culture, understand the context. One of the most common forms of research in anthropology is ethnographic research, which involves observing the culture in its environment. Why don’t we do this in safety? Why don’t safety professionals and managers who want to understand the culture get out and see the world as the workers see it? Identify the problems workers have to solve, the realities they face, and you’re well on your way to understanding the culture.
  5. If you want to influence a culture for the better, find ways to help the culture better adapt to its environment, i.e., help the culture more effectively solve problems and negotiate tradeoffs. To put it another way, increase the potential for resilient performance. This could involve ensuring adequate resources (or that scarce resources are used properly) or helping individuals identify relevant cues (and disregard irrelevant ones) so they can make better decisions. Provide more adaptive capacity for the real and potential problems workers face and you will facilitate the creation of resilience, which strengthens the culture in that environment.