
Nobody Wins the Blame Game

When things go wrong, humans love to find someone to blame. It often seems to make sense, because it feels both fair and effective — people who screw up get punished and everyone learns not to make the same mistake again.


But what if we’ve been getting this all wrong? Pointing the finger of blame might feel satisfying, but it can stop us from learning. And when you consider that we all make mistakes, that’s a lot of valuable learning we could be missing out on.


Fortunately, there is an alternative to the blame game: “Just Culture”. An impressive array of organisations and industries believe that it represents a fairer and more effective way to deal with and learn from our errors.


Just Culture proposes that instead of obsessing over “Who’s to blame?”, we should shift our attention to “How did this happen?”


Watch my short explainer to find out more…



Why blame is a losing game


You don’t need to look far to see blame culture in action. When every single flight in the US was grounded earlier this year, the cause was pinpointed to a mistake by a single engineer. A few weeks earlier, international panic was triggered by an erroneous news story claiming that Russian missiles had been launched into Poland. The error was blamed on bad reporting by one journalist, who was subsequently fired.


In the video, I talked about developer Dave, who works for a bank. To recap, Dave caused chaos when he accidentally sent a message to customers warning that they had been victims of a data breach. He was only meant to be testing the email system. The consequences of Dave’s mistake were significant for the business. Shareholders demanded that heads should roll, and Dave’s was first on the block.


This is the blame culture playbook: culprit identified, disciplinary action taken, problem solved (putting aside an employee’s previous track record and the lost potential for them to help fix the problem).


Now, imagine that you and Dave worked in the same team. A few months later, you make a similar mistake and a test email finds its way to customers’ inboxes. Thankfully, this is just a routine marketing email rather than a data breach warning, so there are no angry calls from bosses and shareholders demanding action. What do you do next?


As leading academic and part-time pilot Sidney Dekker sees it, you’ve got two options, and neither of them is good:


1. Report the mistake and accept that you’re going to get into trouble. You might lose your job or receive a reprimand. Either way, you’ll carry the stigma of being singled out for messing up.

2. Stay quiet and keep your fingers crossed that no one notices.


We might all like to think that we’d take option one. But, in a blame culture where people aren’t confident of being treated fairly, you could be forgiven for being tempted by the alternative. After all, we saw what happened to Dave.


In his book Just Culture, Dekker explains that holding people accountable and blaming them can be two quite different things. A blame culture creates a climate of fear which doesn’t help anybody do their job better and makes them more likely to cover up mistakes. The opposite of this is a blame-free or no-fault system. Dekker believes that such a system, by making people more inclined to give an honest account of failures and to take responsibility for fixing them, actually holds those who make mistakes more accountable.


Ultimately, blame is the enemy of accountability and can be an obstacle to putting mistakes right.



Bad apples or broken systems?


In The Field Guide to Understanding Human Error, Dekker shares:


Underneath every simple, obvious story about ‘human error’, there is a deeper, more complex story about the organisation.

Why do accidents happen? Is it because of a small number of "bad apples" – a minority of people who are careless, error-prone or don’t follow the rules?*


Or is it the case that everyone makes mistakes, and sometimes these mistakes escalate into accidents when a process or system fails?


If you favour the bad apples explanation, then blame culture makes sense. As the old saying goes, “one bad apple spoils the barrel.” Stop mistakes from happening by getting rid of those who make them.


The bad apples theory may be appealingly simple. But it doesn’t work. This is because we all make mistakes (“to err is human,” after all). And punishing mistakes doesn’t stop them from happening. In fact, it might even make us more prone to errors.


We’ve already established how a blame culture reduces accountability. Dekker also talks about the impact of mistakes on the "second victim". If the first victim is the person who suffers the direct consequences of an accident, the second victim is the person who caused it. Or, at least, the person who ends up getting the blame because they were the last link in a chain of errors.



Second victims


A few years ago, morale at Mersey Care, an NHS mental health trust, was very low. Staff do the difficult and important job of caring for some of the most vulnerable people in society. The work is challenging as it is, but the stress reached breaking point for some when they were the subject of internal investigations. Employees wanted to fix issues and provide the best care for patients – it’s why they do the job – but the organisation’s approach to dealing with these issues left them feeling stigmatised and hurt. Even after investigations had concluded, the psychological toll on them remained.


One of the team had an epiphany reading Dekker’s book on Just Culture.


Dekker accepted an invitation to visit the trust and help guide them towards becoming a Just Culture workplace. In a film made during the visit, employees spoke with remarkable candour about the guilt, shame and depression they suffered after being on the receiving end of complaints and allegations.


The "bad apples" blame culture approach had failed. Senior management concluded that no matter how thorough their investigations were, they were failing to get to the root cause of mistakes.


Dekker posits that while the top priority must always be supporting the first victim (the person who suffered when things went wrong), it is important to support the second victim too (the person accused of making the mistake).


Mersey Care believe this approach has been transformational. By better supporting "second victims" they are learning more about how to improve care. Staff are actively involved in helping others to avoid their mistakes and are resuming work without the psychological burden of the blame game still hanging over them.


Their local MP Rosie Winterton summed it up:


We actually keep the experience, we learn. Patients will be safer and treated better. We all do well. What isn’t there to get?

Explaining disasters with Swiss cheese


In his book Outliers, Malcolm Gladwell states that the typical plane crash is the result of seven consecutive human errors. Taken individually, any one of these errors would have been unlikely to do much damage. It’s the one-in-a-million-or-more unlucky combination of all seven that brings a plane down.


Mistakes are inevitable, and the role of systems is to prevent them from becoming disasters. Which raises the question: how do we make sure our systems do this?


James Reason, Emeritus Professor of Psychology at the University of Manchester, is a world-leading expert in human error. He likens the system failures which cause disasters to Swiss cheese.


The Swiss cheese model of accident causation illustrates that, although many layers of defence lie between hazards and accidents, there are flaws in each layer that, if aligned, can allow an accident to occur. In the classic diagram, three hazard vectors are stopped by the defences, but one passes through where the "holes" are lined up.

This model is now widely used in risk management. In Humble Pi, Matt Parker explains:


The Swiss Cheese model looks at how ‘defences, barriers, and safeguards may be penetrated by an accident trajectory’. This accident trajectory imagines accidents as similar to a barrage of stones being thrown at a system: only the ones which make it all the way through result in a disaster. Within the system are multiple layers, each with their own defences and safeguards to slow mistakes. But each layer has holes. They are like slices of Swiss cheese… Sometimes your cheese holes just line up.
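To put rough numbers on the analogy: treat each slice of cheese as an independent barrier that fails to stop a given mistake with some small probability, so a full accident trajectory requires every layer to fail at once. Here is a minimal Python sketch of that idea. It is my own illustration, not a calculation from Reason, Parker or Gladwell; the 14% per-layer figure is an assumption chosen so that seven layers roughly echo Gladwell’s one-in-a-million observation.

```python
def accident_probability(p_hole: float, layers: int) -> float:
    """Chance a mistake passes through every layer of defence,
    assuming each layer independently misses it with probability p_hole."""
    return p_hole ** layers

# Each extra slice of cheese cuts the risk multiplicatively.
for layers in range(1, 8):
    print(f"{layers} layer(s): {accident_probability(0.14, layers):.2e}")
```

Run it and the seven-layer case comes out at 0.14⁷, about 1.05 in a million: individually survivable errors, catastrophic only in combination.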

Since James Reason developed his Swiss cheese model in 1990, ever more sophisticated studies of accidents have identified an important addition: sometimes the layers and steps within a system can themselves be the cause of accidents.


Reason’s original analogy has since evolved into a "hot cheese" model. This turns the cheese on its side, so the slices lie flat with mistakes raining down from above. Only mistakes which make it all the way through become accidents. But now there is a new risk: the slices of cheese are hot and liable to drip down too. Adding more steps to a system designed to minimise risk can have the opposite effect.
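The qualitative point is easy to see by extending the earlier sketch. Suppose each layer not only blocks incoming mistakes but can also "drip", introducing a failure of its own with some small probability. This is a toy model of my own construction to illustrate the idea, and both parameters (a 14% pass-through rate and a 1% drip rate per layer) are illustrative assumptions, not figures from Reason or Parker.

```python
def total_risk(p_hole: float, p_drip: float, layers: int) -> float:
    """Toy 'hot cheese' risk model.

    A mistake slips past all layers with probability p_hole ** layers,
    and each layer independently causes a new failure ('drips') with
    probability p_drip. Treating the two failure modes as independent,
    total risk is the chance that at least one of them occurs.
    """
    pass_through = p_hole ** layers
    no_drips = (1 - p_drip) ** layers
    return 1 - (1 - pass_through) * no_drips

# With a 1% drip rate per layer, risk bottoms out after a few layers
# and then starts climbing again as more process is added.
for layers in (1, 2, 3, 4, 7, 10, 14):
    print(f"{layers:2d} layers: {total_risk(0.14, 0.01, layers):.4f}")
```

In this toy setup the risk is minimised at three layers; beyond that, each extra safeguard adds more drip than it blocks. Keep that in mind for the story that follows.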


A system failure for your consideration


The 2017 Academy Awards will forever be remembered for one of the most excruciating gaffes in showbiz history. The ceremony culminated with Warren Beatty and Faye Dunaway awarding the coveted Best Picture Oscar to La La Land. The team behind the musical were part-way through their acceptance speech when it was declared that there had been a mistake — the winner was actually the Barry Jenkins-directed film Moonlight.



So how did Beatty and Dunaway end up opening an envelope containing the name of the wrong film? This wasn’t just a careless oversight — the Academy Awards organisers are so determined to get their envelopes right that they put the accountancy firm PricewaterhouseCoopers (PwC) in charge of them. PwC operate a complex system which includes a series of "back-up envelopes" for every category. Supposedly, nothing is left to chance.


But it was a back-up envelope from the previous award which found its way into the hands of Beatty and Dunaway at exactly the wrong moment. An envelope, remember, which had only been printed to prevent this kind of embarrassing blunder from ever happening.


After the debacle, PwC declared that the two people responsible for envelopes on the night would not be fulfilling that role again. They also introduced additional safeguards, creating an even more complex process of envelope checks (more slices of Swiss cheese, if you will) to avoid future mistakes.


What could go wrong?



Recommended links and further reading


  • Eric Cropp Discusses Medical Error that Sent Him to Prison (Pharmacy Times) A mistake made by pharmacist Eric Cropp led to the death of two-year-old Emily Jerry. He was struck off, vilified and imprisoned for manslaughter. Since his release, he has become an advocate for patient safety. In a remarkable example of Just Culture being put into practice, Emily’s father now works with him to give talks and workshops aimed at improving systems and processes for doctors, nurses and pharmacists. “Because of Cropp and Christopher Jerry speaking together and separately, many hospitals have made changes to reduce their error rates.”


  • Blameless PostMortems and a Just Culture (Etsy) An old but interesting account of how the online shop Etsy created a Just Culture, starting with "blameless post-mortems," which encourage engineers to give a detailed account of errors without fear of punishment. Etsy’s former Chief Technology Officer John Allspaw writes: “A funny thing happens when engineers make mistakes and feel safe when giving details about it: they are not only willing to be held accountable, they are also enthusiastic in helping the rest of the company avoid the same error in the future. They are, after all, the most expert in their own error.”


  • Sidney Dekker’s 2007 book Just Culture: Balancing Safety and Accountability. The second edition was used as the source material for this newsletter and video. He has since produced a third edition: Just Culture is an evolving model.


  • The 2017 Oscars debacle is unpacked in more detail in a great episode of Tim Harford’s Cautionary Tales podcast.



* For this article, our hypothetical "bad apples" are people who make innocent mistakes without intending harm. In other contexts, the bad apples metaphor can have a wider and more sinister application: people whose wrongdoings are committed with malicious intent. Former US Defence Secretary Donald Rumsfeld once famously blamed horrific war crimes committed by US troops at Abu Ghraib on "a few bad apples". Just Culture would not look to excuse or explain this type of behaviour. For more on that topic, this article in Scientific American is well worth a read: https://blogs.scientificamerican.com/cross-check/are-war-crimes-caused-by-bad-apples-or-bad-barrels/


