When you think about the word "failure," what comes to mind? Embarrassment? Shame? Catastrophe? It’s unlikely the term conjures up any positive connotations.
And yet, Matthew Syed’s 2015 book Black Box Thinking asks us to consider failure as a starting point for making things better.
Black box thinking is a mindset named after the black box recorder fitted to every commercial aircraft. Just as the black box in an aeroplane is there to be salvaged after a crash – to review flight data, listen back to the pilots’ conversations and understand the mistakes that led to an accident – so too is black box thinking focused on analysing mistakes and uncovering the blind spots that trigger missteps in both our personal and professional lives.
With stories from sport (unsurprising, given Syed’s background as a table tennis champion), aviation and healthcare, Black Box Thinking provides a fresh take on some uncomfortable truths. So, fasten your seatbelts and prepare for take-off…
What is black box thinking?
Black box thinking begins by focusing on failure. Not in order to apportion blame or ridicule, but to approach errors with an open mind: a determination to get to the root cause of a problem, and to fix it.
So far, so obvious, you might think. But failure can often be clouded by ego ("I’m too important to be wrong"), denial ("it was someone else’s fault") and distraction ("Look! The real problem is over there..."). The easy option is to accept the first explanation that comes along and take it on trust that it won’t happen again.
In our personal lives, this might not necessarily be a huge deal: if your partner fails to put the bins out and blames the weather, a forensic interrogation might not be needed. But what if the failure occurs during a surgical operation?
The beauty of the black box as a metaphor is that it’s about a determination to discover what really happened in the lead-up to failure, rather than what people want you to believe. Having this objective clarity then creates the conditions for understanding and progress. To this end, black box thinking cuts through the emotion and strives for a practical fix.
Now, as much as I love a good metaphor, let’s start by looking at the home of the actual black box – and probably the gold standard of learning from failure: the aviation industry...
Supersonic safety
The pragmatic attitude to failure in the aviation industry is epitomised by the black box – a physical reminder of the imperative of learning the truth behind mistakes.
And while we’re talking truths, did you know that black boxes aren’t actually black? They’re fluorescent orange, which makes them easier to find in wreckage. A small detail, perhaps, but one that captures the mindset of always prioritising practicality and safety.
Syed offers the example of the US Army’s Boeing B-17 which, in the 1940s, kept crashing inexplicably on landing. A Yale psychologist named Alphonse Chapanis was brought in to investigate and found that the switch linked to the wheels and the one linked to the landing flaps were a) alongside each other on the dashboard and b) identical in size and shape. So, in difficult weather conditions, approaching a tricky landing, pilots flipped the wrong switch – retracting the wheels instead of adjusting the flaps – and crashed onto the runway with catastrophic results.
Chapanis set about changing the shape of the controls to resemble the equipment they were linked to:
A small rubber wheel was attached to the landing-gear switch and a small flap shape to the flaps control. The buttons now had an intuitive meaning, easily identified under pressure. What happened? Accidents of this kind disappeared overnight.
An incredible result, right? But it’s worth pausing to consider what might’ve happened without a black box mindset…
US Army chiefs could have rejected the idea of involving an outsider. They could have assumed that their experience and rank automatically gave them sufficient expertise. They could have dismissed Chapanis’ findings, or buried his insights somewhere in a report without ever making a change to the flight console. History is littered with behaviour like this.
But thankfully, aviation has continued its successful tradition of black box thinking and sensible fixes:
In 2013, there were 36.4 million commercial flights worldwide carrying more than a billion passengers, according to the International Air Transport Association. Only 210 people died. For every one million flights on Western-built jets there were 0.41 accidents – a rate of one accident per 2.4 million flights.
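It’s worth pausing on how those two figures fit together: they’re the same statistic expressed two ways, with the flights-per-accident number simply being the reciprocal of the accident rate.

$$\frac{1{,}000{,}000 \text{ flights}}{0.41 \text{ accidents}} \approx 2{,}440{,}000 \approx 2.4 \text{ million flights per accident}$$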
Where else could you hope to have an accident rate as low as 1 in 2.4 million? Surely all sectors are desperate to embrace the black box mindset and mimic such an outstanding safety record, aren’t they?
"Complications"
But in the healthcare sector, the stats are far from, er, healthy.
Syed references a report by the UK National Audit Office from 2005, which estimated that up to 34,000 people per year die due to human error. And the statistics are no less alarming in the US:
In 2013 a study published in the Journal of Patient Safety put the number of premature deaths associated with preventable harm at more than 400,000 per year. (Categories of avoidable harm include misdiagnosis, dispensing the wrong drugs, injuring the patient during surgery, operating on the wrong part of the body, improper transfusions, falls, burns, pressure ulcers, and postoperative complications).
This, for Syed, represents a culture trapped in a "fixed" mindset, where errors are often met with blame and denial. He points to “a tendency for evasion” amongst some medical professionals: failure gets cloaked in euphemisms like “technical error”, “complication” or “unanticipated outcome”.
These terms are too vague to lead to any meaningful improvement of, say, safety protocols. And yet, repeated with enough confidence and authority, they keep scrutiny at arm's length, leaving the real failings largely unexamined.
Failure Week
So, will we ever get to a stage where black box thinking is the norm?
Learning the lessons from the aviation industry is a great place to start. But failure is such a tangled web of psychology, history, language, politics... you name it! It can’t be fixed by a training course or a TED Talk. If we lifted the lid on the human black box, we’d likely conclude that developing a healthy relationship with failure needs to be addressed much earlier in life.
After all, isn’t it the case that most young children are fine with failure? Making a right old mess, or stacking building blocks until the tower teeters and falls, is hilarious for a toddler. In fact, experimenting and fixing mistakes afterwards can be rewarding and fun. It’s all in the spirit of play.
But as innocence gives way to experience, we become aware of factors like peer pressure and standardised testing. Fear of failure – as something that needs to be avoided or denied – rears its head.
A high school in Wimbledon, London, tried tackling this head-on. Headmistress Heather Hanbury felt her students “were performing well in exams, but many were struggling with non-academic challenges, and not reaching their creative potential, particularly outside the classroom.” Her winning solution? Initiating "Failure Week", where failure was celebrated through workshops and assemblies. She even highlighted failure role models:
She showed YouTube clips of famous people practising: i.e. learning from their own mistakes. She told students about the journeys taken by the likes of David Beckham and James Dyson so they could have a more authentic understanding of how success really happens.
Pupils were encouraged to take risks and then learn about why something didn’t work. Their focus was on “failing well… on being good at failure” – in other words, becoming black box thinkers.
Intelligent failures
Harvard academic Amy Edmondson suggests there are three types of failure: simple failures (which we might categorise as "mistakes"), complex failures ("accidents") and intelligent failures ("discoveries").
In an interview with Big Think, Edmondson claimed:
If you want to have more intelligent failures in your life, in your work, essentially, you have to think like a scientist. They have trained themselves to not just tolerate failure but to really welcome the lessons that each failure brings.
So maybe the more we can embrace the concept of intelligent failures – and the opportunities to genuinely learn from our mistakes – the less likely we are to feel ashamed and deny or cover up the things that we did wrong.
To err is human, after all, and behind every stunning invention or discovery – like the device you’re reading this on now, the latest surgical breakthrough or the modern airliner – lie a thousand intelligent, human failures.
Let’s make that part of our thinking.