Matthew Syed – Black Box Thinking

We will see that blame is, in many respects, a subversion of the narrative fallacy: an oversimplification driven by biases in the human brain.

There’s no point in failing and then dealing with it by pretending it didn’t happen, or blaming someone else. That would be a wasted opportunity to learn more about yourself and perhaps to identify gaps in your skills, experiences or qualifications.

Almost every society studied by historians has had its own ideas about the way the world works, often in the form of myths, religions and superstitions. Primitive societies usually viewed these ideas as sacrosanct and often punished with death those who disagreed. Those in power didn’t want to be confronted with any evidence that they might be wrong. As the philosopher Bryan Magee put it: ‘The truth is to be kept inviolate and handed on unsullied from generation to generation. For this purpose, institutions develop – mysteries, priesthoods, and at an advanced stage, schools.’ Schools of this kind never admitted new ideas and expelled anyone who attempted to change the doctrine. But at some point in human history this changed. Criticism was tolerated and even encouraged. According to the philosopher Karl Popper, this first occurred in the days of the Ancient Greeks, but the precise historical claim is less important than what it meant in practice. The change ended the dogmatic tradition. It was, he says, the most important moment in intellectual progress since the discovery of language.

It turns out that many of the errors committed in hospitals (and in other areas of life) have particular trajectories, subtle but predictable patterns: what accident investigators call ‘signatures’. With open reporting and honest evaluation, these errors could be spotted and reforms put in place to stop them from happening again, as happens in aviation. But, all too often, they aren’t. It sounds simple, doesn’t it? Learning from failure has the status of a cliché. But it turns out that, for reasons both prosaic and profound, a failure to learn from mistakes has been one of the single greatest obstacles to human progress.

Studies have shown that we are often so worried about failure that we create vague goals, so that nobody can point the finger when we don’t achieve them. We come up with face-saving excuses, even before we have attempted anything. We cover up mistakes, not only to protect ourselves from others, but to protect us from ourselves. Experiments have demonstrated that we all have a sophisticated ability to delete failures from memory, like editors cutting gaffes from a film reel.

A closed loop is where failure doesn’t lead to progress because information on errors and weaknesses is misinterpreted or ignored; an open loop does lead to progress because the feedback is rationally acted upon.

This, then, is what we might call ‘black box thinking’. For organisations beyond aviation, it is not about creating a literal black box; rather, it is about the willingness and tenacity to investigate the lessons that often exist when we fail, but which we rarely exploit. It is about creating systems and cultures that enable organisations to learn from errors, rather than being threatened by them.

Science is not just about confirmation; it is also about falsification. Knowledge does not progress merely by gathering confirmatory data, but by looking for contradictory data.

It is by testing our ideas, subjecting them to failure, that we set the stage for growth.

Even the most beautifully constructed system will not work if professionals do not share the information that enables it to flourish.

We cannot learn if we close our eyes to inconvenient truths, but we will see that this is precisely what the human mind is wired up to do, often in astonishing ways.

‘Cognitive dissonance’ is the term Festinger coined to describe the inner tension we feel when, among other things, our beliefs are challenged by evidence. Most of us like to think of ourselves as rational and smart. We reckon we are pretty good at reaching sound judgements. We don’t like to think of ourselves as dupes. That is why when we mess up, particularly on big issues, our self-esteem is threatened. We feel uncomfortable, twitchy. In these circumstances we have two choices. The first is to accept that our original judgements may have been at fault. We question whether it was quite such a good idea to put our faith in a cult leader whose prophecies didn’t even materialise. We pause to reflect on whether the Iraq War was quite such a good idea given that Saddam didn’t pose the threat we imagined. The difficulty with this option is simple: it is threatening. It requires us to accept that we are not as smart as we like to think. It forces us to acknowledge that we can sometimes be wrong, even on issues on which we have staked a great deal. So, here’s the second option: denial. We reframe the evidence. We filter it, we spin it, or ignore it altogether. That way, we can carry on under the comforting assumption that we were right all along. We are bang on the money! We didn’t get duped! What evidence that we messed up?

Festinger’s great achievement was to show that cognitive dissonance is a deeply ingrained human trait. The more we have riding on our judgements, the more we are likely to manipulate any new evidence that calls them into question.

Psychologists often point out that self-justification is not entirely without benefits. It stops us agonising over every decision, questioning every judgement, staying awake at night wondering if getting married/taking that job/going on that course was the right thing to do. The problem, however, is when this morphs into mindless self-justification: when we spin automatically; when we reframe wantonly; when failure is so threatening we can no longer learn from it.

Intelligent people are not immune from the effects of cognitive dissonance. This is important because we often suppose that bright people are the most likely to reach the soundest judgements. We regard intelligence, however defined, as the best way of reaching the truth. In reality, however, intelligence is often deployed in the service of dissonance-reduction.

As the philosopher Karl Popper wrote: ‘For if we are uncritical we shall always find what we want: we shall look for, and find, confirmations, and we shall look away from, and not see, whatever might be dangerous to our pet theories. In this way it is only too easy to obtain . . . overwhelming evidence in favour of a theory which, if approached critically, would have been refuted.’

Closed loops are often perpetuated by people covering up mistakes. They are also kept in place when people spin their mistakes, rather than confronting them head on.

Often, failure is clouded in ambiguity. What looks like success may really be failure and vice versa. And this, in turn, represents a serious obstacle to progress. After all, how can you learn from failure if you are not sure you have actually failed?

When we are presented with evidence that challenges our deeply held beliefs, we tend to reject the evidence or shoot the messenger rather than amend our beliefs.

Criticism surfaces problems. It brings difficulties to light. This forces us to think afresh. When our assumptions are violated we are nudged into a new relationship with reality.