Kindle | Hardcover | Audiobook

 

Black Box Thinking is the willingness and tenacity to investigate the lessons that often exist when we fail, but which we rarely exploit. It’s about creating systems and cultures that enable organizations to learn from errors, rather than being threatened by them.

 

In aviation, the black box records the flight so that, in the event of a disaster, investigators can thoroughly examine what went wrong.

They then implement changes and take precautionary action against future accidents. This is why air accidents have fallen so dramatically over the last few decades.

 

Everything we know in aviation, every rule in the rule book, every procedure we have, we know because someone somewhere died.

These are lessons literally bought with blood. We have to preserve that institutional knowledge and pass it on to future generations. Forgetting these lessons after we have learned them would be a moral failure.

 

The reason is not usually laziness or unwillingness. More often, the necessary knowledge is not being translated into a simple, usable, systematic form.

If the only thing people in aviation did was issue page-long bulletins, it would be like expecting pilots to study almost 7,000 medical journal articles per year. That is why the industry distills the information into its practical essence.

 

By contrast, the healthcare industry keeps repeating the same mistakes, often with tragic consequences.

Doctors often cover up errors with generic messages like “Something happened…”. The profession has unconsciously enforced a culture where mistakes are seen as weaknesses that cannot be tolerated.

 

When a doctor makes a mistake, it can result in someone else’s death. When a pilot makes a mistake, it can result in the pilot’s own death. That, the argument goes, is why pilots are better motivated than doctors to reduce errors. But this analysis misses the crucial point.

Remember that pilots died in large numbers in the early days of aviation. This was not because they lacked the incentive to live, but because the system had so many flaws. Failure is inevitable in a complex world. This is precisely why learning from mistakes is so crucial.

But in healthcare, doctors aren’t supposed to make mistakes. The system is set up to ignore and deny rather than investigate and learn.

 

Society as a whole has a deeply contradictory attitude to failure.

We’re quick to blame others for their mistakes, yet just as keen to conceal our own. The result is simple: it obliterates openness and spawns cover-ups. It destroys the vital information we need in order to learn.

 

When our deeply held beliefs are challenged by evidence, we’re more likely to reframe the evidence than to alter our beliefs.

Cognitive dissonance explains why: we feel discomfort when our beliefs run counter to our behavior, or when new evidence contradicts what we believe.

 

Mistakes stimulate growth only when we see them and use the feedback to improve.

A golfer gradually improves their game with trial-and-error practice. Now imagine they practiced in the dark. They could practice for years, even a lifetime, and they’d never know where the ball landed. So turn the lights on and look for the ‘error signal’.

 

A cornerstone to success is a progressive attitude to failure.

Traditionally we punish errors. Punitive action leads to blame and cover-ups, hiding valuable information that could be used to learn and to generate solutions that would prevent further mistakes. As a result, we end up practicing golf in the dark.

 

A closed loop is where failure doesn’t lead to progress because information about errors is misinterpreted or ignored altogether.

An open loop leads to progress because feedback is shared generously and rigorously acted upon.

 

Attention, as it turns out, is scarce.

If you focus on one thing, you lose awareness of other things. Investigators realized that crew members were losing their perception of time during crisis situations.

 

The problem is twofold: not just cognitive bias, but also a fixed mindset.

People with a fixed mindset ignore mistakes because they feel threatened: mistakes are a sign that they’re inferior and always will be. People with a growth mindset, on the other hand, are interested in their mistakes because they picture errors differently. They believe they can develop their abilities through hard work.

 

Ironically, the more famous the expert, the less accurate their predictions tended to be.

Why is this? Cognitive dissonance is the answer. It’s those who are most publicly associated with their predictions, whose egos are bound up with their expertise, who are most likely to reframe their mistakes and least likely to learn from them.

The idea here is that the learning advantage of adapting to a mistake is outweighed by the reputational disadvantage of admitting to it. But the problem is not always the external incentive structure; it’s the internal one. It’s our sheer inability to admit our mistakes, even when we’re incentivized to do so.

After all, you’d expect that the higher you go in a company, the less you’d see the effects of cognitive dissonance. Aren’t the people who reach the top supposed to be rational, forensic and clear-sighted? Isn’t that supposed to be their defining characteristic? In fact, the opposite is the case.

 

Studies concluded that doctors blocked mistakes from entering conscious thought and narrowed the definition of a mistake so that errors effectively disappeared or were seen as inconsequential.

Doctors didn’t want to admit mistakes to themselves. They had spent years reaching high standards of performance, and their self-esteem is bound up with their clinical competence. They came into medicine to reduce suffering, not to increase it, yet now they were confronted with having killed a healthy patient. Just think how desperate they’d have been to reframe the fatality.

Admitting error becomes so threatening that, in some cases, surgeons would rather risk killing a patient than admit they might be wrong. Doctors hide their mistakes from patients, from other doctors, and even from themselves.

 


Doctors trusted in the power of bloodletting.

When a patient died, they believed it was because he was so ill that not even bloodletting could save him. But when he lived, that confirmed that bloodletting had saved him. Think how many such success stories circulated in the medieval world.

What the patients and doctors didn’t see was what would have happened if the procedure hadn’t been performed.

Sure, we can speculate about what would happen and make decent guesses, but we don’t really know. So let’s say patients are randomly divided into two groups. Provided the sample size is big enough, the two groups are likely to be similar. The only difference between them is that one gets the treatment and the other doesn’t. Those who don’t receive the treatment are called the control group.

This is known as a randomized controlled trial (RCT). Many of the patients who received bloodletting recovered, so the treatment looks successful. But many more in the control group recovered. The reason is simple: the body has its own powers of recuperation. By comparing the two groups, it’s possible to say that bloodletting, on average, kills rather than cures.

In the same way, take the example of a website redesign. The problem is knowing whether the change in design increased sales, or whether sales were lifted by something else. Suppose you randomly direct users to either the new or the old design; you can then measure whether they buy more with one or the other. Randomization filters out all the other variables, such as interest rates, weather, and competition.
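
To make that concrete, here is a minimal sketch (mine, not the book’s) of how such a randomized website test could be analysed in Python. The visitor counts, the 50/50 routing rule and the function names are all illustrative assumptions; the test simply asks whether the gap in conversion rates between the two randomly assigned groups is bigger than chance alone would explain.

```python
import math
import random

def assign_design(user_id: int) -> str:
    """Randomly route a user to the old or new design (hypothetical 50/50 split).
    Seeding with the user id keeps the assignment stable across repeat visits."""
    return "new" if random.Random(user_id).random() < 0.5 else "old"

def two_proportion_z_test(buyers_a, visitors_a, buyers_b, visitors_b):
    """Two-sided z-test: is the difference in conversion rates bigger than chance?"""
    p_a = buyers_a / visitors_a
    p_b = buyers_b / visitors_b
    p_pool = (buyers_a + buyers_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided tail probability
    return p_a, p_b, z, p_value

if __name__ == "__main__":
    # Hypothetical results after routing 20,000 visitors through assign_design().
    old_buyers, old_visitors = 480, 10_000
    new_buyers, new_visitors = 560, 10_000

    p_new, p_old, z, p = two_proportion_z_test(
        new_buyers, new_visitors, old_buyers, old_visitors
    )
    print(f"old design conversion: {p_old:.2%}")
    print(f"new design conversion: {p_new:.2%}")
    # A small p-value suggests the redesign itself, not interest rates or the
    # weather, is the likely cause of the difference.
    print(f"z = {z:.2f}, p-value = {p:.4f}")
```

The key point is the random assignment: because users land in each group by chance, any other influence on sales should hit both groups roughly equally, so a clear difference can be credited to the design itself.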

 

We try to make our memories fit what we know rather than what we actually saw.

It turns out that the testimony of witnesses is not always accurate, even when they believe they are telling the truth.

  • They did remember seeing him there.
  • But they didn’t actually see him there.

These are two quite different things.

 

Ready, Fire and Aim

You construct a perfect rifle, you create a model of how the bullet will be affected by wind and gravity, you do your best to get the strategy just right. Then you calibrate the rifle, pull the trigger and watch as the bullet sails towards the target.

This approach is flawed for two reasons. Firstly, the real world contains far more complexity than just wind and gravity; there are endless variables and interdependencies. Secondly, by the time you pull the trigger, the target will have moved. So: Ready, FIRE and then AIM.

 

Debate, or even criticism, appears to stimulate more creative ideas.

Cultures that permit or even encourage such expressions of differing viewpoints will stimulate more innovation.

 

Creative epiphanies happen in one of two circumstances.

  1. When we’re switching off, taking showers, walking, daydreaming, sipping a cold beer…
  2. When we’re being contradicted by others (some people intentionally design exercises that contradict reality)

 

When you file a patent, somebody has almost always been there before you.

They may have had a eureka moment before you. They may even have had an early prototype. But none of them made it work. Mine is statistically different. That’s my decisive advantage. – James Dyson (inventor of the bagless vacuum cleaner)

 

Blaming increases mistakes and impedes learning.

The problem is often not a lack of focus but a consequence of complexity. Increasing punishment, in this context, doesn’t reduce mistakes; it reduces openness and drives mistakes underground. Remember the story of Libyan Arab Airlines Flight 114.

 

Last but not least, remember success is just the tip of the iceberg.

Beneath the surface of success, outside our view and often outside our awareness, lies a mountain of necessary failures. So fail early, fail fast and succeed!

Kindle | Hardcover | Audiobook


If you got something from this Black Box Thinking summary, please check out my other self-development summaries here.


Kyaw Wai Yan Tun

Hi, I'm Wai Yan. I love designing visuals and writing insightful articles online. I see it as my way of making the world a more beautiful and insightful place.