There’s a right way and a wrong way to fail
The stigma that surrounds failure needs to go. The surest way to avoid future failure is to embrace and learn from past failures.
I have been thinking about errors, mistakes and failures ever since I traded my first stock decades ago. Good traders expect to be wrong, but that attitude is surprisingly rare in business. That is a shame, because a healthy outlook on failure would benefit corporations, governments—just about everyone.
Let’s consider how we can better incorporate data into our processes, open versus closed approaches, and how we can learn to fail better.
If data is involved, then survivorship bias is not far behind. My favourite example involves Abraham Wald, a mathematician at Columbia University and a member of the War Department’s Statistical Research Group during World War II. In How Not to Be Wrong: The Power of Mathematical Thinking, Jordan Ellenberg describes how Wald addressed the challenge of armouring bombers so they could survive the fearsome attacks of fighter planes and anti-aircraft fire. The Center for Naval Analyses had performed a study showing the damage patterns of returning aircraft. Its recommendation was to add armour to the areas that showed the most damage: the planes’ wings, fuselage and tail. Wald rejected that, noting that if a plane could return with its wings shot up, that was not where armour was needed. Instead, he advised considering the larger data set of all planes, especially the ones that did not return. “The armour doesn’t go where the bullet holes are. It goes where the bullet holes aren’t,” he explained. “On the engines.”
High stakes make aviation an excellent subject for the study of failure. In other fields, errors may be subtle, and the results not recognized for years. When there is a flying failure, planes fall out of the sky, and footage of the wreckage is on the news.
Matthew Syed points this out in Black Box Thinking: Why Most People Never Learn From Their Mistakes (But Some Do). Aviation is an open, data-rich system, with statistics going back a century: In 1912, the US Army had 14 pilots, and even before World War I, more than half of them (eight) had died in crashes. The Army set up an aviation school to teach pilots how to fly more safely. Unfortunately, the school itself had a 25% mortality rate.
Fast-forward a century. Syed observed that in 2013, there were 36.4 million commercial flights worldwide carrying three billion passengers. That year, there were only 210 commercial aviation fatalities. For some context, that works out to 0.41 accidents per million flights—an average of 2.4 million flights for every single accident. Last year (2017), zero commercial airline passengers died. That is an astounding improvement over the course of a century.
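The relationship between the two rates Syed cites can be checked with a line of arithmetic. A minimal sketch, using only the figures quoted above:

```python
# Figures from Syed's 2013 data, as quoted in the text above.
accidents_per_million_flights = 0.41

# A rate per million flights inverts to flights per accident.
flights_per_accident = 1_000_000 / accidents_per_million_flights

# Roughly 2.4 million flights for every single accident.
print(round(flights_per_accident / 1_000_000, 1))
```

The two statistics are the same fact stated two ways: 0.41 accidents per million flights is one accident per 2.4 million flights.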
How did the industry achieve this? By being self-critical and learning from accidents. Every crash, and every near miss, gets studied extensively. The Federal Aviation Administration requires all large commercial aircraft to have a cockpit voice recorder and a flight data recorder, creating a comprehensive and objective data set that allows for the full study of failure. Even the famed black boxes themselves are subject to exhaustive review and improvement. Today, these boxes are orange—making them much easier to spot in difficult terrain or underwater—and have submersible locator beacons to aid in their detection and retrieval from the ocean. It’s the perfect metaphor for how self-critical the industry is about safety.
Compare this with a closed system, like healthcare and hospitals. That industry has a very different approach, with vastly inferior results.
How different? Syed notes the remarkable contrast between air travel and preventable medical errors, which might result in as many as a half-million deaths in the US at a cost estimated at $17 billion a year. After heart disease and cancer, medical errors are the third-leading cause of death in America. Peter Pronovost, a clinician at Johns Hopkins Medical School, wondered how we would respond if, each day, two 747 jumbo jets fell out of the sky, killing roughly 900 people. That’s how many people die daily from medical errors.
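Pronovost’s analogy can be sanity-checked against the annual estimate. A quick sketch, using only the figures quoted above:

```python
# Pronovost's analogy: two full 747s a day, roughly 900 deaths.
deaths_per_day = 900

# Annualized, that is about 328,500 deaths a year -- the same order
# of magnitude as estimates that run as high as half a million.
deaths_per_year = deaths_per_day * 365
print(deaths_per_year)
```

The daily jumbo-jet framing and the annual half-million estimate are consistent in scale, which is what makes the analogy land.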
Why is healthcare so different from aviation? First, there is little publicly available data and no sort of standardized review process when errors occur. Whatever self-examination takes place is sealed and not readily available for public scrutiny. There is an attitude among some that doctors are infallible saviours, creating a reluctance to admit error. Insurance costs, litigation and protecting reputations reduce the desire for a public accounting. In short, healthcare is everything that aviation is not.
Finance straddles the two approaches. There obviously is a great deal of data, but it isn’t the most open of systems. Securities and Exchange Commission rules mandate disclosures by mutual funds, but require much less from hedge funds, venture capital, private equity, brokers and registered investment advisers.
When Bear Stearns collapsed in March 2008, it wasn’t merely a harbinger of the coming financial crisis; it was a reminder that no company is immune from existential failure. Public companies are reluctant to document and openly assess their failures, and often strongly resist attempts to do so. Perhaps they are not the ideal model to look to when thinking about failure.
Silicon Valley, technology and the venture-capital business model do a better job. Entrepreneurs and venture funders alike wear their failures like a badge of honour. Many venture capitalists even post their biggest misses on their websites. They recognize their model is to make a lot of losing bets in pursuit of finding the next big winner. Equity investors don’t have quite the same model, but they would benefit from a similar approach to recognizing their own limitations.
The stigma that surrounds failure needs to go. The surest way to avoid future failure is to embrace and learn from past failures. Bloomberg View
Barry Ritholtz is a Bloomberg View columnist.
Comments are welcome at firstname.lastname@example.org