
What Accidents Rooted in History Can Teach B-to-B

February 18, 2016 | By Kerry Cunningham

  • The power of data and analytics is bringing more and more information to our fingertips in b-to-b every day
  • But that information will do us no good if we are not prepared to comprehend it
  • Beyond technical skills, we must foster cultures in which the facts are the facts, even when they portray unpleasant truths

Most of us who were alive at the time remember the day in January 1986 when news came that the space shuttle Challenger had exploded just after liftoff, killing all seven members of its crew, including a schoolteacher from my small state, New Hampshire. The news was shocking, both for the tragic loss of life and for the failure it seemed to indicate in what is perhaps the government's most prestigious organization, NASA.

In the months that followed, the Rogers Commission, and its most notable member, physicist and Nobel Prize winner Richard Feynman, investigated the cause of the accident. There were, as scientists like to say, both proximate, or immediate, causes and distal, or more distantly related, causes for the tragedy.

The immediate cause, as Professor Feynman famously revealed in an experiment conducted during a live Rogers Commission hearing, was the failure of an O-ring to expand as expected. In essence, an O-ring is a rubber gasket that sits in the seam between two segments of the shuttle's solid rocket boosters. By design, the O-ring was supposed to expand as the shuttle took off, filling the space between segments of the booster and preventing hot gas from leaking between them. In the event, the O-ring did not expand; gas leaked, caught fire and triggered the catastrophic explosion.

Why did the O-ring fail? This leads us toward the more distant causes. Very simply, the rubber of the O-ring was too cold at the time of the flight to expand properly. The morning of the flight was a particularly chilly one for Central Florida, with temperatures at freezing, and the cold O-ring simply failed to expand.

But knowing that raises further questions about the root causes. Didn't NASA know the rubber wouldn't work properly at that temperature? And didn't NASA, furthermore, know the temperature on the morning of the flight? Wasn't there someone whose job it was to know both and to abort the mission, or delay it until the temperature moved into a safer range?

The answer to all of these questions is, of course, yes. As Feynman concluded in an appendix to the official Rogers Commission report, managers at NASA were at fault in two ways. First, they failed to understand relatively simple statistical principles, leading them to dramatically underestimate the likelihood of failure. NASA management estimated the likelihood of failure to be 1 in 100,000 when it was really closer to 1 in 100. Second, Feynman described a culture within NASA managerial ranks of dismissing or reinterpreting the advice of qualified engineers in favor of more politically palatable ideas. When engineers told management that there were real risks, managers ignored, squelched and reinterpreted the facts in order to avoid delays and bad publicity. Feynman argued in his memoir What Do You Care What Other People Think? that this culture pervaded NASA and made tragedies such as the Challenger explosion more likely. Hence, it was an accident rooted in NASA's history.
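The gap between those two estimates is easy to underestimate until you compound it over many flights. As a minimal sketch, assuming (purely for illustration, not as a claim from the report) that flights are independent and taking a hypothetical 100-flight program:

```python
# Illustrative sketch: how the two per-flight failure estimates quoted
# above diverge over a multi-flight program, assuming independent flights.

def prob_at_least_one_failure(p_per_flight: float, n_flights: int) -> float:
    """P(at least one failure in n independent flights)."""
    return 1 - (1 - p_per_flight) ** n_flights

n = 100  # hypothetical program length, chosen for illustration

management = prob_at_least_one_failure(1 / 100_000, n)
engineering = prob_at_least_one_failure(1 / 100, n)

print(f"At management's 1-in-100,000 estimate: {management:.4f}")  # ~0.0010
print(f"At the engineers' 1-in-100 estimate:   {engineering:.4f}")  # ~0.6340
```

At the management figure, a century of flights is almost certainly safe; at the engineers' figure, at least one catastrophic failure becomes more likely than not.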

There are parallels in b-to-b, where the power of data and analytics is bringing more and more information to our fingertips every day. But that information will do us no good if we are not prepared to comprehend it, and more particularly unless we are prepared to accept the facts, even when they are not favorable, and to act on them even when doing so is not convenient.

As advanced analytics and its practitioners become more common in b-to-b, the skills and knowledge of these analysts increasingly exceed those of their b-to-b colleagues. Most of us lack the technical skills required to generate the information that these specialists routinely produce. However, if we are to avoid egregious errors, we must do two things. First, we must understand how to properly interpret the information the analysts provide. We don't need to know how to conduct statistical analyses, but we must understand what the analyses are telling us. Second, we must foster cultures in which the facts are the facts, even when they portray unpleasant truths.

Join me on March 1, 2016, for a webinar about analytics proficiency in b-to-b. Click here to register today.

Kerry Cunningham

Kerry Cunningham is a Senior Research Director of Demand Creation Strategies at SiriusDecisions. Kerry has more than 20 years of experience in b-to-b demand creation and management, spanning a broad array of industries and markets. Follow Kerry on Twitter @KerrySirius.
