ASK OCE — August 17, 2006 — Vol. 1, Issue 12

The thin line between mission success and failure holds important lessons for NASA, according to Georgetown University Professor Cathy Tinsley.

Tinsley and her colleague Robin Dillon-Merrill have been studying perceptions of so-called “near-misses” — events that have positive outcomes but could have resulted in failure if not for luck — and they have found that where highly complex systems are concerned, a glass half-full is not the same as a glass half-empty.

“A near-miss can be evidence of two things. It can be evidence of a system’s resiliency, or it can be evidence of that system’s vulnerability,” Tinsley said.

So when a near-miss occurs, how do people react?

“What we find is that overwhelmingly people…really discount the luck factor,” Tinsley said. “We tend to attribute successes to our own skill rather than good luck.”

The human tendency to downplay the role of luck is nothing new. Social scientists have understood the self-serving attribution bias — ascribing successes to skill and failures to bad luck — since the 1960s. Tinsley and Dillon-Merrill have taken this a step further and found evidence of a near-miss bias: people who experience a near-miss actually become more comfortable with risky decision-making and engage in progressively riskier behavior.

This increased tolerance for risk, which is often referred to as the “normalization of deviance,” was the basis for Tinsley and Dillon-Merrill’s interest in this research.

“As we were discussing the CAIB (Columbia Accident Investigation Board report), Robin and I realized that near-misses might be the mechanism by which the normalization of deviance occurs,” Tinsley said. “When I read the CAIB, what struck me the most was they had two main kinds of factors. One set of factors was all the technical things that went wrong. And then the other half was all the cultural and political stuff that went wrong. What I found missing was the cognitive explanation for what was going on. We were missing the intra-individual explanation, if you will.”

Tinsley believes that improved understanding of the near-miss bias will play an important part in overcoming its dangers. “If we can raise people’s awareness of how the biases are affecting people’s decision-making, then I think it’s possible to really improve decision-making at a very low cost.”

Tinsley and Dillon-Merrill began their near-miss research in 2004 with a seed grant from the Center for Program/Project Management Research (CPMR), a partnership between NASA’s Academy of Program/Project & Engineering Leadership and the Universities Space Research Association (USRA). In July 2006, they received a second CPMR grant to conduct another round of research. The National Science Foundation is also supporting their work.

Tinsley and Dillon-Merrill would be glad to discuss their findings further with interested NASA managers. Contact Professor Cathy Tinsley.

Read Tinsley and Dillon-Merrill’s CPMR presentation.

In This Issue

Message from the Chief Engineer

Leadership Corner: Why Should Anyone Be Led by You?

This Week in NASA History: Voyagers 1 and 2 Embark on Planetary Grand Tour

The SEED Program: Systems Engineer Development at GSFC

Understanding Near-Misses at NASA

Manned or Unmanned?: That Was the Question for STS-1 Project Managers

India/U.S. Collaboration: Unmanned Lunar Exploration Mission

Archimedes Archive: The Electrical Power System of the Panama Canal
