In 2006, astronaut Michael Mullane courageously landed on a strange planet called Comedy Central, in the late-night time zone known as The Daily Show with Jon Stewart.
Stewart interviewed Mullane, drawing from him the fact that a skyward traveler must climb 50 miles above Earth to be called an astronaut, and that no matter how many years of training one has in simulators (Mullane had 12), nothing can prepare anyone for rocketing through Earth’s atmosphere.
“It really rattles you good,” Mullane told Stewart. “Obviously, the fear factor is absent in the simulators. You don’t worry about a simulator blowing up… On a shuttle, you have that in the back of your mind… As scared as I was, I knew it was not something I could turn away from.”
“Once you do it,” asked Stewart, “does it sate that urge? Are you now a neutered cat? Now that you’ve done it, can you just sit on the couch and just purr?”
Mullane conveyed that after each of his three trips on space shuttles, his urge for riding rockets was never sated.
Since this interview took place three years after the Columbia tragedy, Stewart and Mullane somberly spoke of falling foam. “On my first launch,” Mullane shared, “the pilot and commander were talking about seeing this foam come off the gas tank and striking the windows… in the cockpit, we all just did not think about it… something that light could not possibly damage the vehicle, so we kinda dismissed it… on Columbia, it hit in the right place and caused a catastrophe.”
Professors, practitioners, and many others have tried to map where information becomes knowledge. There must be a line somewhere. At NASA, we have a chief information officer and a chief knowledge officer. Although the titles seem synonymous to anyone scanning a thesaurus, the job descriptions are drastically different. A quick and dirty way to differentiate between information and knowledge is to consider what critical knowledge must be. Critical knowledge keeps people out of what a hospital calls critical condition, or worse. Knowledge becomes critical when ignoring information, or making the undesired commonplace, could lead to loss of property or life.
This month, in a New York Times Retro Report on the Challenger tragedy, Diane Vaughan, author of The Challenger Launch Decision, explained the inertia of the normalization of deviance, that is, “the falling back on routine under uncertain circumstances.” Another way to understand this organizational phenomenon is that the normalization of deviance occurs when rules of what has been found appropriate or safe give way to ones that are less appropriate and less safe.
June is National Safety Month, and the Office of the Chief Knowledge Officer (OCKO) has begun a Critical Knowledge Audit of its Knowledge Map. We cannot be too diligent in discovering critical knowledge at NASA, and the OCKO will continue to incorporate new repositories of critical knowledge and recruit new members to its Knowledge Network throughout the audit process.
Less than a decade ago, Jon Stewart thanked Michael Mullane for his service with the Shuttle program. “I’m happy that there are people out there brave enough to do it,” said Stewart before a comedic pause, “because I am not one of them.” Although it is impossible to guarantee a program free of fear or the possibility of tragedy, we as an organization can work stalwartly toward a culture of open communication, safety assurance, and knowledge sharing.