October 29, 2010 Vol. 3, Issue 10
A twelve-year-old accident serves as a constant reminder that “there be dragons” in NASA projects.
In 1998, a commercial jet approached the research runway at the Wallops Flight Facility to perform an engine water ingestion test. This test was supposed to be routine—just like the many that had come before it. All jet-powered aircraft designs flown in the United States are required to pass it. However, this particular test, the eleventh run in a planned series of twenty, did not end like its predecessors.
The plane approached the flat runway, where a pool of water had been strategically placed for it to land in. Manned high-speed cameras surrounded the area to capture imagery for later analysis. As the plane touched down, a crosswind caused it to swerve and flip over, just missing a cameraman. The aircraft burst into flames, destroying a nearby support vehicle. Miraculously, no one was hurt.
Later review showed that the test that day was not, in fact, business as usual. The operations team had made a series of small changes to the planned procedures. The pool’s position on the runway was moved several times. The cameramen were repositioned for a better shot. No one openly questioned these seemingly harmless changes to what was perceived as a routine operation.
“To this day, [that incident] marks my standard of worry,” said Jay Pittman, Chief of the Range and Mission Management Office at Wallops. For nearly a decade, he has been responsible for granting flight permission at Wallops. Worrying about risk is his job, and he takes great care to remain cognizant of it.
“There comes a comfort level with things that you’ve done before, and that can be a dangerous thing,” said Pittman, who was not part of the team involved in the incident that day. “I don’t believe that there was a specific instance of intentional negligence on the part of the team that oversaw what ended up being a disastrous event, but there was a slow and silent accumulation of a number of things.” What seemed like very small additional requirements and unreviewed changes added up to a dramatic change that brought new risks, explained Pittman.
As a leader, Pittman wanted to be able to convey to his teams the seriousness and helplessness that emerge when conducting risky missions, even the ones that seem routine. To him, risk looks like a dragon. “The dragon for me is this notion of quiet risk that accumulates into a critical mass and then explodes in your face.”
This metaphor of a dragon comes from The Hobbit. Pittman recalled the fear of the residents who live below the Lonely Mountain, home of the dragon Smaug. Living in such a place, argues Pittman, how can you not factor in the risk a dragon poses to your daily life?
For Pittman, the anatomy of the dragon includes a number of elements. Number one, he said, is complexity. “Don’t tell me that you’ve done [something] before. Everything we do has incredible complexity, and it’s ludicrous for us to say that it’s not.”
Schedule and cost pressure are also omnipresent. Congress, NASA leadership, the mission directorates, and the public all want to see a final product, a mission. The pressure to make everyone happy is immense.
There is also the feeling of being 100 percent a part of a team, which is good, said Pittman, but it can have a downside. “That means there’s pressure not to be the stick in the mud,” he said. “You don’t want to be the person who says, ‘I’m not really comfortable. I’m not sure this will work. I’m not really sure this is the same as last time.’”
“Nothing is the same as last time because today is a different day,” said Pittman. He looks for the uniqueness in each of his projects, particularly the ones that seem routine. It’s too easy to be lulled by paperwork and checklists. Pittman makes sure to invite people who have never seen the project to every mission review panel. “It’s the fresh eyes that keep us from doing truly stupid things that you could just drift into little by little.”
He also emphasizes learning lessons rather than merely listing them. He thinks of lessons learned as actionable tasks: until a lesson is acted on, it stands as a lien against future projects. “If we haven’t turned it into something real, then that lesson learned from some mission long ago is a lien against future missions.” Working off each lien generates what he calls “reasoned assessments” of why it’s OK to keep going in spite of it. These assessments keep the team ready when challenges arise, he explained. “It’s that reasoned assessment that goes missing when we become comfortable.”
Pittman offered his final thoughts on risk.
“Sometimes the leadership, managers like me, are too far removed from what is really going on. Sometimes everybody knows the real story except for the leader. It’s the job of a leader to find a way to make public what ‘everybody knows.’” He offered a few examples of those types of things:
Everybody knows…
- What almost hurt someone last time.
- Who doesn’t get along and how that affects communication.
- How stuff really happens and what rules to follow.
- What really went wrong.
- What almost went wrong.
- Lessons learned equals lessons listed.
- Places that don’t get seen during audits.
- The checklist doesn’t matter; the checkers do.
- Organizations don’t fix problems; people do.
- Which managers you can go to…and which ones you can’t.
Despite the risks that come with NASA missions, the NASA workforce certainly has something to be proud of, added Pittman. “We do things that normal people would never think of doing,” he said. Things like putting a satellite in space, going to the moon, going to Mars, measuring the temperature of the universe, or quantifying the energy of a raindrop falling in the ocean.
At the end of the day, however, NASA teams are made up of people. “Sometimes people don’t do what you expect,” said Pittman. “We’re capable of leaps of creativity and insight that nothing else can do, but sometimes you have a bad day…The fact that we are human means that we have strengths and weaknesses. It’s our job as responsible leaders to maximize the strengths of our people and our teams and to enable them to see clearly the risks involved in our missions in spite of the fact that we are human.”