
Tap into the experiences of NASA’s technical workforce as they develop missions to explore distant worlds—from the Moon to Mars, from Titan to Psyche. Learn how they advance technology to make aviation on Earth faster, quieter and more fuel efficient. Each episode celebrates program and project managers, engineers, scientists and thought leaders working on multiple fronts to advance aeronautics and space exploration in a bold new era of discovery. New episodes are released biweekly on Wednesdays.

NASA Goddard Space Flight Center Chief Knowledge Officer Ed Rogers discusses the complex human elements involved in disasters such as the Space Shuttle Columbia accident.

Rogers is a keynote speaker at Space Shuttle Columbia National Tour stops at NASA centers and provides insight from Columbia lessons learned that could help prevent a similar tragedy in the future. In this podcast episode, Rogers offers guidance for reflecting closely on the Columbia accident as NASA moves forward with Moon and Mars missions.

In this episode of Small Steps, Giant Leaps, you’ll learn about:

  • The importance of asking hard questions to reduce risk
  • Vital components of good decision-making processes
  • How accountability works in parallel with inclusiveness


Related Resources

National Tour Emphasizes Columbia Lessons Learned

Episode 7: Space Shuttle Columbia National Tour (Small Steps, Giant Leaps Podcast)

Columbia ‘Relaunch’ Aims to Inspire, Educate

Remembering Columbia

Technical Authority

Goddard OCKO – Pause and Learn

APPEL Case Study: Columbia’s Last Mission

APPEL Courses: Lessons Learned for Mission Success

Columbia Accident Investigation Board

NASA Engineering and Safety Center



Ed Rogers
Credit: NASA

Ed Rogers is the Chief Knowledge Officer (CKO) at NASA’s Goddard Space Flight Center. Rogers joined NASA in May 2003 as the center’s Chief Knowledge Architect and became CKO in 2006. He has built a set of knowledge management practices that strategically support Goddard’s overall mission of designing, developing and flying space missions and continuously improving space communications. He introduced a Pause and Learn (PaL) process, which allows project teams to reflect on their experience and improve organizational learning, and developed an internal case study methodology that’s applied to create Goddard-specific case studies for use in training sessions and workshops. Rogers, a frequent guest lecturer in MBA classes and at federal agencies in the Washington, D.C. area, has served since 2009 as a visiting faculty member at the Indian School of Business in Hyderabad, India, where he teaches a popular course on managing complexity. Rogers received a bachelor’s in agronomy from Ohio State University, a master’s in international business from the University of South Carolina and a doctorate from Cornell’s School of Industrial and Labor Relations.


Ed Rogers: If we want to learn from something like the Columbia accident and not let our organization get locked up and become too risk-averse, that means we need to get really close to it and really understand exactly what happened and why.

Hindsight is anything but 20/20. Hindsight is only 20/20 in what actually happened. Yes, we know it broke, we know it didn’t work, we can see that now. But how it actually got there, our hindsight is poor. It’s much more like 20/100.

Somebody, which would be managers—senior managers—needs to be in charge of watching the decision-making process itself.

Deana Nunley (Host): You’re listening to Small Steps, Giant Leaps – a NASA APPEL Knowledge Services podcast featuring interviews and stories, tapping into project experiences in order to unravel lessons learned, identify best practices and discover novel ideas.

I’m Deana Nunley.

The Space Shuttle Columbia National Tour is designed to inspire and educate as lessons learned from the past are shared with NASA’s workforce. The tour includes an exhibit of Columbia artifacts and training from APPEL Knowledge Services, and will make stops at all 10 NASA centers across the U.S.

One of the keynote speakers for the traveling program is Goddard Space Flight Center Chief Knowledge Officer Ed Rogers, and he’s our guest today.

Ed, thank you for joining us on the podcast.

Rogers: You’re most welcome, Deana. It’s good to be here.

Host: It’s been almost 17 years since the Columbia accident. How would you describe the engagement and interest level of the NASA workforce when you share lessons learned from Columbia?

Rogers: It’s very interesting. There’s obviously a group of people who were here and can remember where they were sitting and what they were doing when they first saw, you know, the Columbia accident unfolding on the news. And there’s another group of people who were probably in kindergarten or elementary school and have grown up and heard about it but don’t really know about it. I find it very interesting—the responses. The people who were here at the time are just as interested and anxious to know really what happened. And, “Why didn’t we do something?” is probably the most common question I hear. Why didn’t we do something?

The younger people who don’t remember witnessing it, you know, the younger generation, they’re equally as interested and they’re equally as perplexed, but they don’t have the background of NASA to know that NASA doesn’t make mistakes and stuff. They’ve kind of grown up with the idea that NASA does make mistakes, and so they find it a bit challenging to believe, you know, how could people have done that? Wouldn’t they have figured it out? And they’re very open. I find them very, very open to learning.

Actually, both groups are open to learning. The people who were here don’t feel like we really got to the bottom of it. And the people it’s new to just find it confusing, like it doesn’t make sense. Why wouldn’t NASA rescue the astronauts? We saw Apollo 13; they went and rescued them, or did what they needed to do. You know, why wouldn’t they rescue them?

So, the interest is very, very high, and I think it’s indicative of the need we have to go forward with trying to teach these lessons.

Host: And how do people actually learn so that NASA can learn lessons better?

Rogers: Well, it’s interesting. My mother was an elementary teacher. I grew up in schools all my life. My father was also a college professor, and so he talked a lot about people learning and people not learning, of course. And, fundamentally, my simplest explanation to learning is it comes down to reflection. We learn when we reflect on experience because not all experience teaches us something.

We may change our behavior because of an experience, but we may not have reflected on it very well and actually learned a useful lesson. You know, may not have learned in the right direction. For example, you have a bad experience with something, say you have a bad dog experience as a young child, so you never have anything to do with dogs the rest of your life. Yes, you learned something, but it wasn’t a very useful, effective lesson. It was just a blanket fear that you used to avoid any contact in the future so that would never happen again, but you didn’t actually learn much about dogs, if you understand.

And so the reflection part is really critical, and unfortunately, without reflection we also tend to blame and complain a lot more because reflecting helps us see the part we either played or could’ve played in determining an outcome, but without reflecting we tend to fill in the blank with simple answers—oh, it was just a political thing or it was just a funding thing or that person just wanted that thing to happen. And so, because we are willing to accept a simplistic answer without reflection, we kind of short-circuit the ability for us to actually learn something from it.

And so, this was an interesting angle that I took when I came here to Goddard right around the time of Columbia, actually 16 years ago, and realized that a lot of the activities that were being put forward as KM or knowledge management activities had much more to do with storing and collecting things than people actually learning something. So, we took a decidedly different approach that was entirely focused around increasing learning and the ability of people to reflect. So, we created activities such as pause-and-learn and workshops and forums and case studies that generate reflection and learning on the decisions and the actions and things we’ve taken in addition to, of course, the lessons learned that come directly from mishaps.

Host: How can NASA technical workers learn from the close calls and mishaps of others without making their own mistakes?

Rogers: Well, that’s a good question. It’s not clear that people—meaning human beings in general—are very good at learning from others’ mistakes without making them themselves. I mean, children teach us this all the time, right? They see somebody poke a stick and do something bad and they’re like, “Let me try that. Oh, look, it poked me too.” And then we say, “Oh, well why didn’t you learn from watching your brother?”

“Well, I did. I learned that I can poke the stick, and then I learned it hurts me.” They seem to learn it in sequence. One of the reasons is that we don’t understand what actually happened. With other people’s mistakes or mishaps that we see around us, whether they’re at NASA or outside, and there are plenty—chemical disasters, airline disasters, decision-making disasters, you know, that have been human errors or a combination of human and technical errors—we don’t actually understand how they got to the place they got to.

We see where they ended up. We see that’s a bad outcome and we try to find the simplest explanation that will tell us why or how they got there. They made a bad decision. The person was uninformed. They didn’t have knowledge at the time. And then we sort of move on. And so, it leaves us with the idea that if we have just the one thing that they didn’t have, then we wouldn’t do what they did and we wouldn’t end up where they ended up.

And that’s almost universally false. It’s much more often the case that we would end up in almost the exact same place that other people ended up, because the decision process to get to these—I’m talking about major disasters in which many people and complex human elements were involved—many people and many steps were involved in getting there, and they’re much more complex than a simple mistake. I mean, there are simple mistakes that get made, but more often there are much more complex things.

And so, if we don’t understand how the organization got there, we’re actually pretty likely to repeat it. This is sort of the essence of Diane Vaughan’s comments about Columbia. And she had written the book on the Challenger launch decision, and when asked about Columbia, she said, “Well, it looks like there’s some similarities here.” What she’s referring to is the organizational processes and structures that allowed this kind of decision to happen, not that the accident was at all similar to Challenger other than it was also a space shuttle.

Clearly it was a different kind of accident technically. And NASA sort of faulted her, or there were some criticisms saying, “Well, she didn’t understand the differences between Challenger and Columbia.” Actually, she understood the similarities. And so, if we don’t look at how things actually happened, then we’re not going to learn the lessons we really need to learn. And part of that comes from a humility of asking, “Well, what could’ve gone wrong?” or “What did we need to understand about how we actually did something, even if it went right?” That would be in the category of learning about luck as well as close calls.

Host: So, as an agency and as individuals within NASA’s technical workforce, how do we learn from mistakes, disasters and accidents without becoming overly risk-averse?

Rogers: Well, that is a very common response. As in the example I gave of the child who is bitten by a dog at an early age and for the rest of their life never has anything to do with dogs, that would be a typical response when they don’t understand dogs or why that particular dog had a bad interaction with them. And so, they generalize to all dogs and behave similarly. That’s the same phenomenon that happens when an organization or a group of people have a bad experience in some area. They don’t really understand all the nuances of it, so they just take a giant step back and say, “We’ll never do that type of thing again and we’ll never do it like that and we’ll stay far away and we’ll take extra precautions.”

And you see that kind of thing, which is kind of described as what you said, being overly risk-averse. The only solution really to that that I know of is to get close to it. So, if we want to learn from something like the Columbia accident and not let our organization get locked up and become too risk-averse, that means we need to get really close to it and really understand exactly what happened and why, including the role we played—we the human beings—played in that so that we can be confident that we’ve learned those lessons and are applying them as we go forward.

If we don’t have that confidence, then we will be tempted to step back and step further back because we don’t really understand what happened. And so, we’ll step back as far as we can and keep as much space between us and that bad thing as possible and that’s what overly risk-averse really looks like. It’s unnecessary space because we don’t understand what actually happened and are losing our confidence that we know certain things and know where the things are that we don’t know. And that takes retrospection, that takes a willingness to dig in, that takes some humility, because yes, we did play a role.

Host: Ed, you’ve spent a lot of time deconstructing the decision-making process that factored heavily into the Columbia tragedy. What are some of the key takeaways?

Rogers: Well, first of all, in a decision analysis of something like Columbia, I don’t find any particular individuals who were somehow wrong or somehow in and of themselves responsible for this bad decision. I would agree with Diane Vaughan’s description that the organization creates the vulnerability to these kinds of decisions being made, and if the organization doesn’t detect that vulnerability, then sooner or later it will catch up with them in a bad decision such as Challenger or Columbia.

So for example, one of the things that becomes apparent when you look at the Columbia decision process is what happened during the time post-launch, when they had information saying something hit the Orbiter and they had the opportunity to decide whether they should get better imagery or determine whether the shuttle was damaged. And then, of course, once they made that determination, they could’ve moved into “what if” scenarios about what they might or could’ve done about it, including a rescue or some other attempt to remedy the situation. But what’s important to see is that there was a sequence of events that influenced the Mission Management Team in their decision: who got to them first, whose opinion they heard first, which question they heard second, when they were questioned later about issues with imagery. That sequence of events influenced the decision they actually made.

Now the problem is no one was in charge of that sequence. And no one was paying attention to that sequence to say, “Maybe we should be careful and not let the sequence of events overly influence or dictate the type of decision we make because that’s not a good decision-making process. We should make the decision based on the evidence and the verification of it and the questions and the uncertainty and engineering input that we have, not based on a sequence of who came first and who said what first and how it was categorized.” So that’s one observation, and we can see that when we look through the process.

That’s not an unknown phenomenon. That’s known in decision-making. But the point is somebody, which would be managers—senior managers—needs to be in charge of watching the decision-making process itself. The technical experts are looking and giving opinions about the technical content. Managers are weighing that content in their decision and weighing it in the risk. But they also need to look at, how is the decision actually being made? How are we allowing this to go through the organization? Is someone following it, or is it just finding its own way, which means there’s an element of randomness or luck as to whether the right decision will end up being made?

So that’s one aspect of it. The other aspect is that we often use data in somewhat misleading ways. So for example, people say, “We only make decisions based on data, and we need, you know, opinions backed with data.” But the problem is that as human beings we’re very willing to accept our own opinion, the one that we held and the one that we actually favor, without much data. But someone who brings an alternate opinion or an alternate view, we require them to have a much higher level of data than we require of ourselves.

Now this is not to fault anybody. That’s human nature, a human tendency, and it’s just a personal bias. That’s fine. But not being aware of that, not understanding that, allows us to fall into a trap of believing things that we don’t really have much data for. And we fell into this trap with Columbia, when it was widely believed and reiterated and used as supporting evidence—the belief that foam could not hurt the reinforced carbon-carbon panels, could not damage them. Even though it was a much larger strike than they had seen before and it wasn’t clear where it had hit, there was a belief that foam in and of itself, because of density and velocity and just general physics principles, couldn’t hurt RCC panels.

However, to say that it could have damaged RCC panels, you would’ve been required to bring data, and that’s where they got caught, in the example I just explained. So, not being aware of that tendency created the vulnerability for a poor decision to be made, and those are the kinds of things that we need to get inside and learn from and be able to watch for in our decision processes, because they’re very subtle. They don’t seem like bad things to do. They didn’t seem like wrong things to do to the Mission Management Team or the people working Columbia.

They seemed like good things to do, they seemed like logical things to do, but if someone isn’t looking at the process of the decision itself, separate from the content, then it’s easy to make these small missteps that add up to a poor decision.

Host: As we continue looking at the decision-making process, let’s look at it from a different angle, and that is that sometimes we have a tendency to focus on the ultimate outcome. How can individuals and teams evaluate a decision-making process without focusing too much on the outcome?

Rogers: That’s very difficult, actually. So, the first thing I would say is that most people consider hindsight to be 20/20 vision. It’s kind of a saying, you know, a pithy saying: “Oh, it’s 20/20 hindsight.” Hindsight is anything but 20/20. Hindsight is only 20/20 in what actually happened. Yes, we know it broke, we know it didn’t work, we can see that now. But how it actually got there, our hindsight is poor. It’s much more like 20/100.

We’re very poor at seeing how we got to the outcome that we now see so clearly. And so, the only real way that I know to evaluate decisions while avoiding the outcome bias, as you mentioned, is paying attention to the decision-making process during the process. Now that doesn’t mean we have to write down every thought and every word that comes to our mind along the way, but key things: key assumptions certainly, key decision points we’ve come to, why we’re discarding certain ideas and going with other options, what our thinking and rationale is.

We do this in some way with engineering decisions and design-decision trades and whatnot. But in decision-making in terms of management, we often just have a meeting, agree, and go home, and the next day we’re not really sure how or why that decision was made, but it seems good. Then later, when something comes up, we’re unable to go back and look at how we actually made the decision and parse apart where we either went wrong or what part was right. So, we either throw the whole thing out and call it 20/20 hindsight. And, back to the comments at the beginning: because we haven’t really understood what happened, our learning is really, really low, and so we aren’t going to actually learn much from that decision process.

We instituted, as I mentioned, a process called pause-and-learn, which encourages projects to stop frequently and pause and discuss what happened, what we have done and why we did it, long before you reach launch fever and the post-launch discussions, when everything takes a different color now that we see whether the mission has worked or not.

I’ll give you one other example in this category of seeing things. Perceptions that people hold drive a lot of decision-making. One thing we can do that helps minimize this outcome bias that we have is paying attention to the perceptions that we’re allowing to influence our decision.

So, for example, a project has an issue going that may require some more money or more time or may cause a slip to the scheduled launch date. There are perceptions about how that will be viewed by people in higher up positions—management, the center, Headquarters. Those perceptions—true or not true, distorted or not distorted, and they’re rarely 100 percent true—those perceptions heavily influence the decision for how that report will be made, and therefore, it enters into the whole discussion of how the final decision will be made of what is actually done and the risk that’s now being taken.

So, understanding: look, are we being overly sensitive to what Headquarters thinks about this? If this is an issue and it’s a safety issue, a critical issue, a mission assurance issue, or an unknown, where we’re not even sure what it is and it requires more study, then we’re not going to be influenced by the perception that it may be viewed badly and change our opinion of it. Being aware of that can help us evaluate how we’re making the decision right now and be fully onboard with it. We’ve made this decision based on the evidence the engineer gave us today, this is what we see, and we’re going to take that forward and share it.

Later we see what happened, we see how it works out. We can go back and look at our decision and say, “Well, what did we miss? What did we overlook?” And we can have a learning approach to looking at the decision rather than a yes/no, it was a good or a bad decision, which is often all we’re ending up with.

And as we know, psychologists are quick to tell us most people spend more time agonizing over a decision after they’ve made it than before they made it, trying to justify what happened and why it was a good decision. We should spend just a little bit more time during the decision process. It’s kind of pre-reflecting, in a sense: reflecting on what we’re thinking right now, making a few notes, putting a few things together which we’ll go back and look at later when we see what happened.

Host: And perhaps along the same vein, what are your thoughts on building accountability into the process?

Rogers: Accountability is one of these overused words in my opinion. It’s a good thing. There’s nothing wrong with accountability, but it’s often almost completely misused in the sense that first we’ll make a decision, and then I’ll hold people accountable for it. That’s not really how accountability works. And the best explanation for this is in a book called The Five Dysfunctions of a Team by Patrick Lencioni. It’s a great little read. It’ll make this subject very clear.

Accountability really works alongside of, or in parallel with, inclusiveness. And the problem is that with especially smart people, professional people, people hired by NASA with advanced degrees, who are educated and well-versed in their field and bring knowledge to the table and to the discussion, it’s difficult to hold them accountable, or they refuse to be held accountable, for something they don’t feel they were part of deciding. And sometimes this is pointed out as a problem: you have to get everybody onboard and you have to include everybody; NASA’s so complicated.

That’s why we hired smart people. We hired really smart people who would think, who would think of all the contingencies and the possibilities and help us think through the scenarios. That’s how we get really good solutions. And so, we forget sometimes that there’s a cost to that. If you’re going to hire really smart people and have them think, they’re going to think different things than you might sometimes, which means you’re going to have to spend a little more time making sure we’re all onboard and included in the process of how the decision is being made. And then you can hold people accountable, because it was our decision, or it was a clearly made decision, an informed decision, an open decision. Anything less, and it’s almost impossible to hold people accountable for other people’s decisions, especially if they think it was made poorly, made in haste, made for illegitimate reasons, you know, just to save a buck or to save face or to look good or not to put a bad picture forward or to save a launch date that we know is going to get rescheduled anyway.

And when people are asked to be held accountable for things like that, they will not. They will rebel. They will not behave in the way that we think they should behave to make things nice for us. And so, then it’s blamed on, “Well, people weren’t held accountable.” I think the problem is more fundamental than that. The decision was poorly made, the management or the leadership was poorly implemented, and that is why people could not be held accountable. It’s not that we failed to put a memo out saying you’re now accountable for this.

That’s what I mean by overused. It’s used in a simplistic way as a hammer to sort of pound on people. Accountability is the result of a good decision-making process. People are willing to sign up. And actually, you don’t even need them to sign up. They’re already signed up to a decision they’ve been part of.

Host: The importance of communication is usually a central topic in discussions about lessons learned from mistakes and mishaps. What could you share with us about communication that may sometimes get overlooked in these discussions?

Rogers: Communication does always come up. It’s one of those things that appears often in the reports and the mishap investigations: there was a miscommunication, and they find evidence of it in some of the reporting or the charts that were presented, or someone thought this was what was being done or this was the requirement and then it was miscommunicated. I think the thing that’s overlooked is that communication risk exists in and around all our projects simply because there are human beings involved with each other, and we have errors in our communication all the time.

They don’t just show up in one-off events or at major mishaps. They’re like the water we live in or the air we breathe. And so being aware of that means saying, “Look, there’s a risk every day you come to work in your project that somebody has an issue that they may not feel comfortable raising.” It may be because of a perception that they’re not in authority enough to raise an issue with you. Maybe it’s their personality; they tend to be quiet and reserved and keep things to themselves. It may be because of previous experience; they brought an issue up with a former supervisor or manager who berated them or treated them badly, and so now, based on that experience, they’re never going to bring an issue up again. You don’t know any of those things. And you don’t know when they’re going to possibly manifest themselves or reveal themselves in your project, in your team, and so it’s the kind of thing that requires constant work.

I’ll give you an example. It’s only natural that people who are very mechanical and technically oriented tend to transfer that thinking to the way human systems work. So, for example, with human biases and human tendencies like we’ve been talking about, the idea that you can send someone to a training course where they learn about human biases and cognitive dysfunctions and all this kind of stuff, and that somehow that person will come back immune to any of those kinds of decision-making errors, is nonsense. This is not a vaccine program where we send people to training on human behavior and they come back vaccinated, now immune to any human decision-making errors.

You only become aware of what will inevitably happen to you as a human. You will be emotional. You will care about your own thing more than somebody else’s. Your personality will influence the way you interact with people in a meeting. Those aren’t bad in and of themselves, but they bring up communication risk, because they bring imperfections into the way humans communicate with us. But we can do some mitigations that can help, and there are lots of these. NASA does do some of these and works at this all the time: reiterating what people say, listening for feedback, independently verifying ideas that other people said, allowing for independent verification through different channels—you know, engineering, safety and project independently reporting—and having Technical Authority. We’ve done a lot of things that help, but those things only help if we do them. The fact that they’re written in a policy paper somewhere does absolutely no good unless people are actually doing them as practice. And they don’t have to do every single one religiously, but they should adopt the attitude that this is a risk: we’re going to have communication risk on our teams and in our groups, and if we don’t do anything, it will at some point create vulnerabilities that may cause something bad to happen or cause us to make a very poor decision.

Host: When you reexamine the findings from the Columbia accident and investigation, what else stands out about the human element?

Rogers: I think one of the things that stands out, and I think I said it earlier, is that making human decision-making errors doesn’t make you a bad person. It just makes you a person. Makes you a human. We all have the same susceptibilities and there is no such thing as perfect decision-making, just like there’s no such thing as perfect information.

So, I think one of the things that stands out in the human element is being aware that one of the greatest risks to project success is an assumption that I know more than I actually know. That’s not what some people would write down as a risk. They would say, “Well, we’re smart, we know the most about this subject.” But assuming you know a little bit more than you actually know creates a danger in the human element, because it creates a perception that other people will listen to you, that you will listen to you, and even more importantly, you may stop asking questions when asking questions is what actually needs to happen the most.

This is where our independent review panels come in, because they come in and ask questions. And the purpose of the panel isn’t to find tricky questions that the project doesn’t know the answer to just to trip them up. It’s to ask questions that they maybe should’ve been asking themselves. In some cases they are, and they just need to focus on it.

But asking hard questions: Really? How do you know this process is the best one? How did you decide to go this direction in the design versus that design? What’s your justification for doing this many tests, or for this life cycle process, or whatever? Having a third, independent party ask those questions can uncover some of those human tendencies that we all experience. And the longer we’re together, the longer we’re in a project, and the more tension and pressure we feel as we get closer to launch, the more vulnerable we actually get to them. Being aware of that, I think, leads us to where we are today with our upcoming NASA programs.

Host: And how can NASA learn most effectively from Columbia now as we embark on the next great challenge of missions to the Moon and Mars?

Rogers: Well, I think that’s a big question that NASA needs to look at, and I think the first answer is what we’ve already discussed in terms of reflecting. We need to reflect closely, and NASA’s been doing this with the Columbia tour, teaching lessons learned about Columbia through APPEL programs and around at different centers. They’ve made a big effort to do this, which is I think very good, because the closer we get to understanding what happened and realizing the role we humans played in it the more likely we are to learn those lessons, be aware of them, and then be ready to recognize when those symptoms may be appearing in our current processes.

Will undue pressure, will overconfidence, will bias in decision-making be at play as we go forward to try to go to the Moon and to Mars? Absolutely. The question isn’t whether those things will exist in our presence and in our meetings and in our teams. The question is whether we will detect the subtle things that tell us we may be moving in a similar direction or a similar decision-making process. The more aware we are of how things happened on Columbia—in the background, in the culture, in the structure, in the decision process itself—the more likely we are to recognize that we may be crossing a line or moving in a direction that’s not healthy, or that we need to just pause and think about what we’re doing, do some added reflection, or get an outside opinion—all the things that we know to do.

We just need the right triggers to say, “Now it’s time to stop and do that and make sure that we’re staying on course.” And I think we’ve got a lot of processes in place, with Technical Authority, review processes, and the independent verifications that we do, but we need to believe in them. We do those processes, as messy as they are and as complicated or expensive as they sometimes make our missions, because we know they lead to better results. So, I guess the biggest lesson to take from Columbia is not necessarily what we did wrong. It’s all the things that we do right that we know help us have success—making sure we hold on to them and hold those principles as tightly as we can.

Host: The Columbia Accident Investigation Board’s final report stated that “NASA’s current organization has not demonstrated the characteristics of a learning organization.” How has the agency changed since the CAIB report was released in August 2003?

Rogers: I believe it has changed. There are some things we’ve already mentioned: the strengthening of Technical Authority, and the NASA Engineering and Safety Center, which was set up and does a marvelous job of training and reapplying the agency’s knowledge to make sure we’re aware of best practices, and also of mediating tough decisions when people are at an impasse, bringing its expertise to bear. So, there have been a number of structural things like those that the agency has done to try to mitigate the risk—also the dissenting opinion process and other things.

But I think fundamentally there’s an openness to learning that’s recognized. Partly that’s a newer generation that has embraced openness to learning and figuring things out, but my observation comes with a little hesitation, because I don’t believe it’s evenly distributed. In other words, it’s not as though the whole agency and every person has moved the same degree toward an openness to learning. It’s in pockets, and it’s spreading, and I think it’s getting better and being recognized as important, but there certainly is variance in the amount of it. And I think the continuing challenge for the agency is to keep focusing on this—from leadership, and from those of us involved in these activities like knowledge management—to make sure we keep it in front of us, because this is not a job that is one and done.

We don’t just have a VITS conference on lessons from Columbia and then we’re done with it completely. We will constantly need to be looking at these things and keeping on our toes and carefully reflecting on the decisions that we do make going forward as we make them to see if we have incorporated the things that we know we should in the decision process itself. That being said, there’s also a little bit of skepticism in some quarters towards this learning, fuzzy-wuzzy stuff, you know, soft skills things.

You may think I’m questioning that, but I’m actually OK with it. I think it’s good to be skeptical of things until they are proven to be of value, because at the same time we don’t want to be adopting all kinds of management fads and tips and tricks—dress codes, or ways to write memos, or all sorts of almost superficial things that are touted as solutions to NASA’s problems when they really aren’t. They may be interesting, they may not be harmful, but you know what I’m saying? They’re over-touted consulting fads that people might latch onto and say, “This will fix everything you’re doing.”

I think we need to own our future. We need to own our past. And we need to own the lessons that we can learn from the past and apply them to the future. I think that’s our responsibility.

Host: Ed, thank you so much for sharing your insight and your expertise on Columbia, the lessons learned, on human behavior. We really do appreciate you joining us today for the podcast.

Rogers: You’re most welcome. It’s been a privilege to work with all the smart people at NASA for these years and it really is a marvelous place to work.

Host: Do you have any closing thoughts?

Rogers: I think there should be a lot of positive optimism for NASA’s future. They may do different things in 10, 20 years, but I think the possibilities are very high. I think the people are very bright. The willingness of people to learn is also extremely high. But we have a lot of work to do in this realm. These problems of learning from decisions will not solve themselves; they require us to keep working at them continuously.

Host: You can learn more about Columbia lessons learned and the Space Shuttle Columbia National Tour via links on our website. Ed is speaking at the Columbia national tour stop at NASA’s Marshall Space Flight Center November 4-8. Additional resources are available specifically for Marshall employees, and you can find links along with Ed’s bio and a show transcript on our website at APPEL.NASA.gov/podcast. That’s APP-EL-dot-NASA-dot-gov-slash-podcast.

If you have suggestions for interview topics, let us know on Twitter at NASA APPEL, and use the hashtag #SmallStepsGiantLeaps.

We invite you to take a moment and subscribe to the podcast and tell your friends and colleagues about it.

Thanks for listening.