
Tap into the experiences of NASA’s technical workforce as they develop missions to explore distant worlds—from the Moon to Mars, from Titan to Psyche. Learn how they advance technology to make aviation on Earth faster, quieter and more fuel efficient. Each episode celebrates program and project managers, engineers, scientists and thought leaders working on multiple fronts to advance aeronautics and space exploration in a bold new era of discovery. New episodes are released biweekly on Wednesdays.

NASA Armstrong Operations Engineering Branch Chief Kate McMurtry discusses the value of understanding the impact of human factors on mission performance.

Any time a mishap or close call occurs, there’s always a human component to consider. The NASA Office of Safety and Mission Assurance Human Factors Program promotes a better understanding of the impact of human factors in an effort to improve organizational performance and prevent recurrence of circumstances that contribute to mishaps. At NASA’s Armstrong Flight Research Center, training plays a key role in the emphasis on human factors.

In this episode of Small Steps, Giant Leaps, you’ll learn about:

  • Human factors that are most likely to lead to incidents
  • Benefits of increased attention to human factors in engineering
  • The human factors training initiative at NASA Armstrong

 

Related Resources

NASA Human Factors Program

NASA Armstrong Human Factors Program

Federal Aviation Administration: Avoid the Dirty Dozen

APPEL Courses:

Critical Thinking and Problem Solving (APPEL-CTPS)

Pay It Forward: Capturing, Sharing and Learning NASA Lessons (APPEL-PIF)

Lifecycle, Processes & Systems Engineering (APPEL-LPSE)

 

Kate McMurtry is the Operations Engineering Branch Chief at NASA’s Armstrong Flight Research Center. Selected as chief in 2014, McMurtry is responsible for planning, directing and coordinating the technical and administrative functions for the branch, which provides sound engineering to ensure airworthiness throughout planning, integration and flight of unique systems and flight vehicles. Prior to her current position, McMurtry served as the deputy branch chief for two years. She joined NASA in 2008 as an operations engineer supporting the agency’s F-18 and F-15 research aircraft. McMurtry started her career in 2004 as a U.S. Air Force officer working in developmental engineering at Edwards Air Force Base, California, for the Airborne Laser Program, and ended her military career as a captain. She has a bachelor’s in chemical and petroleum engineering from the University of Pittsburgh, and a Master of Business Administration and a Master of Arts in Management and Leadership from Webster University.


Transcript

Kate McMurtry: Experience or expertise doesn’t equal perfection. Mistakes will come. It’s your willingness to learn from those mistakes that decides if they will empower you or they will defeat you.

One way to recognize the value human factors plays as part of a risk-informed decision process is through the sharing of lessons learned, and reflecting on your own mistakes, and openly reviewing incidents that you’ve experienced as a team or as an organization.

The organization needs to make human factors part of everyday conversation.

Deana Nunley (Host): Welcome back to Small Steps, Giant Leaps, a NASA APPEL Knowledge Services podcast that taps into project experiences to share best practices, lessons learned and novel ideas.

I’m Deana Nunley.

The NASA Office of Safety and Mission Assurance Human Factors Program helps NASA’s technical workforce understand the impact of human factors – considering human factors preventatively for improved organizational performance as well as identifying human factors retrospectively in mishap investigations.

The Human Factors Program supports all NASA missions and strategic efforts, including centers and mission directorates. At Armstrong Flight Research Center, training has played a significant role in the emphasis on human factors.

Armstrong Operations Engineering Branch Chief Kate McMurtry joins us now to share insights from her experience. Kate, thank you for being our guest on the podcast.

McMurtry: Oh, thanks for having me.

Host: Human factors is a fairly broad term. What’s a good working definition in the context of NASA missions and our conversation today?

McMurtry: Well, at Armstrong, we execute the first A in NASA, which is Aeronautics. For us, that means our Flight Operations organization requires a significant amount of people and aircraft or system interactions in order to perform our mission, which is advancing technology and science through flight. So human factors takes into account this people-system interaction, to determine ways to optimize the wellbeing of both.

The system often doesn’t perform at its best when we aren’t performing at ours. So human factors can impede our performance, and they can look like stress, lack of resources, pressures, whether real or perceived, and distraction, to name a few. These, with eight others, make up what’s known as the dirty dozen and are considered common contributors to human error in aviation and maintenance.

Host: So, let’s talk about the human factors initiative at Armstrong. What prompted the increased attention on human factors?

McMurtry: Well, taking research and technology to new heights and uncovering new science means learning from failure is part of our iterative process towards success and discovery. And thriving in a learning-from-failure environment requires a lot of resilience to maintain focus through all of those iterations. That resilience looks like striving to understand what worked as well as what didn’t work, and using that knowledge to implement a smarter approach.

So, we aim to demonstrate this resilience throughout all aspects of our work, to include maintenance and our general flight operations. So, in the event of an operations- or maintenance-related incident, our Safety and Flight Operations organizations partner to evaluate the causes and the contributing factors, and assess how we can keep people and our systems better protected. So, with this effort, causal trends were recognized that mapped back to the dirty dozen that I talked about earlier.

So, when leadership asked the question — “How do we take the knowledge of these trends and implement a smarter approach?” — the human factors training program was initiated. And our goal is to build a habit of considering human factors in our daily decisions through education, open dialog, and practice.

Host: So, you actually have a formal human factors training program at Armstrong.

McMurtry: Yes. It was just initiated in the last year and a half or so.

Host: So, how did you get involved?

McMurtry: Well, this was a top-down initiative across our Safety and Flight Operations leadership, who encouraged first-level supervisors such as myself and several others to take the lead. I really feel like this was the right approach. First-level supervisors have the appropriate vantage point to understand the daily challenges of their unique organization, and they also have the established relationships with employees to empower open dialog.

So, first, our supervisors were trained in human factors. Then, collaboratively, we developed a training program to implement across maintenance and engineering organizations within our Flight Operations Directorate.

Host: What do you see as the biggest contributions or benefits of this emphasis on human factors?

McMurtry: I see several benefits to this emphasis on human factors, and one of the biggest has been open dialog. Our training program has two parts, involving teams of no more than about 20 people. Part one is academic instruction from an external subject matter expert, and part two, on the second day, is a tailored discussion led by first-level supervisors. It’s geared towards applying the human factors discipline to real Armstrong scenarios, uncovering the barriers through open dialog that can help us develop smarter approaches moving forward.

To foster open dialog, we incorporated the sharing of our own personal human factors lessons learned, which have helped each of us as supervisors find smarter approaches to how we perform our own work. We feel it’s extremely important to recognize and acknowledge that even though we’re leading this training as first-level supervisors, it applies to us, too.

It’s equally important to emphasize that being open to learning and applying those lessons goes a long way in building trust and getting back on track toward success following an error. So, this has encouraged open dialog. Instead of finger pointing, we see accountability. We see an openness to listen. We see people sharing and learning from each other. This is the kind of dialog that hits home for a lot of people and has really helped bring human factors to the forefront of our own minds.

Notably, we’ve also gained the opportunity to learn how our leadership decisions can unintentionally contribute to human factors tendencies within our own organizations. For example, we learned that mechanics considered the fall protection equipment cumbersome and hard to use, which tended to make them feel less in control of their tasks and, therefore, distracted and susceptible to mistakes. So, figuring out ways to upgrade to alternative equipment, in other words, giving them the right resources, was an opportunity for a smarter approach.

Host: That’s really interesting. What are some of the other takeaways from human factors training and experiences at Armstrong that might be helpful to engineers and PMs across NASA?

McMurtry: First is to recognize that team members whose role primarily supports the execution phase of a project are often dealing with both the good ideas and the shortcomings of decisions made earlier in the project life cycle. So, though an error may look like an individual’s mistake, that person or team may be trying to do their best under less than ideal circumstances. So early project life cycle decisions, whether it be poor requirements development, minimal funding for resources or lack of schedule buffer, et cetera, can play a significant role in creating human factors causes later on. It’s important to check in with your teams and end users and initiate dialog early and often, to help uncover these latent human factors risks.

A second takeaway has been to demonstrate ownership and trust by sharing your own human factors lessons learned, supporting others when they find the courage to share theirs, and emphasizing the lesson learned over the mistake where appropriate.

Host: You mentioned the courage to share. Has it been difficult to get people to have that courage, and to step forward and share some things that have gone wrong or some things that they’ve learned about themselves in terms of human factors?

McMurtry: I think sharing a mistake is always a vulnerable task for anybody. We realized this early on and decided that we would be the first ones to share our lessons learned story with the teams that we were bringing through this training program. It’s shown time after time — we’ve had a dozen or so of these training events — that once we open up to them and let them know that we sincerely value the storytelling and that we are not perfect ourselves, it opens the door for other people to put their own vulnerability on the table and share when they’ve learned from a mistake.

Once you get one or two people opening up, it really opens the floodgates of conversation. So as long as leadership takes the first step and sets the example, we’ve seen that it’s really easy for others to share.

Host: Where have you observed breakdowns or vulnerabilities that you might not have noticed as readily, if you weren’t aware of the critical role of human factors?

McMurtry: NASA is an organization composed of people who go above and beyond. They think creatively. They strive to achieve results. Although these are strong attributes of success, they also can become vulnerabilities. People come up with creative ways to get work done, in spite of human factors such as lack of resources or amidst perceived or legitimate schedule pressures, or even when they’re not feeling their best, because they really don’t want a mission to be delayed.

So sometimes this can-do attitude can crowd out room to momentarily pause and ask oneself, “What risks am I adding to myself or to others or to the mission by, for example, not using the right tool for the job, or not staffing the team to have a trained backup person, or scheduling a critical task without schedule buffer?”

Host: Once an individual gets a clearer or deeper understanding of human factors, what advice would you offer for then translating that knowledge into good habits and best practices?

McMurtry: Adopting a new habit or best practice requires recognizing its value and practicing that new mindset or behavior. So, one way to recognize the value human factors plays as part of a risk-informed decision process is through the sharing of those lessons learned, and reflecting on your own mistakes, and openly reviewing incidents that you’ve experienced as a team or as an organization. It’s easier to resonate with the value of changing thoughts and behaviors when the risks of not doing so hit really close to home.

As for practice, we’ve emphasized the power of the pause in our training by instituting a mental checklist using the acronym PACE: P for pause, A for assess, C for communicate, and E for execute. We encourage employees to pick something in their daily routine, whether at work or at home, and practice PACE before they initiate it. This, we hope, will help build a new mental habit of stopping and assessing to mitigate risk, communicating the implementation plan to others, if others are involved, and then executing the task. So especially for routine tasks, this creates the moment to consider what personal or environmental factors may be different from the last time I performed this task and could be adding risk.

Lastly, the organization needs to make human factors part of everyday conversation. In Flight Operations, we’ve nearly completed taking hundreds of people within our organization through our human factors training curriculum, and we’ve begun discussions on what follow-on steps should look like, because we believe that this cannot be left alone as a one-and-done conversation.

Host: Does the application of human factors look about the same across the wide range of NASA missions or are there noticeable differences?

McMurtry: I would say there are both similarities and differences, similar in that you can’t escape the application of human factors when people are involved. However, differences may exist in the types of human factors considerations and in how they help produce the best product. So, for example, in the design phase of a NASA mission, a question might be, “What requirements do I need to design for to make a system easy to remove and install, without creating a hazard for the user or risk of damage to the unit?” Whereas in the execution phase of a mission, a question might be, “Do all team members understand the objective, and is each person trained for the task?”

Host: What role do you think human factors will play in the future of engineering?

McMurtry: Well, the discipline of human systems integration, or HSI, is not a new aspect of engineering. It involves questions like, “How do I design features to prevent the end user from installing something incorrectly,” or, “How do I make features intuitive, so that there’s a higher probability of proper system function?”

It’s important, though, to build resilience into the engineering discipline as well. In working closely with our Armstrong Safety and Mission Assurance Organization, I’ve learned of a new bridge between human factors and engineering called resilience engineering. Where HSI focuses on creating and maintaining an ideal system performance environment, resilience engineering is about designing in the ability to respond to, and especially recover from, a disruptive event.

So, I previously mentioned that a key element of resilience is to be upfront about evaluating error in order to make improvements. However, this may also drive a tendency towards procedure and process rigor that, while reducing or stabilizing errors or accidents, carries the potential to lack the flexibility to respond to system changes that are inevitable in complex systems. And at NASA, we do a lot with complex systems. So, the value of resilience engineering can be exemplified through what’s happening in our local, national and global communities now. The key is moving away from process-driven decision making and cultivating a risk-informed decision-making process, so that process rigor can be situationally tailored.

Host: How has the emphasis on human factors changed the way you view your job and do your work?

McMurtry: My involvement in human factors has emphasized that experience or expertise doesn’t equal perfection. Mistakes will come. It’s your willingness to learn from those mistakes that decides if they will empower you or they will defeat you. It also emphasizes the second-order effects of our decisions that, as leaders, we don’t readily realize when we’re in the habit of reacting instead of reflecting. So, there’s a lot of power in that pause.

Host: This has been really fascinating and interesting, helpful information that you’ve shared with us today. I really do appreciate you being with us. Is there anything else before we close?

McMurtry: No. I really enjoyed reflecting on this with you today and sharing a little bit of perspective on what we’re doing at NASA Armstrong in the realm of human factors.

Host: You’ll find links to topics discussed during our conversation along with Kate’s bio and a transcript of today’s episode on our website at APPEL.NASA.gov/podcast.

If you have suggestions for interview topics, please let us know on Twitter at NASA APPEL – and use the hashtag SmallStepsGiantLeaps.

Thanks for listening.