Paul McConnaughey discusses knowledge sharing at NASA’s Marshall Space Flight Center.
Paul McConnaughey serves as the Chief Knowledge Officer (CKO) at NASA’s Marshall Space Flight Center. He is also the Associate Director, Technical, supporting the Office of the Center Director at Marshall. He performs special studies, advises and assists in policy review, manages and reports on center-wide metrics, and develops new technical benchmarking strategies.
McConnaughey joined Marshall in 1986 as a Mathematician in the Systems Dynamics Laboratory and soon advanced to supervisory positions, including Team Lead, and later Branch Chief and Division Chief. In 2012, he became the Chief Engineer in NASA’s Exploration Systems Development Division, where he was the Lead Technical Authority on all technical and engineering matters.
Where does the knowledge management function reside at the Marshall Center?
Organizationally, it resides in the Associate Director, Technical Office, but the work is done by Jennifer Stevens and her group in the Chief Engineer’s Office. Jennifer leads the execution of knowledge services for the center, and a number of people support the activity from the perspective of knowledge capture, lessons learned and the Distilling Team, which includes members from the major organizations, programs and engineering. I work it more from the policy and general guidance perspective.
Who are the biggest users of the knowledge services in your organization?
It’s predominantly the engineering group, but there are a lot of pockets of groups that really do use it. A number of different groups in engineering are deeply engaged, and our Deputy Director for Engineering has really embraced knowledge sharing. One of our priorities is to develop communities of practice and communities of interest in various areas. It is an engineering-driven focus, which you might expect with engineers making up more than 50 percent of Marshall’s workforce.
The project offices have understood that learning their lessons and getting smart before they start is a really good approach. So we have pockets of groups that really do engage and try to figure out what lessons are out there before they start. Even in the support areas there are key groups that internally pause and learn to make sure they’re capturing lessons. I think we’ve really got a grassroots effort of capturing these lessons, and it’s just increasing and becoming more and more an organic part of the culture. And people in the major programs such as Space Launch System are trying to capture, ‘What did we do right? What did we do wrong? What can we do better next time?’
What are some of the most prominent knowledge challenges in your organization?
Capturing the corporate memory before it retires is the No. 1 challenge. I’m not sure we’re doing a great job of that. I think everybody’s aware it’s an important challenge. A lot of processes have been put in place to try and address it, but even at the engineer level, people retire, and we don’t always have a formal way to capture their knowledge. They take it with them, and sometimes they don’t write it down to flow to the next generation of engineers. Most of the time they do, but that’s an ongoing challenge.
Is there a specific project or set of lessons learned that influences your approach to solving problems?
I would say that, as opposed to what I call the general, broad lessons learned database for the agency, there are lessons learned from failure investigations or major studies by external commissions that I think are very important for us to go through and review as a comparison for our current activity. Great examples would be post-Columbia and post-Challenger — there were lessons learned. Post-Augustine Commission — there were lessons learned. The Constellation cancellation — there were lessons learned. One of the things I take quite seriously is taking all those lessons learned and making sure we don’t revisit them in real time in our current programs due to lack of rigor.
Another thing being done through the knowledge management system is the case studies that Jennifer Stevens is producing. I think those are incredibly valuable and a good method for transferring lessons learned in a very lucid and applicable way. I think that is an excellent means of communicating our lessons learned. It’s different from an external investigation or a failure report. The case study is more of an engineering application of lessons learned, the kind you’d study in undergraduate or graduate school. It’s an invaluable tool, and I’m excited to see it growing.
The third thing is the Distilling Team that’s been implemented at Marshall. It’s very useful in the sense that there is a local lessons learned structure. If we see an “oops” or “something happened,” that is immediately vetted by the Distilling Team to determine what the true lesson learned really is. And the Distilling Team is accountable for applying that across the organization and putting forth any resulting action at the Marshall and agency levels.
Do you have a favorite story or example of tangible benefits of knowledge sharing?
My last job before this one was Chief Engineer and SE&I lead at NASA Headquarters for integrating SLS. In order to check ourselves, we went through all the lessons learned from shuttle and Constellation and measured ourselves with respect to what we were or were not doing to follow the lessons from those two programs. Those were the nearest ones. One was very successful, and they chose to flow its lessons into policy. They did have a lot of lessons learned because there were a couple of major flight failures. The other one had programmatic challenges.
So we went through — literally one by one — all the lessons learned, compared how SLS was lining up against each program, and addressed each one of our concerns. We mapped all of them, and I think we were at about 95 percent coverage of the technical issues with lessons learned. The bottom line is that after looking at the scrub, the team had done a very good job of addressing those concerns explicitly. I think part of the reason is that both of those experiences were so recent and raw. With the closure of shuttle and the failure of Columbia, people within the agency still carry that corporate memory very close. And with the issues with Constellation, a lot of those people are still around, and so they shared that knowledge.
What are the biggest misunderstandings that people have about knowledge?
People seem to think, ‘If I build a database, they will come.’ We know that’s not true. And then people perceive lessons learned as, ‘Oh, I have to write something down at the end of my program and throw it into a database.’ And that’s true at one end of the spectrum. But I think the teams at the agency level go beyond that, and we need to close that gap with programs and projects across the board.