By Kerry Ellis
On January 8, 2007, twenty-three participants in Glenn Research Center’s Space Mission Excellence Program met for the mock preliminary design reviews they had been preparing for the past few weeks. When Harvey Schabes, one of the guest review board members, asked after one presentation, “Why did you pick the most expensive options each time? Is there not a risk of running out of money on this?” participant Leah McIntyre responded with a smile in her voice, “Because we want the mission to succeed.” The instructor, Joel Sercel, replied quickly, “The right answer is, you tell the board, ‘When we engaged with you and the key stakeholders you identified, you all told us cost was not an issue. What you care about most is risk and schedule. These figures reflect that.’ At that point, you find out whether those really were their criteria. That’s the power of this methodology.”
For more than sixty years, Glenn has excelled at research in aeronautics; the generation, storage, management, and distribution of power for space systems; electric, nuclear, and chemical propulsion; communications; and microgravity science. Glenn has also played a key role in developing engines and rockets that have launched NASA missions into space, but over time it has lost experience creating aerospace systems that will operate outside Earth’s atmosphere. As the Agency turns to the Constellation program and the new Vision for Space Exploration, it is looking especially for the skills required to create new space vehicles. Senior management at Glenn realized they needed new types of experts—systems engineers—to make a significant contribution to Constellation.
In March 2006, Glenn Deputy Director Richard S. Christiansen asked the center’s Organization Development and Training Office (ODTO) to help find and prepare candidates for the challenge ahead. The office responded by creating the Space Mission Excellence Program, also known as SMEP, which is currently led by training specialists Kathy Clark and Adam Ross. “We needed to produce highly trained systems engineers in a short amount of time, and we were creating a program from scratch,” Clark explained.
Through courses that would offer hands-on experience and in-depth guidance from seasoned systems engineers, SMEP would strive to provide Glenn with systems engineers who possess leadership and communication skills as well as technical expertise. “It is important for systems engineers to be able to work with the project as a whole, including the human interaction involved,” said ODTO Chief Cindy Forman. They also wanted to expose program participants to new and different perspectives on systems engineering and help them recognize and define problems and potential solutions.
The program would also provide participants with mentors, but in an unusual way. Martin Forkosh, SMEP’s workforce development manager, explained that the program differs from other training programs because it assigns participants to a variety of projects as systems engineers. “We take them out of the classroom and provide them an avenue for real, hands-on experience,” Forkosh said, and he played a key role in making this lofty goal a reality. “We wanted a workforce development manager who had contacts across the Agency, was an expert in his field, cared about training, and could find projects for participants to join,” said Ross. “Martin had all that and was previously chief of systems engineering, so he understood what we needed from candidates to get them reassigned.”
With a bit of trepidation—because they were uncertain how the experienced workforce would respond to taking on “green” members and helping them learn the ins and outs of systems engineering—the training team began asking program and project managers if they would enlist the SMEP recruits for their projects. “The response from project managers was huge; it was really quite fantastic,” Forkosh said. Many eagerly took on the SMEP members and placed them on projects to work and gain hands-on experience. All participants work on these projects while they are taking in-classroom training, applying what they learn from both avenues of instruction.
The participants come with a variety of backgrounds and experiences. Charles Farrell spent thirty years in fluid dynamics, software development, management, and software engineering. He joined SMEP because he felt systems engineering best practices were more advanced than those in software engineering, and he wanted to apply what he learned back to software engineering. Kathy Shepherd was previously a project manager for the Exercise Countermeasures project, part of a program at Johnson Space Center to study the effects of exercise on astronauts in space. She was looking for a change and wanted to expand her knowledge of NASA, referring to herself as one of the “new kids on the block” because of her six-year tenure with the Agency. James Scott spent twenty-three years doing research in aeronautics and acoustics and had reached a point in his career where he wanted, and could make, a transition. “The mechanism to make such a change had never existed before, so I was excited about the program,” he said. Another participant started down the systems engineering path two years ago; others are still acclimating to a shift made only a few months ago.
Together the participants work on elaborate assignments that expose them to areas of systems engineering they may not have experienced yet in person. In early January 2007, I was invited to observe the outcome of the first of these assignments: a mock preliminary design review (PDR).
Engineer for a Day
The first thing that struck me when I arrived for the PDR was the sense of community in the room. Conversations started up as people began filtering in at 7:30 a.m., delivering donuts, homemade brownies, and other fresh-baked goods to a small table. Cheery “good mornings” peppered the air as people caught up and reviewed their notes before presenting the findings of their PDR.
As a non-engineer with a limited understanding of what goes into building spacecraft, I was concerned about my ability to follow the class and understand what was being presented. When I shared my apprehension with a couple of students near me, they candidly said they hoped they could answer my questions because they were also new to this and learning so much themselves. Reassured, I asked the most basic—and important—question I needed answered: “What exactly is a preliminary design review?”
The easiest way to define it is as a comprehensive, extremely detailed sales pitch. Participants were divided into two teams as if they were contractors competing for a $170 million project that would span six years. Each team would have about three hours to make its case before a review board about why it should be awarded the contract. For comparison, one participant told me that the actual PDRs for a couple of racks flying on the Space Shuttle took four days to present and probably six to nine months to create.
A PDR includes a massive amount of information. In addition to meeting the client’s requirements, a contractor needs to show how it will garner resources to build the spacecraft and operate the mission, predict potential risks and explain how they will be reduced, project the overall budget and timeline, define critical milestones, and much more. Most of the content presented in a PDR paints the big picture and provides supporting details about what is needed to build, launch, and operate a long-term mission and to relay and analyze its data—important information for a systems engineer to understand, since presenting and defending the project requires a grasp of the whole and of how its parts fit together and affect each other. To prepare for their PDRs, participants relied on requirements distributed in their previous class and on information in documents or on the Internet from real PDRs of projects similar to the hypothetical one they were assigned.
The review board—there to test participants’ understanding of the process, ask pointed questions, and deliver immediate feedback—included Caltech course instructor Joel Sercel, Engineering Development Division Chief Dan Gauntner, and Center Operations Deputy Director Harvey Schabes. Ricky Shyne, Deputy Director of the Engineering Directorate, also welcomed the participants on the first day. As students from each team presented their sections of the PDR, time was reserved at the end for the review board to ask questions and make observations, detailing what had been done well and what had been missed or could have been done better. Participants were often asked to respond to questions first as if they were in a real PDR and then again as students, to clarify what they were learning and what would happen (or should happen) in an actual PDR.
The practice also underscored the importance of predicting, defining, and mitigating risk to ensure teams had plans of action for any contingencies. After James Scott presented his team’s risk findings, Sercel clarified some of the confusion about identifying risks: “A risk is an event or condition that has a probability and a consequence. Cost overrun is not an event, but a consequence. Meteor damage to the antenna, however, is a well-defined risk. If a risk can be applied to any project, it is too general and therefore inaccurate.” Gauntner elaborated with a quick tip to help participants delineate the difference: “To define a risk, I find it helpful to start with a statement and then ask ‘why?’ five times to find the root cause. That is your risk.”
Stories from personal experience bolstered the generalizations. To help define an acceptable probability for a risk, Sercel shared a story about a JPL mission that was developing a composite propellant tank. The engineering team for the tank guesstimated that there was an 80 percent chance the tank would be delivered on time, “which was completely unacceptable,” Sercel said. So the team fully funded a backup titanium tank in parallel.
Interaction among the participants and with the review board members was open and honest, with students challenging some of the board’s feedback and even inspiring some friendly debates among the board members about engineering requirements, the best way to accomplish goals, and new approaches to standard PDR requirements. Student-to-student conversations sprang up during the question-and-answer sessions as they discussed details and incorporated the board’s feedback into upcoming presentations. Though the exercise had been playfully described as a competition between contractors, it was obvious from the exchanges that there was one community in the room.
Each team learned from the other. Over lunch they discussed the differences in their approaches and what they had picked up from one another, arriving at a cohesive understanding together. After the presentations were completed, the review board assigned a few “requests for action” and asked the teams to revise some slides for the next day to ensure they understood the feedback they had received the first day. The next morning, the line between teams dissolved further as the entire group worked together to revise portions of their presentations. Gauntner observed these interactions as well, saying at the end of the exercise, “Everyone showed remarkable teamwork and camaraderie, which will be a benefit for exploration.”
The participants bolster the PDR exercise with their new hands-on experience and carry lessons learned from the mock PDRs back to their projects, creating a rich, ongoing educational experience. The students also discovered they were using their previous work experience in ways they had not anticipated. Shepherd said that her project management experience helped her see the whole picture that systems engineering requires. Scott said he was surprised to find himself using so much of his research background, since systems engineering is a completely different discipline. He also shared with me a realization we’d both come to that day: “You just can’t build spacecraft without systems engineering.”
Future Generation
SMEP has more experience and mentorship in store for its participants. By the time the program concludes at the end of 2007, Glenn’s Organization Development and Training Office hopes its program of formal training, hands-on experience, and in-depth exercises will have helped the next generation of systems engineers contribute to the Vision for Space Exploration.