By Charles Tucker

“I tell people I’m a true geek,” Jay Pittman says, laughing. He’s driving on a two-lane strip of blacktop flanked by summer-green crops, heading seven miles southeast from the main base of Wallops Flight Facility toward a tiny barrier island off Virginia’s Eastern Shore, where the Wallops launch and research range stretches along a sandy strand of the Atlantic Ocean.

“I’m a computer science mathematician,” he adds, by way of explanation.

Pittman is also chief of the Wallops Range and Mission Management Office. Before taking that job in January 2002, he ran a systems software engineering group in the engineering directorate of Goddard Space Flight Center, where he led teams of civil servants and contractors providing “end-to-end” software services to Wallops missions.

“That was exciting!” he exclaims, as if to reinforce his self-described geekiness.

But if Pittman gets jazzed reminiscing about software engineering, it’s nothing on the order of his enthusiasm for his current post. “Honestly, this is the best job in the whole world,” he says.

We got compliments from our external review party … and something else—there was a constant reference to lessons learned from past missions as well.

How did a computer geek end up doing rocket stuff? The “end to end” comment tips his hand. “Even as far back as college”—he’s a Virginia Tech alum—“I really didn’t care that much about the software itself. What I really enjoyed was the process.” The getting there, from one end to the other.

Across the past six decades, Wallops—the only launch range owned by NASA—has been the site of more than 16,000 launches, from sounding rockets and balloons to orbital launches. By virtue of the facility’s small size, nimble and low-cost operations, and, to use Pittman’s term, “super-responsiveness,” the process is unique.

The TacSat-2 launches from Wallops Flight Facility. (Photo Credit: NASA)

In one four-month span, from December 16, 2006, to April 24, 2007, the Mid-Atlantic Regional Spaceport, Pad 0B, at Wallops Island was the site of two orbital launches. Both the TacSat-2 (Tactical Satellite-2) and NFIRE (Near-Field Infrared Experiment) missions were launched on Air Force Minotaur I rockets—TacSat for the Air Force Research Laboratory, NFIRE for the Missile Defense Agency. Both launched on schedule to the second.

Their success hinged on the ability of the project teams to get the missions off the ground quickly: seventy-two days for TacSat-2 from the time of delivery of the launch vehicle; forty-nine days for NFIRE. Achieving that quick turnaround depended on an apparently paradoxical type of project management—tight supervision and democratic participation—in a style that suits Wallops’s soup-to-nuts approach.

“Almost all our projects are concept to launch—end-to-end projects,” Pittman says. “It’s an extremely dynamic process.” Which makes the range chief and the range a perfect fit. “That’s one of the best things about this job: the opportunity to sort of sit in the midst of these project managers, to be responsible not only for watching over these projects as they get to completion but then, at the end of a mission, to get back in and sort of push out all the experiences to the other project managers in such a way that everybody gets better.”

For Pittman—the man managing the mission managers—the process is everything. Following a successful launch, it starts immediately all over again with the lessons learned from the mission that just concluded. From Pittman’s perspective, the success of the first Minotaur launch was “really only complete when we did it again with NFIRE. The lessons learned from TacSat-2 were a big part of the success of the follow-on mission. We kept those in front of us the whole time,” leading up to the second Minotaur launch four months later.

Befitting a computer science mathematician, Pittman takes a pragmatic, stepwise approach to the problem of converting lessons learned from a static collection task to a dynamic activity. He’s clear-eyed about the purpose of the process. And when he talks about lessons learned, he becomes animated, drawing out words for emphasis, in his native Virginian drawl.

“The key thing about lessons learned is that you have to put them in a context where they are visible and actionable. They can’t feel like a beating. And they can’t be so wispy as to be ignored. That’s the magic.

“If you think about all the reviews we do at Wallops, when we do a launch readiness review we generally have the same agenda whether we’re doing a Minotaur or a sounding rocket or whatever. It’s all the same stuff; it’s just a question of scale.

“In fact, we do exactly what Vandenberg does, exactly what the Eastern Range [Cape Canaveral] and everybody else does. What we do for lessons learned is, we categorize the lessons and actually stick them in a bucket that corresponds directly to a topic that has to be addressed at the major reviews. And we put that in the hands of the project managers—and also in the hands of the reviewers.

That’s one of the best things about this job: … to be responsible not only for watching over these projects as they get to completion but then, at the end of a mission, to get back in and sort of push out all the experiences to the other project managers in such a way that everybody gets better.

“So our review panel for TacSat-2 came in not only with the materials that they were going to review, but with very specific lessons learned about each one of the areas. And what happens is, there begins to be a dynamic between the project managers and the review panelists. So just by allocating lessons learned in this way, we ensure a personal dynamic is going to occur between the manager and the panel.”

But that’s just the beginning, says Pittman.

“There are ripples to this that are even more important,” he continues, “because that’s just how we get it through the review—and how you get it through the review is nothing compared to what you really need to be doing to do the work. Now we’ve created a process where the project team says, ‘Geez, why are you doing it this way?’ and the project manager says, ‘Well, I knew you were going to ask this. I don’t want to see us not learn this lesson.’ So now the project team members start to anticipate that the project managers are sensitized to these things, and they start doing them.

“That’s the theme. That’s the process. We’ve become almost obsessed with this idea that we’re going to proceduralize everything.”

To make the magic work—to really make it “actionable”—the trick is to make the lessons learned applicable.

“The real problem,” says Pittman, “is crunching down the relevant stuff and putting it in front of people and making it relevant to their jobs. When you do it like this, you have just vast re-use of best practices. And people become very sensitized to things that didn’t work, and the next time they say, ‘We’re never doing that again!’ You’ve sort of made it a stepping stone on a path that they normally walk.

“And when you do that, then you’ve achieved something.”

In his office back at the Wallops main base, Pittman scrolls through screen after computer screen of lessons learned inputs and reports for the TacSat-2 mission. The culmination of all this information is, among other materials, a 225-page presentation-style compendium of lessons learned. The document begins with a bar-chart summary of findings in nearly forty categories, from testing and countdown to range instrumentation, through mishap plan, budget, decision authority, and ground systems to safety, security, requirements, facilities, waivers, and so on. It includes both a summary of major trends and a detailed report for each of the categories. Each detailed report in turn has a lesson statement, an impact statement, a recommended action, and a response from the range and mission management office.

“Look at this!” Pittman says, staring intently at the screen. “We even learned stuff about waivers. There’s one waiver process that was so broken that we finished TacSat, and the day after we started the waiver process for NFIRE because it was just so whacked. Here’s the data behind all that.

“Or look at this. We didn’t have a good line of sight to the launchpad. It was obscured. So we put that into a category that would be applied to a review and recommended actions, then the team turned it into actions and we fixed the problem [for the NFIRE launch]. In fact, most of these were fixed sitting right here, when I’d call somebody in and say, ‘Apply some of your budget to fixing that problem.’ And it goes away. Ultimately, there were more than 200 of these that we then rolled into about fifty overall lessons.”
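The compendium structure Pittman describes—a review-topic category for each lesson, with a lesson statement, an impact statement, a recommended action, and an office response—maps naturally onto a simple data model. The sketch below is purely illustrative (the class, function, and field names are invented for this article, not drawn from any actual Wallops tooling), showing how records like these might be bucketed by review topic before being handed to project managers and review panelists:

from collections import defaultdict
from dataclasses import dataclass

# Illustrative model of one detailed lessons-learned report, using the four
# fields the article describes: lesson, impact, recommended action, and the
# range and mission management office's response.
@dataclass
class Lesson:
    category: str            # review topic, e.g. "Facilities" or "Waivers"
    lesson: str
    impact: str
    recommended_action: str
    rmmo_response: str

def bucket_by_category(lessons):
    """Group lessons into review-topic buckets that correspond to the
    agenda items addressed at a major review."""
    buckets = defaultdict(list)
    for item in lessons:
        buckets[item.category].append(item)
    return buckets

# A single hypothetical record, loosely paraphrasing the line-of-sight
# anecdote above; the exact wording here is invented.
records = [
    Lesson(
        category="Facilities",
        lesson="Line of sight to the launchpad was obscured.",
        impact="Pad status could not be verified visually during countdown.",
        recommended_action="Restore a clear view before the next launch.",
        rmmo_response="Budget applied; corrected before NFIRE.",
    )
]
print(list(bucket_by_category(records).keys()))  # ['Facilities']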

In all this enthusiasm for the process, it’s clear that Pittman takes particular pleasure in the democratic inclusiveness of the procedure: “We pride ourselves on the fact that we get lessons learned from everywhere. We get them from radar operators and security guards—those are the people who tell us, ‘You know what, you guys, this looked good in the review but it didn’t work on launch.’ And then we had to do this and that and the other thing.

“We took the [TacSat-2] launch team, put them in a room, and looked at how many lessons we got from the team. Are there any groups of people that we got no lessons from? Surely it wasn’t perfect in Security—where are our inputs from Security? And right on down the line.”

Transparent. Relevant and applicable. Not wispy, but not burdensome. On the Wallops range, the magic of lessons learned works. “On NFIRE,” Pittman says with some pride, “we got compliments from our external review party about the constant reference in the second mission to the TacSat lessons learned. And something else—there was a constant reference to lessons learned from past missions as well. A lot of what we did on NFIRE and TacSat, we did because we knew it to be the right thing for a sounding rocket.”

Now Pittman is off and running. With thousands of Wallops missions as a reference point, the Range and Mission Management Office chief is just warming up to the subject.

About the Author

Charles Tucker works with Dr. Edward W. Rogers, chief knowledge officer at Goddard Space Flight Center, on organizational learning and knowledge management initiatives using case studies of Goddard and other NASA missions.
