Implementation Reviews

By Gerald Murphy

The biggest challenge in managing science instrument development, or any new technology development for that matter, is getting the project completed on schedule for the money you have. Few project managers accomplish that, despite what they might tell you.

[Image: Solar Wind Ion Mass Spectrometer]

It just doesn’t happen, and it’s easy to understand why: technology development doesn’t have a predictable path. You haven’t built this thing before, so how the heck do you know how much it’s going to cost? Besides, you can’t foresee all the problems you’ll run up against. You know the result you want, and you declare success when you are “close enough.” In short, the job must be “dynamically” managed.

When I worked on the Advanced Composition Explorer (ACE) project, we needed to produce five instruments that were either entirely new or considerably modified from earlier models. Each was an $8-10 million instrument, and all of them were what I would call technically risky in one way or another, some in several ways.

Some of our problems early in the project derived from not understanding exactly what the instruments were intended to do (what was going to be good enough), and not knowing what we could do to help the university-based teams in building them. We in the payload management office took the approach of asking each team, “What do you need in order to get your job done, and how can we make that happen?”

As a cure for this problem, one of the things we decided to do was hold Implementation Reviews. I had never been on a project before where this was done, but it turned out to be the single most valuable review we had from the standpoint of project success.

Typically, reviews are design-focused. In point of fact, many of a project’s problems are not due to design flaws. They are due to implementation flaws—if the implementation doesn’t have a good “design,” it will not be executed smoothly.

[Image: Solar Ions Spectrometer]

When I use the word “implementation,” I mean it in the broadest sense: implementation of the design and manufacture of the instrument. And I don’t just mean taking a look at schedules and money; I mean looking to see, as well, if you have the right team, if the team is assembled in such a way that the lines of responsibility make sense, if the interfaces are clear and easily defined.

Do you have margin for error? Where are the technical risk items and what is your plan to deal with them? Who is responsible for what? How many engineers do you have on this job and do they have the right experience? Oh, you have five engineers? Well, I only see three engineers in the room; where are the other two? “Well, they actually work for Joe Blow, a scientist down the hall. Joe has promised me that a year from now, when I need the engineers, I can have them.” Yeah, right, but what happens if Joe decides he needs them in a year? They actually work for him, right?

Here’s another example: On one project, an instrument team partners with a team from the European Space Agency (ESA). A foreign scientist there tells his American counterpart, “I can give you an electronics board or part of your detector system and it will do all these wonderful things, and you won’t have to pay for it out of NASA’s budget because ESA will pay for it.” The American scientist says: “That’s great; that makes my instrument cost a half million dollars less.”

But what happens, a little way down the line, when ESA is a little slow to fund its part of the project, or erratic currency exchange rates cause a financing problem or a new tariff regulation prevents the transfer of technology from one side of the Atlantic Ocean to the other? These are examples of implementation questions. It may still turn out that having ESA supplement the program is the right thing to do, but you have to ask the questions.

The point of the implementation review is to prevent problems from occurring later by trying to get our arms around the planning from the start. Our discussion might go like this: “Well, look, instead of having that scientist across the ocean be solely responsible for delivering this critical element, maybe we can find some other way to get it.” Or we might decide to fly there and observe first-hand how well our counterparts are doing, and if there is something we can do to help assure success.

For the ACE project, we traveled to each partnering institution. The process took several months because we would camp out onsite for three days at each stop. We sat around the table together, listened to presentations, and figured out how we were going to get the instrument built and delivered. We found the holes and, together, looked for ways to plug them. We tried not to let optimism fool us. The size and composition of review teams were tailored to the places we went. It was always tricky putting together just the right team, but Al Frandsen, our payload manager, was good at that, and we managed to find the expertise we needed.

[Image: Solar Wind Spectrometer]

The review teams turned out to be between five and eight people, balanced across the different disciplines, and they included the payload group (i.e., Al Frandsen; Howard Eyerly, our Reliability and Quality Assurance Manager; and me). Say we knew that a team was having a problem making its detector meet launch load requirements. We would grab somebody from JPL who could solve it in a week instead of letting the instrument team spin their wheels for six months. In addition, we would typically bring someone from Goddard who was good at understanding resources and estimating actual costs.

The implementation review happened only once at each site, but it was a big deal. I would say it was the most important thing we did to enable ACE to deliver on schedule and within budget, because we recognized and dealt with potential problems before they became unmanageable and costly.

Since then I have seen several projects that would have benefited from this review. It is important that it take place at the right time. You have to understand your requirements, your schedule, and your other resource constraints. You also have to understand where you have flexibility. If the review is too early, it is not beneficial; if it is too late, you are buried in trying to solve the problems of the day instead of being ahead of the wave.

Implementation reviews do one other thing. They set the tone for management of the project. They establish a teaming relationship (if they are run properly), and they level the playing field instead of setting up turf wars.

About the Author

Gerald (Gerry) Murphy founded Design Net Engineering in 1996 as a network of senior consulting engineers serving as “problem solvers” for NASA missions. The network evolved into an aerospace hardware/software development company providing design and fabrication services to businesses, universities, and government agencies. About his world since leaving NASA, he says, “Managing with flexibility is still my paradigm. In fact, it works in small business environments even better than when you are managing the standard NASA project. In either case, the ground under your feet is always in motion, and fog in the road up ahead never quite ‘clears.’”
