By Dr. Gerald Mulenburg
The email was addressed not only to me but to the entire Project Knowledge Sharing Community at Ames Research Center. We were invited to sit in on a major project review as a new experiment in knowledge sharing. This first-of-its-kind opportunity was conceived by Claire Smith, who leads the knowledge sharing program, heads the Center’s Project Leadership Development Program, and serves as coordinator of the APPL-West program at Ames.
The objective was to offer Ames project practitioners the opportunity to observe project-review processes as they happen. Not that I haven’t participated in my share of project reviews, but this seemed like a great way for me to get up-to-date about a new project, the Kepler mission, and to experience a review from a new perspective.
Typically, when you’re being reviewed, it’s difficult to see what’s happening objectively — the same way it is on a project. Presenters are always thinking, “Okay, what’s on my slides? How much time do I have left? What are they going to ask me?” So when Claire’s email pinged on my computer, I quickly responded by asking her to save a place for me.
It was to be an informational review about progress on the project: what the team had done, where they were going, and what they needed to do to get there. The project team included people from all over the United States, and this was the first time representatives of all aspects of the project had come together.
For our part, as observers, we were asked to abide by a couple of rules: Don’t ask any questions, and don’t talk about the specifics of what we saw or heard. The idea was that we weren’t supposed to be noticed. We weren’t to buzz around and bother people. Hence the name for this experiment: Fly on the Wall.
I got there early because I wanted to find a seat without disturbing anyone. By the time the review got underway, there were probably about fifty people in the room. The main members of the review board were at a large table. Subject Matter Experts on the project were seated in three rows on one side of the table. Many of the people in the room never spoke. Some of them could have been observers like me, but I don’t know that. I may have been the only one to observe, although I hope not. It was a remarkable experience.
Project Manager Chet Sasaki from JPL kicked off the review by introducing the Deputy Project Manager, Larry Webster from Ames, and the Principal Investigator, William Borucki, also from Ames. I was impressed right away that the meeting started on time, and then stayed on time, even finishing a little ahead of schedule for the morning session that I attended — this in spite of a lot of discussion of management strategies and who had what responsibility.
One thing they did that I thought was unique: one person at the table was appointed ombudsman. His job was to cut off discussion when it was more appropriate to take a conversation off-line. In the past, I had seen people take on a role like that at reviews out of frustration, but in this case it was a designated role. This person cut off discussion several times during the meeting, always in a polite, professional manner, often at the request of one of the participants in the discussion, and it worked quite well. Everyone deferred to that person’s judgment when it was time to move on, and that was an important reason the meeting kept on schedule.
The review was structured well, too. No fewer than eight separate functional organizations across the U.S. with integral roles in the project attended the meeting: Mission Operations, the Science Office, Mission Management, the Flight Planning Center, the Flight Control Center, the Science Processing Center, the Data Management Center, and the Deep Space Mission System. Representatives from participating international organizations also attended.
One person ran the presentations, handling the transition from one presenter to the next, and actually taking part in the presentations. Most of the reviews that I’ve been involved in have been just a constant series of slides — somebody talking for a while and then moving on to the next person, and so on, and so on. The Kepler review was much more interactive. They stopped after every presentation and said, “Okay, time for questions.” Each time a question was asked, it was decided immediately whether the question was appropriate for the entire group and, on several occasions, a question was deferred to be taken up later by a relevant group. In addition, the ombudsman had to announce a couple of times, “Okay, it’s time to move on.”
I’ve been in a lot of meetings where discussions have spiraled out of control and, before you know it, you’re way behind schedule. These folks had their agenda down precisely. I told the facilitator afterwards how impressed I was at what they’d accomplished in the four hours I was there and how smoothly it went. He said, “Yes, but you didn’t see the pre-runs that we did before we came into the room, and the things we cut out that we felt we could do away with.” So, they had done an excellent job in their preparation to make sure that everything fit in the time available.
The only thing I found that didn’t work well was a minor set-up detail. They had used pushpins to hang huge sheets of paper with diagrams and information on the walls. With the seats arranged as they were along the wall, people pushed their chairs back into these charts, and the charts started to fall down. It got to be a little annoying because it was noisy and it disturbed the presenters, although they did their best to ignore the distraction. When we saw this was going to continue to be a problem, a couple of us grabbed a handful of additional pins and fastened down the sheets.
I was under one particularly defiant chart that kept falling on me. Except for that, being a fly on the wall was a safe experience. Seriously, it provided an interesting perspective of the presentations going on and the interactions in the room. I believe Claire has hit on a simple but extremely valuable knowledge-sharing technique that can be easily duplicated with other projects at other centers.
As a matter of fact, I think senior managers should require their younger managers to observe a review like this before they find themselves on the hot seat. There is so much to learn simply by listening to what’s going on. In addition, capturing some of the tips from observers and sharing them with project managers and teams across the agency could be another high-potential outcome of a Fly on the Wall.
My hat is off to the Kepler Mission team for their thoroughness, professionalism, and focus, but also for their cooperation in this helpful and important experiment.
Tips from the Kepler review
Some useful practices that I picked up from being an observer:
- Not introducing everyone in the room at the beginning of a meeting, but sticking to the key players at the table. Participants who gave parts of the presentation introduced themselves at that point, and others gave brief introductions when they contributed to the discussion related to their specialty. Some of these people were high-level representatives from other government agencies and participating companies who didn’t seem to mind not being introduced initially.
- Clearly stating the purpose of the meeting at the beginning, and even more importantly, clearly stating what the meeting “was not.” This set the stage for the efficiency of the meeting, and I am sure reduced the number of distracting comments that often come up in these types of meetings.
- Assigning someone at the table (strong enough to do it) as an ombudsman to cut off discussion when it would be part of a later presentation (not relevant now), wasn’t contributing much (those who love their own voice), or was more appropriate for off-line conversations (people who just can’t let go, but might have something important to say).
- Posting a roles-and-responsibilities matrix with the key organizations across the top as column headings, and functional elements in rows down the left-hand column. The intersecting row-column blocks in the matrix clearly stated which organizations were responsible for what in each of the functions.
Read more about reviews:
Implementation Reviews