CONNECTING WITH QUALITY

Placing the Quality Bet

by Alan S. Koch, PMP


Abstract
Most of us have heard the logic: Invest in quality early in the project, and it will pay back later. Specifically, if we take the time to do peer reviews of designs and code, we will save more time during testing than the reviews cost.

But when it's time to plan a new project, there is always a reason why that logic just doesn't seem to apply. The schedule is too tight. The customer's demands are too great. The project team is not ready. Many things conspire to convince us that although reviews are a good idea, we just can't afford to do them this time.

Regardless of how compelling they seem to us, these "reasons" are illusions. And the only way to prove it to ourselves is to sit down at the table and place our bet on reviews.

Recognizing the Illusion

We would love to do the "right" thing quality-wise, but there never seems to be a good opportunity to try. Where are we to find the time to add reviews to an already ridiculously tight schedule? There is too much work and too little time. Adding yet another task seems impossible.

But this lack of available time for reviews is an illusion. The time is right there in our schedule, begging for us to see it. Where is it? The published data plainly shows that the time spent on early reviews saves more time than it costs. In other words, the time we need to do reviews is already in the schedule -- under the heading of "Testing."

That's right! If we need a week to do design reviews, we can just steal it from the testing phase! Need another week for code reviews? Take that one from testing as well. By doing this, we have added reviews to our plan without having any effect on our end date at all!

Delaying the Coding Work?!?

Wait a minute! Now we are talking about delaying coding while we review the designs. How can delaying the central work of the project possibly be a good thing?

As counter-intuitive as this seems, delaying coding is a smart thing. If we normally dive directly from design into coding, then we are coding the system based on defective designs. Sometimes we discover the design defects while coding, and the impact of those problems is some rework of the code. But many of those design problems make it into testing, and when they do, fixing them usually means ripping the code apart and rebuilding whole sections of it.

A design review will enable us to find and remove most of the design defects before our programmers write any code! That means they can avoid much of the usual rework, since they are not coding to defective designs. How much? Organizations taking this approach have reported that they avoid more hours of rework than they spend on the reviews.** In other words, there will be a net schedule gain from the design reviews: some of it during coding, and the majority of it during test.
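To see the arithmetic behind that claim, here is a minimal back-of-the-envelope sketch in Python. Every number in it is an assumption chosen for illustration, not data from any of the reported studies; the point is only the shape of the calculation: hours of rework avoided minus hours spent reviewing.

    # Rough model of the net schedule effect of design reviews.
    # All numbers are illustrative assumptions, not measured data.
    design_review_hours = 40        # one engineer-week spent reviewing the designs
    defects_found_in_review = 20    # design defects caught before any code is written
    rework_hours_per_defect = 4     # assumed avg. rework had each defect reached coding/test

    rework_hours_avoided = defects_found_in_review * rework_hours_per_defect
    net_schedule_gain = rework_hours_avoided - design_review_hours

    print(f"Hours spent on design reviews: {design_review_hours}")
    print(f"Rework hours avoided:          {rework_hours_avoided}")
    print(f"Net schedule gain (hours):     {net_schedule_gain}")

With these assumed figures the reviews pay for themselves twice over; plug in your own project's numbers to see where the break-even point lies.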

Yes, delaying coding so we can review the designs will have a positive impact on our schedule!

Delaying Testing?!?

But coding isn't the only thing we are talking about delaying. We are talking about delaying testing even more so we can review the code! There is never enough time to fully test the product, and we are talking about starting test even later than we usually do! Testing is critical to the quality of our product. How can it be smart to start testing so much later?

Consider, though, why testing takes so long. What do we spend most of our testing time doing? If all we did during testing was to run each test once, it wouldn't take very long at all. But instead, we spend most of our time stomping on the bugs.

A tester stumbles across a problem. Repeats it. Documents it. And manages the problem report. A developer investigates it. Reproduces it. Investigates more. Runs the debugger. Reads the code. Fixes the problem. Tries it out. Fixes it more. Force-fits some new code into the system. Re-designs. Is overjoyed to discover it works. And marks the defect report "fixed". The tester then retests the problem to confirm that it was fixed. Often it is fixed (close the defect report). But sometimes it is not (start all over again)!

Reviewing the code allows us to remove most of the defects before testing even begins. Plus, it is easier to fix a defect found in a review than it is to fix the same defect found in test.

We just talked about all of the effort that goes into documenting, diagnosing, and fixing the defects found during test. Most of that effort stems from the fact that a test generally does not tell us what is wrong with the code; it only reveals a symptom. Much of the work described above is directed toward diagnosing that symptom to discover the actual problem. In contrast, a review finds exactly what is wrong with the code. No diagnosis and little investigation are needed. The programmer can see immediately what is wrong and can move directly to correcting the problem, saving significant time on most defects.

Yes, delaying testing so we can review the code will also have a positive impact on our schedule!

Short-Changing Testing

Even if the economics tell us that it makes more sense to do the reviews, how can we justify further eroding our testing time?

On most projects, we have less time available than we need. Therefore, we must make compromises in order to get the project done by its deadline. The smartest way to make those compromises is to give priority to the highest-value activities, even if it means cutting back on the less valuable ones.

It is critically important that we assess the quality of the products we are building and remove the defects we find. Testing and reviews are both methods for doing this. So in order to decide which of them should receive priority, we must look at the relative value of each.

We can count the number of defects that our team finds and fixes in testing and in reviews, and we can count the number of engineer-hours that they spend in those activities (including the finding, the fixing, and the re-checking to be sure the fix worked). The activities that remove more defects per engineer-hour spent are higher-value than the others.

Organizations that have made these measurements uniformly report that reviews remove far more defects per engineer-hour than testing.** In fact, the efficiency of reviews is measured in "defects per hour," while testing efficiency is measured in "hours per defect"! That is, reviews result in multiple defects being found and fixed for each engineer-hour spent, while multiple hours of engineers' time are required to find and fix the average defect found during test. This difference is significant. So if we must cut corners on quality-related activities, it is better to cut them in the lower-value testing than in the higher-value reviews.
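As a concrete illustration of that comparison, here is a small Python sketch of the efficiency calculation described above. The defect counts and hours are hypothetical, chosen only to show the metric; they are not anyone's reported results.

    # Defect-removal efficiency: defects found and fixed per engineer-hour spent.
    # The tallies below are assumptions for illustration only.
    def defects_per_engineer_hour(defects_removed, engineer_hours):
        """Higher values indicate a higher-value defect-removal activity."""
        return defects_removed / engineer_hours

    review_defects, review_hours = 120, 60    # finding, fixing, and re-checking in reviews
    test_defects, test_hours = 80, 400        # finding, fixing, and re-testing in test

    print(f"Reviews: {defects_per_engineer_hour(review_defects, review_hours):.1f} defects per hour")
    print(f"Testing: {defects_per_engineer_hour(test_defects, test_hours):.2f} defects per hour")
    # With these assumed tallies, reviews remove 2.0 defects per engineer-hour,
    # while testing removes 0.20 -- that is, about 5 engineer-hours per defect.

Whatever your own numbers turn out to be, the activity with the higher defects-per-engineer-hour figure is the one to protect when the schedule forces compromises.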

Placing Your Bet

This is all very nice and logical, but in order to find out if it works on your projects, you must take a chance on reviews. It is time for you to take your seat at the table and place your bet. Take a deep breath, place a stack of engineer-hour chips on "Peer Reviews," and let the dice roll! Many project managers before you have placed the same bet, and found that it pays off nicely.

Changing your approach to quality can feel like a huge gamble. But when you follow the lead of so many others who have succeeded, you will find that you are not really playing a game of chance. In fact, rather than a roll of the dice, it will be more like a card game -- one where you have been counting the cards, and you know that the next one up will complete your royal flush.

That ace is right there on the top of the deck. Get in the game, because that is the only way to win!


** Note: See, for example, "Advances in Software Inspections" by Michael Fagan (IEEE Transactions on Software Engineering, July 1986), which cites several examples of data reported by companies that employ software inspections. Also worth reading are the SEI's page on software inspections and the dozens of articles and papers included in its references.



