Skip bugging to speed delivery (Part 2)

2011-6-9 18:10

[continued from Skip bugging to speed delivery (Part 1)]

Clearly you can't measure defect potential unless you track every bug found during development, whether discovered in an inspection or when Joe is furiously debugging his code. Since it's just a count, it's painless to track.


If your inspections aren't finding at least 60% of all the bugs, there's something wrong with the process. Tune as needed. 70% is excellent, and some companies achieve 80%.
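
As a quick illustration of that check (a sketch of my own, not from the article; the counts and names here are made up), inspection efficiency is simply the number of bugs found in inspections divided by all bugs found, so the two running counts from the previous paragraph are all you need:

/* Hypothetical sketch: inspection efficiency from two running counts. */
#include <stdio.h>

int main(void)
{
    /* Counts tracked over the project; example numbers only. */
    int bugs_found_in_inspections = 130;
    int bugs_found_in_debug_and_test = 70;

    int total = bugs_found_in_inspections + bugs_found_in_debug_and_test;
    double efficiency = 100.0 * bugs_found_in_inspections / total;

    printf("Inspection efficiency: %.0f%%\n", efficiency); /* 65% in this example */

    if (efficiency < 60.0)
        printf("Below 60%% -- tune the inspection process.\n");
    return 0;
}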


(To put more numbers around the process, Capers Jones showed in his paper "Measuring Defect Potentials and Defect Removal Efficiency" (Crosstalk magazine) that the industry averages an 85% defect removal efficiency [3]. At the usual 5 bugs injected per 100 LOC, that means shipping with 7.5 bugs per KLOC. Achieve 60% efficiency just in inspections and figure on 4.5 per KLOC. Jones found that the lowest development costs occur at a defect removal efficiency of 95%, or 2.5 bugs/KLOC. According to a vast library of data he sent me, projects run at 95% cost nearly 50% less to develop than ones idling along at 85%, yet they ship with a third as many bugs. And he calls an efficiency of 75% "professional malpractice.")
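
To make that arithmetic explicit, here's a minimal sketch in C (mine, not from the article or Jones's paper; the function and variable names are assumptions) that restates the 5-bugs-per-100-LOC defect potential as 50 bugs per KLOC and computes the shipped defect densities at the 85% and 95% removal efficiencies quoted above. (The 4.5 bugs/KLOC figure for 60% inspection efficiency presumably also credits bugs removed later in test, so it doesn't fall out of this formula alone.)

/* Illustrative sketch: shipped defect density from defect potential
 * and removal efficiency. Names are hypothetical. */
#include <stdio.h>

static double shipped_bugs_per_kloc(double defect_potential_per_kloc,
                                    double removal_efficiency)
{
    /* Whatever isn't removed before release ships with the product. */
    return defect_potential_per_kloc * (1.0 - removal_efficiency);
}

int main(void)
{
    const double potential = 50.0; /* 5 bugs injected per 100 LOC = 50 per KLOC */

    printf("85%% removal: %.1f bugs/KLOC shipped\n",
           shipped_bugs_per_kloc(potential, 0.85)); /* 7.5, the industry average */
    printf("95%% removal: %.1f bugs/KLOC shipped\n",
           shipped_bugs_per_kloc(potential, 0.95)); /* 2.5, the lowest-cost point */
    return 0;
}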


Inspections are irrevocably tied to the use of firmware standards. If a team member complains about a stylistic issue, the moderator asks: "Does this conform to the standard?" If the answer is "yes," the discussion is immediately closed. If not, the proper response is a simple "fix it." No debate, no expensive discussion. Don't use inspections and the team will surely drift away from the firmware standard. The two processes are truly synergistic.


Managers have an important role: stay away. Enable the process, provide support, demand compliance, but never use the inspections as a way to evaluate people. This is all about the code, and nothing else.


Managers must always ensure the team uses inspections and all of the other processes even when the delivery date looms near. Can you imagine getting on a plane, walking in the door, only to overhear the pilot say to the co-pilot: "We're running late today; let's skip the pre-takeoff checklist." What amateurs! Management can't let the threat of a schedule dismember proper processes... which, after all, shorten delivery times.


Inspections inspire plenty of debate, but few who have looked at the data doubt their importance. In today's agile world some advocate a less formal approach, though all buy into the many-brains-is-better-than-one theory.

The agile community buys into inspections. As Kent Beck wrote in eXtreme Programming Explained [5]: "When I first articulated XP, I had the mental image of knobs on a control board. Each knob was a practice that from experience I knew worked well. I would turn all the knobs up to 10 and see what happened. I was a little surprised to find that the whole package of practices was stable, predictable, and flexible." One of those knobs is code inspections, embodied in XP's pair programming.


The most complete work on inspections is Software Inspection, by Tom Gilb and Dorothy Graham [6]. It's a great cure for insomnia. Or consider Peer Reviews in Software, by Karl E. Wiegers [7]. Written in an engaging style, rather like the Microsoft Press books, it's an accessible introduction to all forms of inspections, covering more than the traditional Fagan versions. One of my favorite books about inspections is a free one available from SmartBear Software called Best Kept Secrets of Peer Code Review [8].

Then there's Software Inspection: An Industry Best Practice, edited by Bill Brykczynski, Reginald N. Meeson, Jr., and David A. Wheeler [9]. Unhappily it's out of print, but used copies are usually available on Amazon. The book is a collection of papers about successful and failed inspections. Buy it, but don't read it. When I can't stand the thought of yet another inspection I pull it off the shelf and read one paper at random. Like a tent revival meeting, it helps me get back on the straight and narrow path.


Inspections yield better code for less money. You'd think anything offering that promise would be a natural part of everyone's development strategy. But few embedded groups use any sort of disciplined inspection with regularity. The usual approach is to crank some code, get it to compile (turning off those annoying warning messages) and load the debugger. Is it any surprise projects run late and get delivered loaded with bugs?


But consider this: why do we believe (cue the choir of angels before uttering these sacred words) open-source software is so good? Because with enough eyes, all bugs are shallow.


Oddly, many of us believe this open-source mantra with religious fervor but reject it in our own work.


Endnotes:

1. Poppendieck, Mary, and Tom Poppendieck. Lean Software Development. Addison-Wesley, 2003.

2. Fagan, Michael, "Design and Code Inspections to Reduce Errors in Program Development," IBM Systems Journal, Vol. 15, No. 3, 1976, available on-line at www.research.ibm.com/journal/sj/153/ibmsj1503C.pdf.

3. Jones, Capers. "Measuring Defect Potentials and Defect Removal Efficiency," Crosstalk magazine, June 2008, www.stsc.hill.af.mil/crosstalk/2008/06/0806Jones.html.

4. Cohen, Jason. "Four ways to a Practical Code Review: How to almost get kicked out of a meeting," Smart Bear Software, www.methodsandtools.com/archive/archive.php?id=66

5. Beck, Kent. eXtreme Programming Explained: Embrace Change, Addison-Wesley Professional, 1999.

6. Gilb, Tom and Dorothy Graham. Software Inspection, Addison-Wesley, 1993, ISBN 978-0201631814.

7. Wiegers, Karl E. Peer Reviews in Software, Addison-Wesley, 2001, ISBN 978-0201734850.

8. SmartBear Software. Best Kept Secrets of Peer Code Review, available for free online at www.smartbearsoftware.com.

9. Brykczynski, Bill, Reginald N. Meeson, Jr., and David A. Wheeler (Editors). Software Inspection: An Industry Best Practice, IEEE Press, 1996, ISBN 978-0818673405.
