2012-12-27 20:38
As an editor for Embedded.com, I read a number of books about computer systems design and development. Usually they fall into two categories. First there are the ones that fulfil their immediate goal as useful guides for embedded systems developers in accomplishing their tasks. They have a lifetime that at best lasts as long as the systems they describe; while their information is current, I refer to them as long as they are useful in my job. Second are those books – such as "Algorithmics: The Spirit of Computing" by David Harel and "The Psychology of Computer Programming" by Gerald Weinberg – which go beyond the specifics of particular systems and also give me insights into how to think about problems: not just those relating to computer systems, but any system – social, behavioural, economic, environmental and biological. These are the books whose information and insights are timeless, and to which I return again and again to reread and think about.

When "Why Programs Fail: A Guide to Systematic Debugging" by Andreas Zeller first came out, I found it an extremely useful book in the first category. If that is as far as you go with it, it will be a very useful professional resource, because of its emphasis on the use of the scientific method in finding and correcting software bugs. When I read it again recently in its 2009 revised and expanded second edition, my perception of it had moved it into the second category. For me, it remains a useful resource on how to think about software programming and debugging. But whether or not the author intended it, I am this time finding it useful as a guide to thinking about how to analyse and debug the "human biocomputers" upon which each of us depends. Unfortunately, our thinking processes are infected with dozens of habits and modes of analysis that derive from ways of thinking we developed tens or hundreds of thousands of years ago as hunter-gatherers.
Appropriate then, those habits now play havoc with how we think about problems, and one of the most effective ways to deal with them is the rigorous use of the scientific method. The only chapter in the book that explicitly discusses the scientific method is Chapter 8, in which Zeller lays out its specifics and then applies it to each and every step of the debug process. That chapter alone would justify the purchase, but the spirit and application of the scientific method suffuses every nook and cranny of the text, with chapters on finding failures, observing facts, deducing errors, investigating causes and effects, isolating cause-effect chains, and learning from your mistakes.

But how will what you learn from this book about more effective software debugging help you learn to think more clearly and effectively? For me, the specificity of Zeller's examples, set in the context of computer program design and debugging, has given me some clear guidelines for using the methodology even more effectively in all aspects of my mental life. And, boy, do we need such tools, no matter how mundane and commonplace the situation, or how far afield from engineering and science. As cognitive scientists such as Dan Ariely, Daniel Kahneman, Gary Marcus, Keith Stanovich, and Richard Thaler have pointed out, the default "fast thinking" mode so useful in dealing with the problems we faced as hunter-gatherers is utterly useless in the twenty-first century. If we do not take a step back and practice the more intentional and slower analytic approach described by Kahneman in "Thinking, Fast and Slow," a variety of cognitive biases kick in automatically, no matter how well trained we are. In his book "Rationality and the Reflective Mind," Stanovich outlines a number of "mindware" methods and tools that can be used to counteract the fast-thinking biases hardwired into our brains.
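To give a flavour of the disciplined hypothesize-experiment-refine style the book teaches, the material on isolating cause-effect chains builds on Zeller's delta debugging idea: mechanically shrinking a failure-inducing input while an automated test keeps confirming the bug. What follows is my own minimal sketch of such a minimization loop in Python – a simplified, complement-removal-only variant, not the book's full ddmin algorithm, and the `test` predicate and example "bug" are illustrative assumptions:

```python
def ddmin(failing_input, test):
    """Shrink a failure-inducing input (a list) via delta-debugging-style search.

    `test(candidate)` must return True if `candidate` still triggers the bug.
    A simplified sketch: each pass hypothesizes that one chunk of the input
    is irrelevant, removes it, and keeps the smaller input if the bug remains.
    """
    n = 2  # granularity: how many chunks to split the input into
    while len(failing_input) >= 2:
        chunk = len(failing_input) // n
        reduced = False
        for i in range(n):
            # Hypothesis: chunk i is irrelevant. Experiment: test its complement.
            candidate = failing_input[:i * chunk] + failing_input[(i + 1) * chunk:]
            if candidate and test(candidate):
                failing_input = candidate      # observation confirmed: keep it
                n = max(n - 1, 2)              # coarsen the search again
                reduced = True
                break
        if not reduced:
            if n >= len(failing_input):
                break                          # already at single-element chunks
            n = min(n * 2, len(failing_input)) # refine: try smaller chunks
    return failing_input


# Example: the "bug" is triggered whenever the input contains both X and Y.
buggy = lambda chars: "X" in chars and "Y" in chars
minimal = ddmin(list("aXbYc"), buggy)  # -> ['X', 'Y']
```

Each iteration of the loop is the scientific method in miniature: a hypothesis ("this chunk does not matter"), a prediction (the bug still shows without it), an experiment (run the test), and a conclusion that either shrinks the input or refines the next hypothesis.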
For many who have received training in engineering and the sciences, such modes of thinking are familiar. But such hardwired biases are hard to displace, and even harder to identify once you are in the midst of one of them, no matter how well trained you are in the techniques of analytical and critical thinking. For example, in "The Signal and the Noise," a book on the application of probability and statistics to ordinary life, Nate Silver chronicles the stories of several knowledgeable researchers in weather forecasting, earthquake research and economics who were led astray by such biases, even though they knew better. Thaler, in "The Winner's Curse: Paradoxes and Anomalies of Economic Life," addresses the damage such cognitive biases can cause in the complex financial transactions that led to the debacles that pushed the world economy into its worst recession/depression since the 1930s. And if you want to see the damage such cognitive biases do to the political process, tune in occasionally to C-SPAN and observe the U.S. Congress in action.

Looking at the chaos such cognitive biases cause in our lives makes me wonder sometimes how the human race has survived this long. (And when I think about the efforts being made to create computers that actually think like us, I shudder.) The only way I can see to keep from contributing to such chaos is to apply, in a dedicated way, every mindware tool I can – including the scientific method – in every aspect of my life, even though it does not always make me a nice guy to be around. Although it was not the author's intention, because the book gave me a detailed, hands-on look at the use of the scientific method in a context with which I am very familiar (embedded systems programming), it has enhanced my understanding of how to use it effectively.
So whether you read the book for what it is – a resource that will contribute to your understanding of software debugging and analysis – or as a way to hardwire into your repertoire of mindware an effective antidote to the many thinking biases to which we are subject, you will come away satisfied. I certainly am: so much so that I have downloaded Zeller's book to my Kindle e-reader to reread, or listen to, whenever I have the need.