"You never know..." Many years ago, NASA distributed a space trajectory simulation called the N-body Program. It quickly became the standard for lunar and interplanetary studies and enjoyed a well-earned reputation for excellence among its users. Around 1963, NASA assigned a programmer at Goddard Space Flight Center to rewrite the program using the more structured constructs of Fortran IV. I was among the first to see a copy of the "new, improved" program. I was stunned. Oh, the program had modules, alright. But the calling lists were completely absent. When he set out to rewrite the program, the new guy had what was, to him, a Good Idea. Instead of the nicely structured arrangement of the original program, he'd take every single variable in the program and give it global scope. I said, "Um, it appears that you've moved all the variables to COMMON." "Yes," he acknowledged proudly. "Why would you do that?" I asked. He explained, "You never know when you might want to print something out." I thought that surely this resoundingly Bad Idea was an anomaly, not one I'd be likely to see again. But I was wrong. It's surprising—also depressing—how often the idea re-emerges. Only a few years ago, I ran across another Fortran simulation, a much-used program written by a highly experienced Ph.D. This guy not only had the same idea, but compounded it with another, even more resoundingly bad one. Dr. Seuss wrote the book, Too Many Daves . It describes "Mrs. McCave, who had 23 sons, and named them all Dave." It goes on to say: "And often she wishes that, when they were born, She had named one of them Bodkin Van Horn, And one of them Hoos-Foos. And one of them Snimm. And one of them Hot-Shot. And one Sunny Jim. And one of them Shadrack. And one of them Blinkey. And one of them Stuffy. And one of them Stinkey. Another one Putt-Putt. Another one Moon Face. Another one Marvin O'Gravel Balloon Face. And one of them Ziggy. And one Soggy Muff. One Buffalo Bill. And one Biffalo Buff. And one of them Sneepy. And one Weepy Weed. And one Paris Garters. And one Harris Tweed. And one of them Sir Michael Carmichael Zutt, And one of them Oliver Boliver Butt And one of them Zanzibar Buck-Buck McFate... But she didn't do it. And now it's too late." My colleague did something shocking similar, only it wasn't nearly as funny. He must have thought, "Here's a great idea: Let's move all 60,000 of the program's variables into global COMMON, and name them all x ." This is not a joke. I'm serious. He did exactly that. More precisely, he defined: Common x(60000) Then he equated elements of x to the real variables, so that: x(1) was really BodkinVanHorn , x(2) Hoos_Foos , etc. To make all of this work, he developed a preprocessor that would allow him to write his code in the usual way, giving the variables names with mnemonic significance. As it read the source file, the preprocessor assigned the user's variables to a unique element of x . Then it wrote a new source file, with every reference to every variable replaced by the equivalent element of x . The preprocessor also wrote a symbol table file, which my pal used later. He built a DOS batch file that moved the appropriate files around. Why would he jump through all these hoops, just for the chance to name all of his variables dave? I wondered too, so I asked him. He replied: "You never know when you might want to print something out." Sigh. I've thought about this program many times since. 
Haven't we seen other development tools that take names with mnemonic significance and assign array addresses to them? And don't they also build symbol tables? Of course we have, and they do. They're called assemblers and compilers.

Think about it. IBM's original assembler wasn't called Symbolic Assembler Program (SAP) for nothing. A large part of its job was to replace both instruction mnemonics and user variables with numeric values—opcodes for the mnemonics, RAM addresses for the data variables. Compilers do something similar.

In effect, this guy's preprocessor duplicated the work of the compiler, building its own symbol table and assigning each variable to a slot in the x array. Then his batch file would pass the new, "intermediate" source file to the Fortran compiler, which would do exactly the same thing, assigning every element of x to a location in RAM. Such a deal.

How could anyone have ever thought this was a Good Idea? It's because it satisfied his need to "print out" any variable in the program. He had written an interactive tool that ran in concert with the simulation program. Using the symbol table his preprocessor had built, he could select a handful (10 or 12, as I recall) of the 60,000 elements and display them in little text boxes. Think of the Display block of a Simulink program, and you won't be too far from the idea. If he chose, he could pause the program, single-step, and switch the selected variables for new ones.

For a production program, I can't see much value in sitting at my desk, watching the selected outputs spin by like odometer readings, usually too fast to read. I suppose my pal could look for special events, such as the altitude going through zero, but other than that he'd only be able to see, in "real time," the coarsest of trends.

Looking back, I think he used the display tool mostly for debugging. In effect, the interactive display tool served as the symbolic debugger that the compiler vendor hadn't provided. Having all the program variables in a single, global array meant that my pal didn't have to depend on the compiler to give him the RAM location of each variable.

But I'm still not convinced that this approach gives much bang for the buck. When I'm debugging, I need to see the exact numeric values of the variables of interest, so I can compare them with the values predicted by hand calculations. I can get those values with the primitive mechanisms called debug prints.

When I'm debugging simulations these days, I like to use graphs. A picture is worth a thousand words, and although I can't measure exact values from a graph, I can at least see that things are going in the right direction, and I can see it much better than if I watch display dials spinning by. But here again, I don't need a special tool to get the graphs. All I need to do is write the selected variables to a file, then use an offline graphing tool to draw the pictures. Matlab can do this, as can Mathcad or Excel. (A sketch of that workflow appears below.)

In the end, this guy's desire to be able to "print out" any variable he chose, and watch it evolve before his eyes as the program ran, completely dominated the design of his program. To achieve this end, he was willing to completely destroy the structural integrity of his program, all the while creating tons of gratuitous complexity, generating two major applications of dubious value, and violating the KISS principle all to heck. He was very, very proud of his Great Idea and his accomplishments in getting it all to work. I, in contrast, thought it was a monumentally Bad Idea, squared.
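For what it's worth, here is a minimal sketch of that write-to-a-file-and-graph-offline approach, under my own assumptions: a toy free-fall integration, a hypothetical trace.dat file, and matplotlib standing in for Matlab, Mathcad, or Excel.

```python
# A minimal sketch (not his program): the simulation appends the variables
# of interest to a plain text file each step, and an offline tool draws the
# graphs afterward. File name, variables, and step size are hypothetical.

import matplotlib.pyplot as plt

dt = 0.01
t, alt, vel = 0.0, 100.0, 0.0

with open("trace.dat", "w") as log:
    while alt > 0.0:
        vel -= 9.81 * dt          # crude free-fall integration
        alt += vel * dt
        t += dt
        log.write(f"{t:.4f} {alt:.4f} {vel:.4f}\n")

# Offline: read the trace back and plot it. Matlab or Excel would do
# just as well; the point is that no special in-program tooling is needed.
times, alts = [], []
with open("trace.dat") as log:
    for row in log:
        ti, ai, _ = map(float, row.split())
        times.append(ti)
        alts.append(ai)

plt.plot(times, alts)
plt.xlabel("time (s)")
plt.ylabel("altitude")
plt.show()
```

The particulars don't matter; what matters is that a plain text file plus any off-the-shelf grapher gives the same visibility without touching the program's structure.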
Needless to say, disagreements ensued. Of all the mistakes one can make, holding onto a bad idea just because it's your bad idea has got to be the worst idea of all.