The previous article about "needing them stinkin' requirements" generated quite a bit of response. I recommended Karl Wiegers' book "Software Requirements" as a great introduction to the art and science of gathering – and, as one poster noted – analyzing requirements. But there is a five-second "elevator talk" version.

Some years ago, at a conference in Mexico, I attended a talk Steve Tockey gave about requirements. He very ably and succinctly summed up the rules for requirements, which I'll paraphrase here:

- A requirement is a statement about the system that is unambiguous. There's only one way it can be interpreted, and the idea is expressed clearly so that all of the stakeholders understand it.

- A requirement is binding. The customer is willing to pay for it, and will not accept the system without it.

- A requirement is testable. It's possible to show that the system either does or does not comply with the statement.

The last constraint shows how many so-called requirements are merely vague and useless statements that should be pruned from the requirements document. For instance, "the machine shall be fast" is not testable, and is therefore not a requirement; neither is any unmeasurable statement about user-friendliness or maintainability. A quantified version, such as a stated response-time limit, is testable (a sketch of such a check appears at the end of this post).

An interesting corollary is that reliability is, at the very least, a difficult concept to wrap into the requirements, since "bug-free" or any other proof of a negative is hard or impossible to measure. In high-reliability applications it's common to measure the software engineering process itself, and to buttress the odds of correctness by using safe languages, avoiding unqualified SOUP (software of unknown pedigree), or even using formal methods (the latter is not a common practice, but it is one that has gained some traction).

What do you think? Does your group do a good job eliciting and analyzing requirements?
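Tockey's testability rule is the one that maps most directly onto code. Below is a minimal sketch of how a quantified timing requirement could be turned into an automated pass/fail check; the requirement wording, the 500-microsecond limit, and the compute_moving_average() function are all hypothetical, invented here for illustration rather than taken from the talk or the article.

```c
/* Sketch: checking a quantified, testable requirement automatically.
 * Hypothetical requirement: "The system shall compute a sensor moving
 * average within 500 microseconds."
 */
#include <stdio.h>
#include <time.h>

#define REQ_MAX_LATENCY_US 500L   /* the quantified limit from the requirement */

/* Stand-in for the unit under test (illustrative only). */
static long compute_moving_average(void)
{
    volatile long acc = 0;
    for (int i = 0; i < 10000; i++)
        acc += i;
    return acc / 10000;
}

int main(void)
{
    struct timespec start, end;

    clock_gettime(CLOCK_MONOTONIC, &start);
    (void)compute_moving_average();
    clock_gettime(CLOCK_MONOTONIC, &end);

    long elapsed_us = (end.tv_sec - start.tv_sec) * 1000000L
                    + (end.tv_nsec - start.tv_nsec) / 1000L;

    /* Because the requirement is quantified, the check either passes or fails. */
    if (elapsed_us <= REQ_MAX_LATENCY_US) {
        printf("PASS: %ld us <= %ld us\n", elapsed_us, REQ_MAX_LATENCY_US);
        return 0;
    }
    printf("FAIL: %ld us > %ld us\n", elapsed_us, REQ_MAX_LATENCY_US);
    return 1;
}
```

"The machine shall be fast" offers nothing a check like this can latch onto; a number does.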