Tag: ada

Related blog posts
  • Popularity 15
    2015-7-23 22:14
    1696 reads
    0 comments
    Robert Dewar passed away last month, June 30, at age 70. Most readers of this site probably don't know the name. Dr. Dewar was president and one of the founders of AdaCore, a leading supplier of support and services for Ada developers. He had a long career and contributed to many projects. He and Edmond Schonberg wrote GNAT, the FOSS Ada compiler which is part of the GNU compiler collection.

    Unfortunately, I only knew Dr. Dewar peripherally but had marveled at his intellect. (AdaCore is swimming in really smart people.) According to his Wikipedia page his interests were quite varied, and he even starred in the title role in The Mikado (one of my favorite theater events).

    According to a recent AdaCore newsletter most of the company's business is in the embedded space. Yet, as I mentioned, probably few readers of this, the biggest embedded systems site, know of Dr. Dewar or have much knowledge of Ada.

    Most of us use C or C++, a language that now has ancient roots. Ada is also old, dating to the early '80s, and has been through a number of revisions. In my opinion, we, the embedded community, haven't given the language a fair chance. Only a few percent of us report using it in real products.

    C is great; it's a very compact way of expressing complex ideas. (One Ada user complained to me that C is for people who don't like to type!) If you grew up in the early embedded days where everything was done in assembly, that transition to C was like a breath of fresh air. Now we could do in one line what might take pages using mnemonics.

    C was designed for machines with limited memory and CPU cycles, since a 4 MHz clock was unheard of when it first appeared. But we're still paying for those design choices. Divide by zero all day long. Exceed array bounds with gusto. If malloc() fails, well, who cares? Who even checks its return value?

    Ada was designed to help us write correct code. It's demanding, accepting only valid, strict syntax. With C one can practically compile the telephone directory. Most programmers hate Ada after their first experience with it. Those who soldier on through a few projects usually love it. (A short sketch at the end of this post shows the sort of checking Ada insists on.)

    C developers have some really good reasons to eschew Ada. After all, we may be working on projects that already have massive amounts of C code. In many cases it's a poor business decision to change languages in the face of enormous piles of legacy code. And it's really hard to find people versed in Ada.

    There are bad reasons as well for discounting Ada. People complain that it's slow or generates bloated code. Yet that simply isn't true anymore. Estimates peg the Ada tax at under 10% compared with C. I often respond with a question: what's the difference in code size or execution speed between, say, IAR's C compiler and GCC? Or between Arm's and Green Hills'? It's probably a dumb question in that scores will vary depending on which feature you're using; trig may be better on one product while memory management is better on another. The difference is likely in the noise, which is pretty much what we can say about C vs Ada.

    Others bemoan the increased development time of Ada projects. There's some truth to that, primarily because so many Ada projects are safety-critical with an extensive certification process. Remove that and things change. Ada programs experience about a tenth the number of bugs of C systems (see Table 1 in Software Static Code Analysis Lessons Learned, Andy German, QinetiQ Ltd., Nov 2003, Crosstalk). Since the average embedded team spends half the schedule in debugging, that argument is spurious.

    Some dismiss the language because, as they say, it can't be used on microcontrollers. Except it can: it's available for a number of MCUs including the Cortex-M family and even for 8-bit AVRs.

    The argument that Ada developers are a scarce resource is both true and important. It would be foolish to learn a new language while using it on a real product. In the '90s and the first part of this century I ran into a number of projects that failed when teams adopted C++ with no OO experience. One needs a chance to make mistakes and build throw-away practice projects. And Ada is complex. The John Barnes Ada 2005 book is over 800 pages.

    My recommendation is, as always, learn new things and push your mental boundaries. Robert Dewar spent a big part of his career working on and advocating for Ada. You'll find it used where getting code right is critical. If you fly, thank his legacy for those routine trips that are never marred by an avionics glitch.
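    As promised above, here's a tiny sketch of my own (not something from the article or from Dr. Dewar) showing the sort of checking Ada performs where C just shrugs:

        procedure Checks_Demo is
           Readings : array (1 .. 8) of Integer := (others => 0);
           Index    : Integer := 9;
           Divisor  : Integer := 0;
        begin
           Readings (Index) := 42;      --  out-of-range index raises Constraint_Error
                                        --  instead of silently corrupting memory
           Index := Index / Divisor;    --  divide by zero also raises Constraint_Error
                                        --  rather than invoking undefined behaviour
        end Checks_Demo;

    Nothing exotic is going on here; the checks are simply on by default, which is the point.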
  • Popularity 17
    2015-7-14 19:24
    1462 reads
    0 comments
    I am quite into reading traditional science fiction and fantasy books, but I also have a soft spot for graphic novels. In fact, I recently posted a blog that introduces five graphic novel series I'm currently reading.

    One series I didn't mention in that column is Alex + Ada by Jonathan Luna and Sara Vaughn. I decided to keep that one separate because it's more related to embedded systems with embedded vision and embedded speech. Also, I think the future presented in this series is much more likely to transpire and, indeed, is much closer than many might think.

    Alex + Ada is set in the relatively close future. We start by meeting 27-year-old Alex, who is a pretty sad and dispirited character, if the truth be told. The first things we notice are the futuristic elements. Alex wakes up to be greeted by a holographic display showing the weather forecast and the news. In fact, it's from the news that we discover that an artificial intelligence (AI) became self-aware a year before. This AI uploaded itself into a hundred warehouse robots, which subsequently massacred any nearby humans. Not surprisingly, this has left people feeling pretty skittish, so even though humanoid robots have permeated society, their AI capabilities have been well and truly locked down.

    Alex also has a "chip" implanted inside his head that lets him control and interface with the technology around him. For example, he can flush the toilet, make the coffee, and even call someone and talk to them "telepathically." There are certain conventions in graphic novels, like using round "talk bubbles" with arrows to indicate people speaking locally, using jagged bubbles and/or jagged arrows to reflect people talking on the other end of the phone, and using talk bubbles with smaller bubbles instead of arrows to indicate thoughts. This is the first time I've seen a double-bubble (with smaller bubbles) used to indicate telepathic-like commands and communication.

    It's not long before Alex gets a call from his grandmother who, it has to be said, is a somewhat frisky character. Grandma lives with an android called Daniel, whom she describes as being "Kind and attentive and says all the right things, even if he needs a little direction." It doesn't take long for us to come to the conclusion that Daniel's duties go a tad beyond making a hot cup of coffee.

    Sensing that Alex is a little "down in the dumps," grandma decides to treat him to an X5, which is the latest and greatest in realistic robots. She even opts for the extra memory option (now, that's my sort of grandma).

    Alex decides to name his android Ada. Initially, Ada's AI is locked down, as is now the law for all androids. This means that she is totally subservient -- she will do anything Alex asks her to do, but any interaction between them is sort of "flat" because she will agree with pretty much anything he says.

    After a while, Alex starts to realize that he wants "more" from Ada, although he's not exactly sure what "more" entails. He ends up wandering around the future incarnation of the Internet, and he eventually tracks down a private forum where he discovers that Ada can be unlocked. He also discovers that if Ada is unlocked, she will become fully self-aware and self-deterministic to the point where she may decide to leave him and go off on her own. You probably won't be too surprised to hear that Alex opts to unlock Ada, one thing leads to another, and they eventually fall in love.

    On the one hand, Alex + Ada follows a fairly classic science fiction story line. Having said this, it's very well written and it really makes you think about what it is to be human and the dilemmas one would face having totally subservient android servants running around the place. Also, although Sara Vaughn's illustrations are spare and unassuming, they very well match the style and pace of the story.

    I'm reminded of the 2013 American romantic science fiction comedy-drama movie Her, which was written, directed, and produced by Spike Jonze, and which involves a lonely, introverted, depressed man called Theodore Twombly forming a relationship with a talking operating system (OS) with artificial intelligence, who names herself Samantha. More recently, we have the 2015 British science fiction thriller movie called Ex Machina, which was written and directed by Alex Garland, and which involves a programmer called Caleb forming a relationship with a humanoid robot named Ava.

    Is all of this far-fetched? Well, as far back as 1996, a California company called Realdoll began making realistic, life-sized dolls. Since that time, they've sold thousands of these dolls for upwards of $10,000 apiece. The thing is that many of the people who purchase these dolls end up having what appear to be real feelings for them, as you can see in this BBC documentary that chronicles both the industry and the people who buy the dolls.

    There are also numerous stories of younger people becoming emotionally attached to robot "pets." And there are documented cases of older people becoming attached to humanoid-looking bots with which they play therapeutic games, to the extent that the humans become depressed if they are prevented from interfacing with their bots.

    There really is a lot to think about here. As one reviewer said: "After reading Alex + Ada, I'm really hoping my grandmother just gets me a jumper." Personally, I would love to have an Ada to help clean the house and do the laundry and have a cold beer waiting when I get back from the office in the evening, but I'm fairly sure that my wife (Gina the Gorgeous) would have her own opinion on this. How about you? How long do you think it will be before humanoid-looking robots are commonly available and affordable?
  • Popularity 14
    2013-4-17 18:42
    1493 reads
    0 comments
    John McCormick, Frank Singhoff and Jerome Hugues' new book on Ada is specifically meant for firmware developers. It's called "Building Parallel, Embedded, and Real-Time Applications with Ada".

    This book does teach the Ada language, to a degree. But Ada newbies will find some of it baffling. The prose will clearly explain a code snippet but still leave the uninitiated puzzled. For instance, when this line is first used: with Ada.Text_IO; use Ada.Text_IO; ... the first portion is well-explained, but one is left wondering about the meaning of the apparent duplication of Ada.Text_IO. The authors recommend using one of a number of other books for an introduction to the language, and they give a number of specific suggestions.

    Also suggested is the Ada Reference Manual (ARM), which is truly complete. And enormous (947 pages). And not organised for the novice. I find the ARM more accessible than the C standard, but it's not a fun beach read. Actually, it's not much fun at all.

    But Chapter 2 of their book does contain a good high-level introduction to the language, and the section on types is something every non-Ada programmer should read. Many of us grew up on assembly language and C, both of which have weak-to-nonexistent typing. If you've vaguely heard about Ada's strong types you probably don't really understand just how compelling this feature is. One example is fixed point, a notion that's commonly used in DSP applications. On most processors fixed point's big advantage is that it's much faster than floating point. But in Ada fixed point has been greatly extended. Want to do financial math? Floats are out due to rounding problems. Use Ada's fixed point and just specify increments of 0.01 (i.e., one cent) between legal numeric representations. Ada will ensure numbers never wind up as a fraction of a cent. (A small sketch at the end of this post shows what this looks like in code.)

    About a quarter of the way into the book the subject matter moves from Ada in general to using Ada in embedded, real-time systems, which seems to get little coverage elsewhere even though the language is probably used more in the embedded world than anywhere else. Strong typing can make handling bits and memory-mapped I/O a hassle, but the book addresses this concern and the solutions ("solutions" is perhaps the wrong word as the language has resources to deal with these low-level issues) are frankly beautiful.

    I've always liked David Simon's "An Embedded Software Primer" for its great coverage of real-time issues. "Building Parallel, Embedded, and Real-Time Applications with Ada" is better at the same topics, though it is a more demanding read. The two chapters on communications and synchronisation are brilliant.

    Chapter 6 covers distributed systems, one of Ada's strengths. The couple of other Ada books I have don't mention distributed systems at all, yet computing has been taken over by networks of computers and many-core processors. For instance, Ada provides pragmas and other resources to control activities distributed among various processors either synchronously or asynchronously. The book does a good job of showing how CORBA can address some of the ambiguity and unspecified mechanisms in Ada's Distributed Systems Annex (an extension to the Ada standard). This is not simple stuff; the book is excellent at getting the ideas across but expect to be challenged while digging through the example code.

    Chapter 7 is "Real-time systems and scheduling concepts." It's a must-read for anyone building real-time systems in any language. The authors cover rate-monotonic and earliest-deadline-first scheduling better than any other resource I've read. They show how one can use a little math to figure worst-case response time and other factors. Ada is not mentioned. But later chapters show how to use Ada with multi-tasking.

    Also somewhat unusual among Ada tomes, the book does cover the Ravenscar profile. Ravenscar disables some Ada features to reduce some of the overhead, and to make static analysis of a program's real-time behaviour possible. As most people know, Ada has its own multi-tasking model. But some of this is specified by the standard as optional, and so the book addresses using POSIX as an alternative.

    The book is very well-written, though sometimes a bit academic. The use of language does tend towards precision at the occasional cost of ease-of-reading. Every chapter ends with a bulleted summary, which is excellent, and a set of exercises. I don't find that the latter adds anything as no solutions are given. In a classroom environment they would make sense.

    A lot of us get set in our ways. Even if you never plan to use Ada it makes sense to stretch the neurons and explore other languages, other approaches to solving problems. Heck, a friend in Brazil gave me a book about the Lua language which was quite interesting, though I've never written a single line of Lua and probably never will.

    "Building Parallel, Embedded, and Real-Time Applications with Ada" is one of those volumes that makes you think, especially about the hard problems (like real-time, multi-tasking and multi-core) facing the firmware world today. I recommend it. Have you explored alternatives to C/C++? What do you think of them?
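    As a footnote to the fixed-point and strong-typing discussion above, here is a minimal sketch of my own (it isn't an example from the book): a decimal fixed-point type that steps in exact one-cent increments, plus derived types the compiler refuses to mix accidentally.

        with Ada.Text_IO; use Ada.Text_IO;

        procedure Money_Demo is
           --  Decimal fixed point: legal values advance in exact 0.01 steps,
           --  so nothing ever ends up as a fraction of a cent.
           type Dollars is delta 0.01 digits 14;

           --  Two distinct types with the same representation; mixing them
           --  without an explicit conversion is a compile-time error.
           type Account_Balance is new Dollars;
           type Tax_Owed        is new Dollars;

           Balance : Account_Balance := 1234.56;
           Tax     : Tax_Owed        := 98.76;
        begin
           --  Balance := Balance - Tax;                  --  rejected: different types
           Balance := Balance - Account_Balance (Tax);    --  conversion must be explicit
           Put_Line (Account_Balance'Image (Balance));
        end Money_Demo;

    The delta 0.01 clause is where the one-cent granularity lives; additions and subtractions stay exact, with none of the binary floating-point rounding games you would otherwise have to manage by hand.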
  • Popularity 18
    2013-2-26 20:41
    2077 reads
    2 comments
    In a recent column, Jack Ganssle enthusiastically greets the arrival of Ada 2012 because it incorporates features and capabilities that he has been working hard to convince C and C++ embedded systems programmers to adopt explicitly, and after the fact, into their programming practices.

    Even with the new capabilities incorporated into Ada 2012, there remain sceptics in the developer community who do not think Ada will be a factor in the mainstream of embedded programming. Part of that is due to the fact that Ada was developed in the early 1980s as a programming language for safety-critical military and aerospace applications.

    But the nature of the embedded systems that software developers work on is changing as the devices they develop insert themselves into our daily lives in a way that the desktop computer never did. They are in the power grid that serves our homes, in the meters that monitor our power usage, in virtually every electric appliance and entertainment device, in our light bulbs, in our phones, in our automobiles and in new intelligent medical monitoring devices – and many of them are connected to each other, to each of us and to the broader internet.

    This new environment is certainly one in which something closer to the requirements for safety-critical design is needed. If not safety-critical, then dependable, absolutely dependable. In some environments, such as in the automobile and in the increasing number of medical monitoring devices we use, this need is an obvious no-brainer. In others, the failure of the embedded design may be only an irritating inconvenience for the user. But if such failures happen often enough, those failures may be critical to the continued existence of the company that built them.

    Also, many of the standards organisations that serve the needs of various industry segments – industrial, consumer, medical, transportation – are adopting software reliability and dependability requirements every bit as demanding as those in safety-critical military/aerospace applications.

    But if you're still convinced that you do not need anything other than C or C++ in your embedded designs, consider the argument of Greg Gicca of AdaCore in "Students need to learn multiple programming languages" – that learning about and using another programming language such as Ada might help you think better and write better C/C++ code. If it is true for computer science or electrical engineering majors in college, it is true for professional programmers as well. "Understanding just a single language promotes solutions that only approach a problem from a single perspective," he writes. "Knowing multiple languages allows the problem to be looked at from a variety of perspectives so that multiple solutions can be compared and the most natural solution for the problem can be selected."

    One good example of this ability to think about C-code development in a different way is "Seventeen steps to safer C code," by Thomas Honold, who was faced with the challenge of writing a C-based project with safety and reliability demands that C alone could not satisfy. His experience with Ada in a number of earlier military/aerospace projects made it possible for him to come up with a set of techniques and procedures for use with C to satisfy the requirements of the project.

    The issue of the best programming language is a complex one. I would like to hear from you, in response to this article or in blogs and articles for publication on the site and in the newsletters, about your experiences and thoughts on programming in general, on Ada, on C, or any other language with which you've had experience.
  • Popularity 11
    2013-1-22 15:41
    1501 reads
    0 comments
    In September last year, I wrote about the new Ada 2012 standard (A look at ADA 2012). That article elicited quite a few responses, and some misinformation. So, I digested some of the discussion into questions, which I sent to the folks at AdaCore. They were kind enough to reply. Here's that discussion.

    Are Ada's run-time libraries generally written in Ada? If not, why is that?

    The only compiler I can speak about is GNAT. Yes, for GNAT, the run-time libraries are indeed written in Ada. In the case of GNAT, most of the compiler is also written in Ada. (The rest being the standard GCC technology, written in C.)

    In some circles Ada has long held a reputation of being bloated, in terms of size and performance of the generated code. I know there are some profiles, like Ravenscar, that can scale back the size quite a bit. How does the code generated by modern compilers compare to that from, say, C or C++ compilers?

    With today's technology, there's no reason why code generated from Ada would be generally slower—or faster—than that of a program doing the same thing written in C or C++. A couple of interesting points are relevant here:

    - The fundamental computation model of Ada is imperative, and most concepts of imperative programming are available with semantics that are very close to those of these other languages. There is no reason why those in Ada would be slower.
    - In the case of GNAT, the code generator is the same whether you compile for C, C++ or Ada, and that's where the bulk of the optimisations are performed. As a result, for the same algorithms, the performance will be the same.
    - One common mistake in comparing the performance of Ada with that of C is to compare Ada with checks enabled and C without checks. Indeed, if you assign a value to a numeric type in Ada, the compiler will generate a check verifying that the value is in range, which takes more time than just doing the assignment. However, we then have two programs that do different things, so the difference in performance is reasonable. It's perfectly possible to produce the same performance in Ada by altering the semantics, in this case removing those checks.
    - There are a number of cases where code generated by Ada is actually faster than C or C++ code because the language provides semantics that can be better optimised by the compiler than code that's written manually. Subtype checking is a good example of that.
    - For run-time size, comparing Ada with C++, which offers the same level of capabilities (object orientation, exceptions, multi-threading, etc.), is the most relevant. It shows a comparable run-time footprint. Indeed, for smaller systems, smaller Ada run-times exist: with GNAT, you can go all the way down to no run-time at all.

    Little support

    Readers complain that there's little support for Ada on 8- and 16-bit processors. Is that the case?

    Ada tends to be used on relatively large systems that require lots of processing, so our customers are not pushing very hard in the direction of small processors. That being said, if you look at the GCC technology, all GCC targets could potentially have support for Ada. In particular, we have 8-bit AVR ports. So it's not really a technical limitation at this stage, more where the market pressure is coming from.

    One very interesting argument I've seen before is that Ada by itself adds little to the development of highly-reliable code. Some people suggest that Ada users are just better or more careful developers, so would excel working in any language. It's an interesting thought, since it implies that some (or too many) non-Ada people are less competent or less careful. Care to comment?

    I would go even beyond that and say that the language by itself has little impact on the reliability of the code being developed. What really matters is the development environment: the language, its toolchain (compiler, debugger), its editing environment (IDE), its analysis tools (static and dynamic) and its testing capabilities. And, of course, the competence of the team. What the languages provide are foundations to put this environment together, in particular with regard to static analysis. Many errors can be detected at compile time, and an entirely new class of errors can be detected by the static analysis tools, much more than in C. This is really a key to developing reliable software.

    Now to the developer competence discussion. Thinking that a bad developer can do a good job because he has a good language is a myth. No matter what the technology, you need good, competent people. Ada allows these developers to be more productive because they can develop software at the right level of abstraction and use the semantics of the language to obtain guarantees and proofs that they would otherwise only gather through painful analysis.

    Another interesting point relates to big systems developed and maintained over decades. With millions of lines of code, no matter how good the team is, nothing can prevent you from making mistakes (not even Ada). What you need are all possible means to mitigate as many mistakes as possible and concentrate on the few that can't be prevented. That's what Ada is offering.

    Finally, C provides surprisingly little support for the separation between specifications and implementation, or for helping with global program architecture. For example, header files are merely a convention with no dedicated language features behind them, there are different ways of handling global variables, and C doesn't provide namespaces, while C++ allows distributing them across several files. Ada provides a nice way of identifying programming modules, separating the specifications from the implementation, and providing the specification with high-level semantics such as strong typing, parameter modes (e.g., read-only and read-write), data ranges, etc. In a way, this makes Ada usable as a programming tool as well as a design tool.

    Sacred cows

    One common complaint whenever anyone takes a swipe at a sacred cow like C is "good craftsmen shouldn't blame their tools." Should they?

    There's a reason why a craftsman uses a hammer over a stone, or why developers chose C over assembly language. However, blaming the success or failure of a project on its language is always a mistake: it's a combination of the development environment and people's competence. What matters most is to select a tool that's fit for the task at the beginning and make the best use of it as possible. Using C doesn't make the task of writing reliable code impossible, nor will it inevitably produce a failure, but there are many reasons why choosing Ada development tools is likely to reduce costs and produce a better outcome.

    What trends do you see in the adoption of Ada? One recent article on embedded.com titled "Unexpected trends" pegs Ada use at around 4% of the embedded space. Is that changing?

    First of all, we need to agree on what "embedded space" means. Are we talking about mobile devices? Embedded software on an aircraft? Temperature control programs running on my heating system?

    With regards to embedded development, Ada makes the most sense when some reliability aspects need to be taken into account. That's the only market that's really worth looking at. For example, although I'm strongly committed to the language, I wouldn't use it for a commercial phone application. So to answer your question, taking into account the general embedded market, I would expect numbers to look a bit smaller for Ada, and a lot smaller for C or C++, when compared to languages such as Java and Objective-C. But that reflects a market change within the embedded space more than anything else—embedded development on mobile platforms such as iOS or Android has exploded over the past few years. This doesn't mean Ada usage has fundamentally changed.

    Indeed, if you concentrate on the reliable software niche, what we're seeing from our (admittedly biased) vendor point of view is an increase in interest and usage of the language, sometimes in areas where we wouldn't have thought it would have been used previously. It also appears there's increasing attention being given to concerns such as certification or formal proof, where Ada provides notable advantages over C. This is probably one of the reasons that explains this increase in interest.

    One fascinating comment was "Add Coverity, add KLEE, add Test Driven Development, add valgrind, add all the other checking tools and it becomes harder to make the errors that people blame C for." Do you think these extra tools Ada-ize C?

    That's the counterpart of "good craftsmen shouldn't blame their tools", isn't it ;-)? These tools do add useful verification to C, and they do improve C code reliability. But they don't really Ada-ize C—as a matter of fact, you will need a similar set of tools for Ada as well. That's the development environment I was talking about. The real question is, with these tools, what level of reliability are you achieving? If buying a complex C static analysis tool merely brings you to the level of a standard Ada compiler, why bother? And yet, if you have the funds to buy that static analysis tool, why not buy an Ada one, and reach an order of magnitude higher standards?

    Ada not available for low-end ARM

    Another reader commented that Ada isn't available for lower-end ARM CPUs like the Cortex-M series. Surely these parts are going to be huge in the embedded space. Is that comment correct, and do you see this changing?

    Our core market was slower than the general embedded market in this migration to ARM, but we do see it. As a matter of fact, we will be releasing our first ARM port in the coming months.

    SPARK, of course, offers even lower bug rates than regular Ada. But it does require much more of a mathematician's mindset in crafting the annotations. What is your take on SPARK? Is there any reason to prefer regular Ada over SPARK?

    Most of today's formal methods do require a strong mathematician mindset. This is true for SPARK as well as many similar languages and tools. On the other hand, the level of reliability you can achieve goes way beyond what any other methodology could bring. It's beyond lowering the bug rate, but actually formally verifying the correctness of the specification. But the incentive to do this, as of now, indeed has to be very strong.

    However, there's no reason to see formal methods as the opposite of standard programming languages, or SPARK as the opposite of Ada. Indeed, it's perfectly reasonable to consider a continuum stretching from code being verified by test to code being verified by formal methods, allowing an application to combine both. A new generation of SPARK that Altran and AdaCore are currently working on allows this. As a result, an engineer with no particular mathematician background will be able to iteratively introduce formal methods wherever it's simple enough and makes sense, using traditional methods for the rest of the code. There's much more to say about the above, but the first version of this effort is part of a project called Hi-Lite, available from www.open-do.org/projects/hi-lite/ .

    Ada was originally a DoD mandate in an effort to get better code into their products. Yet the F-35 is written in C++. What happened there?

    I don't have any input on this story, so I can't comment on this particular one. However, I've seen many occurrences of migration from and to Ada and can make some guesses. There's often political reasoning when moving from Ada to C++. In particular in the defence domain, lots of today's managers were developers in the '80s, back when Ada tools were extremely poor, and they kind of stuck to this understanding. Therefore, they're trying to use anything other than Ada. There's also a belief, again at the management level, that the languages written on a developer's resume must match the languages in which the project is developed. Of course, any decent technical person would tell you that the language is the least of their problems when moving to a new project, but that's still something that a number of managers are hung up on. On the other hand, we've seen opposite moves, from Java, C or C++ to Ada. What I have personally witnessed is that while the choice to move from Ada to C or C++ is often initiated at the management level, the move to Ada is often initiated by the technical team.

    I always look at these issues not from a technology basis, but from a business perspective. It seems to me the selection of a language comes down to some tech issues (e.g., size and speed of the code) but more to business concerns: productivity and quality. How does Ada compare to C and C++?

    There's no such thing as a language that allows better productivity than another one, generally speaking. It all depends on the context. I wouldn't use Ada to develop a mobile application, a video game or a web server. My productivity would be lowered due to the lack of standard environments and tools for these purposes, and although I would probably achieve a higher-reliability result, this reliability would not likely be one of my initial requirements. However, if I'm developing an application that must not fail, then using Ada will most definitely allow higher productivity.

    Take something as simple as type range checking. With C or C++, I would need to manually write verification that the numeric values being manipulated are within range. With Ada, the developer does it once, at the type definition level, is sure not to forget a spot, and can change this in only one place. And on top of that, tools are automatically aware of the ranges. There are many examples like that. (A small sketch at the end of this post shows what "once, at the type definition level" looks like.)

    Metrics, anyone?

    One of my pet peeves is that the vast majority of embedded people keep no metrics at all, so talking about productivity or bug rates is usually just that: talk. Of course, the vast majority of embedded people code in C or C++. Are the Ada folks any more methodical in keeping metrics?

    There's really little difference that I can see between an Ada developer and a C developer if they're in the same market. I've seen everything from people really methodical in keeping these metrics to people that barely knew they existed. However, there's a general trend in the embedded market towards more and more static metrics, no matter what language. Here again, however, Ada can bring advantages, as the static analysis tools tend to be more precise (see previous discussions on this).

    A lot of the numbers one sees for Ada come out of the safety-critical world, like those who have to conform to higher levels of DO-178C. But since so much C code is not safety-critical, and so does not incur the expense of MC/DC tests and the like, comparing that to 178C seems like comparing apples and oranges. How does Ada measure up when used in non-safety-critical applications?

    Let me first define non-safety as non-safety-but-still-reliable applications. In this context, there's no doubt Ada is way more efficient than C. Just looking at coding standards for C makes things obvious enough—a fair amount of the rules in those don't make sense in Ada because those rules are already included in the language. Where there's no reliability-oriented concern whatsoever, i.e., little in the way of coding standards and testing and no concerns about maintenance or scalability, then it becomes harder to tell. At this stage, whatever you're used to would fit, I guess. I personally would have a preference towards Visual Basic.

    If you were trying to sell the use of Ada to dyed-in-the-wool C programmers, what would be your principal arguments for adopting it?

    I don't like debugging. I'd rather put more effort into specifying constraints in my application. This allows me to use static verification or early dynamic checks rather than enduring random and painful debugging sessions after the integration phases. But that's just me ;-)

    Conversely, where do you see C as trumping Ada?

    You really want me to end this on a negative note on Ada, don't you ;-)? Let me make a more general statement. For any development starting today, in 2013, the only reason why I would see C used is when the environment doesn't provide any alternative. Maybe it's a small custom chip for which the manufacturer doesn't provide anything other than C. Maybe it's a large legacy C application that can't be recompiled with more modern tools. Or maybe it's just a small function in a (non-Ada) system that needs dedicated performance control, so you'd use C a bit like people used to use assembly code while trying to boost very localized performance. But for anything else, today, as of 2013, I can't see why you would select the C language. There are so many better alternatives out there: Java, Python, C++, Visual Basic, Ada, etc. All perform pretty well for their target applications. And I'm not even talking about domain-specific languages, which have a yet better fit when applicable. Why on earth would you start using a language coming from the Cobol and Fortran era, with a list of vulnerabilities larger than anything that has been done since then, not implementing any of the modern software architecting paradigms? Nostalgia?

    Thanks much to AdaCore for their responses to my—and your—questions.
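    To illustrate the "once, at the type definition level" point from the interview, here is a small sketch of my own (the package and names are invented for illustration, not taken from AdaCore). The range constraint is written exactly once; every assignment is checked against it, and the package spec carries the parameter modes and types that C header files can only hint at:

        package Motor_Control is
           --  The legal range is stated once, here. Every assignment,
           --  parameter pass and conversion involving Duty_Cycle is checked
           --  against it, and static analysis tools see the same constraint.
           type Duty_Cycle is range 0 .. 100;

           --  Parameter modes make the data flow part of the specification.
           procedure Set_Duty (Value : in Duty_Cycle);
           function  Current_Duty return Duty_Cycle;
        end Motor_Control;

        package body Motor_Control is
           Duty : Duty_Cycle := 0;

           procedure Set_Duty (Value : in Duty_Cycle) is
           begin
              Duty := Value;   --  no hand-written range check needed here
           end Set_Duty;

           function Current_Duty return Duty_Cycle is
           begin
              return Duty;
           end Current_Duty;
        end Motor_Control;

    A call like Set_Duty (150) is caught, at compile time when the compiler can see the value is static, or with a Constraint_Error at run time otherwise; in C the equivalent guard has to be hand-written, and remembered, at every call site or inside the function.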
Related resources