Q&A on Ada 2012

2013-01-22

In September last year, I wrote about the new Ada 2012 standard (A look at Ada 2012). That article elicited quite a few responses, along with some misinformation. So I digested some of the discussion into questions, which I sent to the folks at AdaCore. They were kind enough to reply. Here's that discussion.


Are Ada's run-time libraries generally written in Ada? If not, why is that?
The only compiler I can speak about is GNAT. Yes, for GNAT, the run-time libraries are indeed written in Ada. In the case of GNAT, most of the compiler is also written in Ada. (The rest being the standard GCC technology, written in C.)


In some circles Ada has long held a reputation of being bloated, in terms of the size and performance of the generated code. I know there are some profiles, like Ravenscar, that can scale back the size quite a bit. How does the code generated by modern compilers compare to that from, say, C or C++ compilers?
With today's technology, there's no reason why code generated from Ada would be generally slower—or faster—than that of a program doing the same thing written in C or C++. A couple of interesting points are relevant here:


- The fundamental computation model of Ada is imperative, and most concepts of imperative programming are available with semantics that are very close to those of these other languages. There's no reason why they would be slower in Ada.
- In the case of GNAT, the code generator is the same whether you compile C, C++, or Ada, and that's where the bulk of the optimisations are performed. As a result, for the same algorithms, the performance will be the same.
- One common mistake in comparing the performance of Ada with that of C is to compare Ada with checks enabled and C without checks. Indeed, if you assign a value to a numeric type in Ada, the compiler will generate a check verifying that the value is in range, which takes more time than just doing the assignment. However, we then have two programs that do different things, so a difference in performance is to be expected. It's perfectly possible to produce the same performance in Ada by altering the semantics, in this case by removing those checks (see the sketch after this list).
- There are a number of cases where code generated by Ada is actually faster than C or C++ code because the language provides semantics that can be better optimised by the compiler than code that's written manually. Subtype checking is a good example of that.
- For run-time size, comparing Ada with C++, which offers the same level of capabilities (object orientation, exceptions, multi-threading, etc.), is the most relevant comparison. It shows a comparable run-time footprint. Indeed, for smaller systems, smaller Ada run-times exist: with GNAT, you can go all the way down to no run-time at all.
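
As a concrete illustration of the point about checks, here is a minimal sketch, assuming GNAT (the procedure and type names are made up): the same Ada source can be built with the language-defined range check in place or, by uncommenting the pragma (or compiling with GNAT's -gnatp switch), with the checks removed to match the unchecked C semantics in a benchmark.

    procedure Fill_Buffer is
       type Sample is range -2_048 .. 2_047;                --  12-bit ADC value
       type Sample_Buffer is array (1 .. 1_000) of Sample;

       --  pragma Suppress (Range_Check);   --  uncomment (or build with -gnatp)
       --                                   --  to drop the checks and match the
       --                                   --  unchecked C version of the loop

       Buffer : Sample_Buffer;
       Raw    : Integer;
    begin
       for I in Buffer'Range loop
          Raw := (I * 7) mod 4_096 - 2_048;   --  stand-in for a raw sensor read
          Buffer (I) := Sample (Raw);         --  the compiler emits a range
       end loop;                              --  check on this conversion
    end Fill_Buffer;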


Little support
Readers complain that there's little support for Ada on 8- and 16-bit processors. Is that the case?
Ada tends to be used on relatively large systems that require lots of processing, so our customers are not pushing very hard in the direction of small processors. That being said, if you look at the GCC technology, all GCC targets could potentially have support for Ada. In particular, we have an 8-bit AVR port. So it's not really a technical limitation at this stage; it's more a question of where the market pressure is coming from.


One very interesting argument I've seen before is that Ada by itself adds little to the development of highly reliable code. Some people suggest that Ada users are just better or more careful developers, so they would excel working in any language. It's an interesting thought, since it implies that some (or too many) non-Ada people are less competent or less careful. Care to comment?
I would go even beyond that and say that the language by itself has little impact on the reliability of the code being developed. What really matters is the development environment: the language, its toolchain (compiler, debugger), its editing environment (IDE), its analysis tools (static and dynamic), and its testing capabilities. And, of course, the competence of the team.


What a language provides is a foundation for putting this environment together, in particular with regard to static analysis. Many errors can be detected at compile time, and an entirely new class of errors can be detected by the static analysis tools, much more so than with C. This is really key to developing reliable software.


Now to the developer competence discussion. Thinking that a bad developer can do a good job because he has a good language is a myth. No matter what the technology, you need good, competent people. Ada allows these developers to be more productive because they can develop software at the right level of abstraction and use the semantics of the language to obtain guarantees and proofs that they would otherwise only gather through painful analysis.


Another interesting point relates to big systems developed and maintained over decades. With millions of lines of code, no matter how good the team is, nothing can prevent you from making mistakes (not even Ada). What you need are all possible means to mitigate as many mistakes as possible and to concentrate on the few that can't be prevented. That's what Ada offers.


Finally, C provides surprisingly little support for separating specifications from implementation, or for helping with global program architecture. For example, header files are merely a convention with no dedicated language features behind them, there are different ways of handling global variables, and while C doesn't provide namespaces at all, C++ allows them to be distributed across several files. Ada provides a nice way of identifying programming modules, separating the specification from the implementation, and giving the specification high-level semantics such as strong typing, parameter modes (e.g., read-only and read-write), data ranges, etc. In a way, this makes Ada usable as a programming tool as well as a design tool.
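
As a small illustration of that last point, here is a hypothetical package specification (the names are made up): the spec alone carries the module boundary, the parameter modes, and the legal value range, while the implementation lives in a separate package body (thermostat.adb under GNAT's default file naming).

    package Thermostat is

       type Celsius is range -40 .. 125;                   --  legal sensor range

       procedure Set_Target  (Target  : in     Celsius);   --  read-only parameter
       procedure Read_Sensor (Reading :    out Celsius);   --  write-only parameter
       procedure Regulate    (Command : in out Celsius);   --  read-write parameter

    end Thermostat;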


Sacred cows
One common complaint whenever anyone takes a swipe at a sacred cow like C is "good craftsmen shouldn't blame their tools." Should they?
There's a reason why a craftsman uses a hammer over a stone, or why developers chose C over assembly language. However, blaming the success or failure of a project on its language is always a mistake: it's a combination of the development environment and people's competence. What matters most is to select a tool that's fit for the task at the outset and to make the best possible use of it. Using C doesn't make the task of writing reliable code impossible, nor will it inevitably produce a failure, but there are many reasons why choosing Ada development tools is likely to reduce costs and produce a better outcome.


What trends do you see in the adoption of Ada? One recent article on embedded.com titled "Unexpected trends" pegs Ada use at around 4% of the embedded space. Is that changing?
First of all, we need to agree on what "embedded space" means. Are we talking about mobile devices? Embedded software on an aircraft? Temperature control programs running on my heating system?


With regards to embedded development, Ada makes the most sense when some reliability aspects need to be taken into account. That's the only market that's really worth looking at. For example, although I'm strongly committed to the language, I wouldn't use it for a commercial phone application.


So to answer your question, taking into account the general embedded market, I would expect the numbers to look a bit smaller for Ada, and a lot smaller for C or C++, when compared to languages such as Java and Objective-C. But that reflects a market change within the embedded space more than anything else: embedded development on mobile platforms such as iOS or Android has exploded over the past few years. This doesn't mean Ada usage has fundamentally changed.


Indeed, if you concentrate on the reliable-software niche, what we're seeing from our (admittedly biased) vendor point of view is an increase in interest in and usage of the language, sometimes in areas where we wouldn't have thought it would have been used previously. It also appears there's increasing attention being given to concerns such as certification and formal proof, where Ada provides notable advantages over C. This is probably one of the reasons that explains the increase in interest.


One fascinating comment was "Add Coverity, add KLEE, add Test Driven Development, add valgrind, add all the other checking tools and it becomes harder to make the errors that people blame C for." Do you think these extra tools Ada-ize C?
That's the counterpart of "good craftsmen shouldn't blame their tools", isn't it ;-)?


These tools do add useful verification to C, and they do improve C code reliability. But they don't really Ada-ize C; as a matter of fact, you will need a similar set of tools for Ada as well. That's the development environment I was talking about.


The real question is, with these tools, what level of reliability are you achieving? If buying a complex C static analysis tool merely brings you to the level of a standard Ada compiler, why bother? And if you have the funds to buy that static analysis tool, why not buy an Ada one and reach standards an order of magnitude higher?


Ada not available for low-end ARM
Another reader commented that Ada isn't available for lower-end ARM CPUs like the Cortex-M series. Surely these parts are going to be huge in the embedded space. Is that comment correct, and do you see this changing?
Our core market was slower than the general embedded market in this migration to ARM, but we do see it. As a matter of fact, we will be releasing our first ARM port in the coming months.


SPARK, of course, offers even lower bug rates than regular Ada. But it does require much more of a mathematician's mindset in crafting the annotations. What is your take on SPARK? Is there any reason to prefer regular Ada over SPARK?
Most of today's formal methods do require a strong mathematician's mindset. This is true for SPARK as well as for many similar languages and tools. On the other hand, the level of reliability you can achieve goes way beyond what any other methodology could bring. It goes beyond lowering the bug rate: you are actually formally verifying correctness against the specification. But the incentive to do this, as of now, does indeed have to be very strong.


However, there's no reason to see formal methods as the opposite of standard programming languages, or SPARK as the opposite of Ada. Indeed, it's perfectly reasonable to consider a continuum stretching from code verified by test to code verified by formal methods, allowing an application to combine both. A new generation of SPARK that Altran and AdaCore are currently working on allows this. As a result, an engineer with no particular mathematical background will be able to iteratively introduce formal methods wherever it's simple enough and makes sense, using traditional methods for the rest of the code.
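
As a rough, hypothetical sketch of that continuum (the package and function are made up): Ada 2012 lets a contract be stated as Pre and Post aspects on the declaration itself, and the same aspects can either be exercised as run-time assertions during test (for instance with GNAT's -gnata switch) or handed to proof tools on the portions of the code where formal verification is worth the effort.

    package Safe_Math is

       --  The contract is part of the specification: callers must respect the
       --  precondition, and the implementation must satisfy the postcondition.
       function Clamp (Value, Low, High : Integer) return Integer
         with Pre  => Low <= High,
              Post => Clamp'Result in Low .. High;

    end Safe_Math;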


There's much more to say about the above, but the first version of this effort is part of a project called Hi-Lite, available from www.open-do.org/projects/hi-lite/.


Ada was originally a DoD mandate in an effort to get better code into their products. Yet the F-35 is written in C++. What happened there?
I don't have any input on this story, so I can't comment on this particular one. However, I've seen many occurrences of migration from and to Ada and can make some guesses. There's often political reasoning behind moving from Ada to C++. In particular, in the defence domain, lots of today's managers were developers in the '80s, back when Ada tools were extremely poor, and they have kind of stuck with that impression. Therefore, they try to use anything other than Ada. There's also a belief, again at the management level, that the languages written on a developer's resume must match the languages in which the project is developed. Of course, any decent technical person would tell you that the language is the least of their problems when moving to a new project, but that's still something that a number of managers are hung up on.


On the other hand, we've seen opposite moves, from Java, C, or C++ to Ada. What I have personally witnessed is that while the choice to move from Ada to C or C++ is often initiated at the management level, the move to Ada is often initiated by the technical team.


I always look at these issues not from a technology basis, but from a business perspective. It seems to me the selection of a language comes down to some tech issues (e.g., size and speed of the code) but more to business concerns: productivity and quality. How does Ada compare to C and C++?
There's no such thing as a language that allows better productivity than another one, generally speaking. It all depends on the context. I wouldn't use Ada to develop a mobile application, a video game, or a web server. My productivity would be lowered due to the lack of standard environments and tools for these purposes, and although I would probably achieve a higher-reliability result, this reliability would not likely be one of my initial requirements.


However, if I'm developing an application that must not fail, then using Ada will most definitely allow higher productivity. Take something as simple as type range checking. With C or C++, I would need to manually write verification that the numeric values being manipulated are within range. With Ada, the developer does it once, at the type definition level, is sure not to forget a spot, and can change this in only one place. And on top of that, the tools are automatically aware of the ranges. There are many examples like that.
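
A minimal sketch of that idea (the names and values are illustrative): the legal range is written once, on the type declaration, and every assignment to an object of that type is checked against it, instead of scattering manual if-tests through the code as one would in C.

    procedure Ramp_Demo is
       type Duty_Cycle is range 0 .. 100;    --  the only place the range lives

       Duty : Duty_Cycle := 0;
    begin
       for Step in 1 .. 20 loop
          Duty := Duty + 10;                 --  checked on every assignment:
                                             --  the 11th increment raises
                                             --  Constraint_Error
       end loop;
    exception
       when Constraint_Error =>
          null;                              --  out-of-range update detected
    end Ramp_Demo;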


Metrics, anyone?
One of my pet peeves is that the vast majority of embedded people keep no metrics at all, so talking about productivity or bug rates is usually just that: talk. Of course, the vast majority of embedded people code in C or C++. Are the Ada folks any more methodical in keeping metrics?
There's really little difference that I can see between an Ada developer and a C developer if they're in the same market. I've seen everything from people really methodical in keeping these metrics to people who barely knew they existed. However, there's a general trend in the embedded market towards more and more static metrics, no matter what the language. Here again, however, Ada can bring advantages, as the static analysis tools tend to be more precise (see the previous discussions on this).


A lot of the numbers one sees for Ada come out of the safety-critical world, such as from teams that have to conform to the higher levels of DO-178C. But since so much C code is not safety-critical, and so does not incur the expense of MC/DC tests and the like, comparing that to DO-178C work seems like comparing apples and oranges. How does Ada measure up when used in non-safety-critical applications?
Let me first define non-safety-critical here as meaning applications that are not safety-critical but still need to be reliable. In this context, there's no doubt Ada is far more efficient than C. Just looking at coding standards for C makes things obvious enough: a fair number of their rules make no sense in Ada because they are already built into the language.


Where there's no reliability-oriented concern whatsoever (i.e., little in the way of coding standards and testing, and no concerns about maintenance or scalability), it becomes harder to tell. At this stage, whatever you're used to would fit, I guess. I personally would have a preference for Visual Basic.


If you were trying to sell the use of Ada to dyed-in-the-wool C programmers, what would be your principal arguments for adopting it?
I don't like debugging. I'd rather put more effort into specifying constraints in my application. This allows me to use static verification or early dynamic checks rather than enduring random and painful debugging sessions after the integration phase. But that's just me ;-)


Conversely, where do you see C as trumping Ada?
You really want me to end this on a negative note on Ada, don't you ;-)? Let me make a more general statement. For any development starting today, in 2013, the only reason I would see for using C is when the environment doesn't provide any alternative. Maybe it's a small custom chip for which the manufacturer doesn't provide anything other than C. Maybe it's a large legacy C application that can't be recompiled with more modern tools. Or maybe it's just a small function in a (non-Ada) system that needs dedicated performance control, so you'd use C a bit like people used to use assembly code to boost very localized performance.


But for anything else, today, as of 2013, I can't see why you would select the C language. There are so many better alternatives out there: Java, Python, C++, Visual Basic, Ada, etc. All perform pretty well for their target applications. And I'm not even talking about domain-specific languages, which are an even better fit when applicable. Why on earth would you start using a language from the Cobol and Fortran era, with a list of vulnerabilities larger than that of anything created since, and implementing none of the modern software architecture paradigms? Nostalgia?


Thanks much to AdaCore for their responses to my—and your—questions.
