validity of ... reasons for preferring C over C++

What is wrong with BLISS?

It was a nice intermediate level language, in which much of VAX/VMS was written.

Reply to
upsidedown

Ada toolchains remain a problem, but things are changing. AdaCore now provides a free (both free-beer and free-speech) ARM Cortex version of the GNU Ada compiler, GNAT. There is also a version for the AVR 8-bitters. The TI MSP430 is also being worked on. No Ada compiler yet (as far as I know) for the 8051 series.

I disagree. The set of problem-modelling tools in Ada is much stronger and more expressive than in C. It is much easier to think and design in problem-domain terms. A real module / package system, integer types defined by what you need and not by the implementation, private and abstract types, classes and inheritance, ...
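
For example, roughly (a made-up fragment, not from any real system), a package can export an integer type constrained to the problem's range and a private type, so client code sees only the operations the spec gives it:

   package Thermometers is                 --  illustrative only

      --  An integer type defined by the range the problem needs,
      --  not by whatever the machine happens to offer.
      type Celsius is range -55 .. 125;

      --  A private type: clients see the name and the operations,
      --  but not the representation.
      type Sensor is private;

      procedure Read (From : in out Sensor; Value : out Celsius);
      --  (implementation goes in the package body, omitted here)

   private

      type Sensor is record
         Channel : Natural := 0;
      end record;

   end Thermometers;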

Ada is of course more comparable to C++ than to C, but Ada has none of the inherited "C baggage" that distorts and weakens C++.

--
Niklas Holsti 
Tidorum Ltd 
niklas holsti tidorum fi 
      .      @       .
Reply to
Niklas Holsti

Care to list a few of those? Just to have a good debate? Been a while...

Circular argument. If more people used it, there would be more users, more choice of tools, etc. On USENET, comp.lang.ada is a very active group which gives lots of help to new users. And recently there seems to be an influx of new users in that group.

--
Niklas Holsti 
Tidorum Ltd 
niklas holsti tidorum fi 
      .      @       .
Reply to
Niklas Holsti

For any given developer, there's nothing circular about the argument.

Reply to
Arlet Ottens

The last time I even thought about using Ada, I got scared off by the size of the runtime you needed to go with it (tens of KB). Anyone know if that's still the case?

--
Rob Gaddi, Highland Technology -- www.highlandtechnology.com 
Email address domain is currently out of order.  See above to fix.
Reply to
Rob Gaddi

How much non-military application code has actually been written in any dialect of ADA?

The only ones I know of were some Nokia MPS-10 systems intended for banking in the early 1980s.

Some countries, in addition to the USA, required ADA for their military applications in the same era.

Reply to
upsidedown

You can choose to include the parts of the run-time libraries that you want to use. The GNAT compiler has a "zero-footprint" mode where the run-time is empty; of course this means that the application must manage without some Ada features (tasking, mainly). GNAT provides a range of run-times of various sizes and capabilities, and you can configure your own selection too.
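
For example (a sketch with GNAT in mind; the restriction names and available run-times vary between versions, so check your compiler's documentation), configuration pragmas let you state what the program will not use, and that is what allows a small or empty run-time to be linked:

   --  e.g. in a configuration pragmas file such as gnat.adc
   pragma Restrictions (No_Tasking);                --  no tasks, so no kernel needed
   pragma Restrictions (No_Exception_Propagation);  --  exceptions must be handled locally
   pragma Restrictions (No_Allocators);             --  no heap allocation via "new"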

Same thing applies to C/C++: if you want to use threads/tasks, you must include a kernel of some sort. Is "tens of KB" a lot these days?

That said, the main problem with Ada on microcontrollers today is still the limited range of compilers/targets. I haven't tried it, but the common opinion is that porting the gcc-based GNAT, and its run-time system, to new targets is not an easy task, even if gcc already supports the new target. Some people have succeeded in targeting ARM Cortex, Atmel AVR, and TI MSP430. The run-time systems in these "amateur" ports are still rudimentary, without tasking support, I believe.

The AdaCore GPL GNAT for ARM Cortex has a very usable tasking subset (the "Ravenscar" profile).
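
To give an impression (a toy sketch; the package name and period are invented), Ravenscar code is typically a fixed set of library-level tasks running time-triggered loops:

   pragma Profile (Ravenscar);

   package Blinker is
      task Periodic;                    --  one static, library-level task
   end Blinker;

   with Ada.Real_Time; use Ada.Real_Time;

   package body Blinker is

      task body Periodic is
         Period : constant Time_Span := Milliseconds (500);
         Next   : Time := Clock;
      begin
         loop
            --  toggle an LED or do other cyclic work here
            Next := Next + Period;
            delay until Next;           --  only absolute delays under Ravenscar
         end loop;
      end Periodic;

   end Blinker;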

(Pro forma nit-pick: the name of the language is spelled Ada, after Ada Lovelace - see

formatting link
)

Typical uses are airplane control systems (civilian and military), air traffic control systems (mainly civilian), railroad control systems, space applications (civilian and military). I'm told that banking/finance applications exist. More listed at

formatting link

Some other resources: Reference manuals at

formatting link
Free compilers at
formatting link
also available in many Linux distributions (best support in Debian). Also available in the MinGW gcc environment for Windows,
formatting link

--
Niklas Holsti 
Tidorum Ltd 
niklas holsti tidorum fi 
      .      @       .
Reply to
Niklas Holsti

I've heard of an Ada to C translator and I've been meaning to look into it. I guess it would be like the early implementations of C++ that translated C++ to C. You'd translate your Ada code and then run it through a C compiler, using C as a portable assembler.

Reply to
Paul Rubin

My experience with Ada is rather limited, so you might end up teaching me rather than getting a good debate. But of course others will no doubt join in.

An obvious (but highly subjective) irritation with Ada is the verbosity of the language - lots of things need to be repeated, and many of the constructs are more wordy than necessary. Using words rather than symbols is not necessarily a bad thing, within limits - C++ arguably relies too much on symbols, as anyone trying to read a lambda function will know. But sometimes Ada reads as though you are chatting to the computer rather than programming.
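
To give a flavour of what I mean (a toy fragment, nothing more): the closing keywords spell out what they close, and the subprogram name has to come back at the end.

   with Ada.Text_IO; use Ada.Text_IO;

   procedure Count_Down (Start : in Integer) is
   begin
      for I in reverse 1 .. Start loop
         if I mod 2 = 0 then
            Put_Line (Integer'Image (I));
         end if;                        --  every "end" repeats its construct,
      end loop;                         --  and the final one repeats the name
   end Count_Down;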

Ada programming encourages the use of user-defined types for all sorts of things. You are not supposed to hold a "day" in an "int" or "uint8_t", you are supposed to define "type Day_type is range 1 .. 31;". Sometimes this sort of thing can make code clearer, but it can also make it harder to see what is really going on in the program. When you see a type like "uint8_t" or "int", you know exactly what it means - when the type is "Day_type", you have to think about it much more, and perhaps look up the definition. And because type conversions have to be explicit in Ada, you need to add lots of them when using these types. C and C++ do more of this automatically, giving clearer code.
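
A small example of both sides of that (types made up for illustration): the distinct types catch accidental mix-ups, but every crossing between them has to be written out as a conversion.

   procedure Dates_Demo is
      type Day_Type   is range 1 .. 31;
      type Month_Type is range 1 .. 12;

      D : Day_Type := 7;
      M : Month_Type;
   begin
      --  M := D;              --  rejected by the compiler: different types,
      --                           even though both would fit in a byte
      M := Month_Type (D);     --  allowed, but the conversion must be explicit
   end Dates_Demo;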

The run-time overhead in Ada can be an issue - the larger run-time library, the run-time checks, exceptions (I don't like them in C++ either).

Of course, the biggest disadvantages of Ada compared to C or C++ are not because of the language itself, but because of the relative popularities - there are fewer Ada tools, fewer Ada developers, fewer libraries, RTOS's, network stacks, etc., less example code, and so on.

And there are things that Ada has that I would love to see in C and/or C++, such as the ' symbol that lets you write Day_type'Last.
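
For example (reusing the made-up Day_Type from above), the attributes give you the bounds and the range without repeating the literal values anywhere:

   with Ada.Text_IO; use Ada.Text_IO;

   procedure Show_Days is
      type Day_Type is range 1 .. 31;
   begin
      Put_Line ("Last day:" & Day_Type'Image (Day_Type'Last));   --  prints: Last day: 31
      for D in Day_Type'Range loop      --  loops over 1 .. 31, taken from the type
         null;                          --  per-day work would go here
      end loop;
   end Show_Days;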

Of course it is a circular argument. Lots of things we use or do all the time are used primarily because they are popular, and they are popular primarily because people use them - despite there being technically superior alternatives. We program 8051 microcontrollers, type with QWERTY keyboards, use Windows systems, eat at McDonalds, and listen to Britny Spears - even though they are all hopelessly bad compared to alternatives, they exist because they are popular.

Reply to
David Brown

This is called "typeful programming" (search on the phrase) and the idea is it helps the compiler catch errors in the code.

After using Haskell for a while, I've gotten to hate automatic conversions. They make the code obscure and scary compared to explicit conversions. I never know if some unwanted conversion is going on in the background. If I passed the wrong type by accident and the compiler goes and converts it automatically, what's the point of having types?

I think this is a matter of what runtime profile you choose. Ada has lots of different profiles including some intended for small embedded processors that don't support fancier features like tasking (not sure about exceptions). Runtime checks for stuff like integer range errors are typically compile-time options.
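
For example (standard Ada pragmas; GNAT also has the -gnatp switch to suppress all checks at once), the checks can be switched off where you have convinced yourself they are not needed:

   procedure Fast_Path is
      pragma Suppress (Range_Check);      --  no range checks in this unit
      pragma Suppress (Overflow_Check);   --  no overflow checks either

      type Speed is range 0 .. 300;
      S : Speed := 0;
   begin
      S := S + 1;   --  no check code generated; correctness is now up to you
   end Fast_Path;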

True.

Saved to quote file :)

Reply to
Paul Rubin

But do you find it more difficult to *read* Ada because of this verbosity? (One of the basic rules of software engineering is that you must optimize for the reader, not for the author.)

Funny you mention lambdas in this context. I just finished giving a C++ course, which I revised to include C++11 features, including lambdas. I had never written any lambda until 4 months ago. They still feel strange to me, and my colleague who assisted in the course still thinks they are something totally weird.

But for the students a lambda seems to be much easier to understand than exceptions or virtual methods. I think experienced programmers (myself included) are often much less eager to adopt a new feature than novices.

Wouter van Ooijen

Reply to
Wouter van Ooijen

Sharp tools cut. What's your point?

Reply to
Clifford Heath

Read the rest of my posting, i.e. the bit you snipped.

If you can't be bothered to do that you are no better than a troll. Other people have succeeded and we've had a reasonable discussion. Please don't spoil it unless you have something to contribute.

Reply to
Tom Gardner

Even a sharp tool should have a comfortable and safe handle. You don't want the handle to have surprising sharp edges or to collapse if you hold it in the wrong way.

As a tool, a programming language has both a "handle" part, which is the rules and principles of the language, for the programmer to "hold" as she guides the tool by writing programs, and a "sharp edge" part, which is the running program, which changes the world around it and carves out the nuggets of desired results from the default chaos.

--
Niklas Holsti 
Tidorum Ltd 
niklas holsti tidorum fi 
      .      @       .
Reply to
Niklas Holsti

Nicely put, but I hope this doesn't degenerate into a lot of hot air about dubious analogies!

Reply to
Tom Gardner

I'm not sure if that is quite the case today. You can do actual "work" with GNAT for Mindstorms NXT, where you have the application and run-time for ARM Cortex M series in 64 KB. (The run-time may still account for a few tens of KB.)

When I target the 8 bit Atmel AVR MCUs I usually work with a "zero-footprint run-time".

Greetings,

Jacob

--
"I just might be wrong."
Reply to
Jacob Sparre Andersen

The (still working) Vermont Technical College CubeSat was developed that way - except that they actually developed in SPARK to ensure the absence of run-time errors.

Greetings,

Jacob

--
"Hungh. You see! More bear. Yellow snow is always dead give-away."
Reply to
Jacob Sparre Andersen

Yes, I find it harder to read (but again, I stress my limited experience - any language is easier to use after more practice). When reading C, I find the syntax and the common identifiers contrast with function names, variables, and other identifiers, making it easier to see the structure of the code. Ada just seems to have too many words for my liking - it reads like a school essay.

There is some evidence suggesting that the "errors per line of code" rate is fairly independent of the programming language - and with all other things being equal (which they seldom are), a more compact language will have a lower bug rate than a more verbose one. I believe this is simply a matter of the amount of information that you can easily see and process at a time - this is why there is a common rule of keeping your functions shorter than one screenful.

Of course I agree with you that making code easy to read and understand is important - languages (and identifiers in the language) should not be made short to save keystrokes.

I use lambdas regularly in Python - where they are defined using the keyword "lambda". So I am quite happy with the concept of lambda functions - I just think that the C++ syntax for them is going to take quite a while to get used to.

Reply to
David Brown

Yes, and it can sometimes be helpful in that way - but it can also mean that you need extra code to deal with the conversions, and that means extra scope for errors. It works both ways.

OK. I haven't tried Ada on embedded systems, and haven't looked at this in detail.

I hope you corrected the spelling of "Britney" before saving it! But it's nice to know I've written something of interest to someone - it's a rare thing on Usenet.

Reply to
David Brown

Do you know what Ada-to-C translation tools they used? What do the tools do about the Ada runtime? Thanks.

Reply to
Paul Rubin
