OO languages

Presuming the "7x" means that Ada code had 1/7 (14%) the defects of C code, that's "86% fewer" not 700% fewer. Or am I completely misreading the sentence?

--
Grant Edwards                   grante             Yow! I'm a nuclear
                                  at               submarine under the

Yeah, bad language, ugh and sorry -- but I just quoted. Earlier in the quote, "700%" means the same as "7 times", so no doubt "440%" means that the C++ code had on average 4.4 times as many errors, per SLOC, as the Ada code. In other words, the error density in the Ada code was about 1/4 of that in the C++ code.

If we can pass over this question of proper use of per cent, the numbers are pretty impressive, right? So why isn't Ada used more? I suspect that many C/C++ programmers feel a bit offended by statistics like these; sort of thinking "If that's right, then I must be stupid to use C/C++... Are you calling me stupid? No way!"

In practice, there are many good reasons for choosing C in many embedded projects -- the compilers are often cheap and always available, there may be example code and legacy code in C, you stick with what you know, etc. But for larger projects, perhaps with a long life, these studies suggest that it could be cheaper to pay for Ada tools, perhaps even to select a processor for which an Ada compiler is available. And don't forget that GNU Ada (gnat) is available for many 32-bit targets, and there are Ada compilers that emit C code and so support almost any target processor (or so their vendors claim; unfortunately I haven't had the occasion to try them).

--
Niklas Holsti
Tidorum Ltd

Even then, such claims are ridiculous.

String processing in C is very error prone, because C does not have any real string handling -- it is no better than Fortran IV's, and even Fortran 77 is better.

But in the real-time environment, the need for string handling is quite minimal, so if the user interface needs something to display, that shouldn't harm the processing of the actual signal.

Paul


I didn't mean to imply that you were the author of the sentence, as it's clearly a quote from another document.

Yes. Taken at face value, they are impressive.

The only data point I have first-hand is from the late 80's. The first and only time I used Ada, the toolset and development environment were amazingly horrible. It was on a VAX/VMS system. Rather than a stand-alone toolchain that could be used alongside normal VMS stuff (text editors, file utilities, build systems, configuration management) like all the other compilers (Pascal, C, FORTRAN, etc.), the Ada compiler lived in its own completely isolated (and very crippled) "world". It had its own command-line interface, its own useless editor, build management system, source control system, and _even_its_own_file_system_. You could import/export files to the normal VMS filesystem, but it wasn't easy.

It was absolutely awful.

The intention was apparently that the Ada "environment" (more aptly called a "torture chamber") would be completely host-system independent and standardized, so that the user would have the same identically painful and unproductive experience under VMS as he would on Unix or OS/360 or any other host system. I don't know how successfully the goal of system-independence was achieved, but I can vouch for the fact that the result was almost impossible to use.

I had used Pascal for a lot of embedded stuff in the past and thought it worked very well, and I actually rather liked the Ada language. But it was _by_far_ the worst language implementation I'd ever used (including some pretty bad batch-mode, punched-card-based stuff).

Later Pascal's popularity waned and I had to start using C for embedded stuff. It was definitely a step backwards in terms of reliability and quality of resulting code.

I think the vast majority of embedded projects would be much better off using something like Pascal, Modula 2/3, Oberon or Ada instead of C or C++.

--
Grant Edwards                   grante             Yow! I wonder if I should
                                  at               put myself in ESCROW!!

From what I have read and heard, Ada appears to me to be a very interesting programming language. What has prevented me from investing time to learn it is that none of my clients use it.

Another consideration is the availability of programmers who are actually proficient with the language. A couple of times a year I speak with someone working for a company which has been using Ada in their products for more than a decade. That company is moving away from Ada, the most important reason being that they have difficulties finding Ada programmers. Since they often hire programmers on a temporary basis, having people with the required skills readily available is an important consideration.

Also the availability of libraries and tooling is something to consider.

The lack of an ecosystem like there is for C, and to a lesser degree C++, probably prevents Ada from gaining momentum. Without a sufficiently big player willing to invest to make Ada popular (like Sun did with Java and Microsoft did with C#), Ada will likely remain a niche language, despite its merits.

I suspect once a programming language is 'good enough', other aspects become more important than the qualities of the programming language itself.

--
Dombo

No, they are just numbers. As with any comparison involving languages and humans, the error rates (percentage, degree of error, deviations...) are VERY difficult to determine.

Akin to asking "for a thousand people exposed to swine flu, what are the relative percentages of those who are not affected, and all the other stages through to how many will die?" It depends on so many factors that each group of 1000 people has many other factors that make it NOT fit the pattern.

Are the defects related to:

- What was the sample size for each comparison?
- Which language and compiler, and how was the compiler used?
- Different competency levels of programmers?
- Different abilities/speed of the tools used?
- Different environments the programmers were working in?
- Different coding-standards environments?
- Were the defects due to differing specifications?
- Were the defects due to other problems beyond control (API etc.)?
- Were the defects due to feature creep and poor timescales?
- Were the defects due to poor design at the stages BEFORE coding?
- Were the defects due to poor or non-existent testing plans?

Many, many other factors!!!

Inertia to change, the supply of C/C++ programmers, not needing to cross-train development teams, time to market, perceived costs, the perceived applicability of the language -- no language is the 'magic bullet'.


I have a screwdriver with a 1-inch-wide blade; as this is for use with screws, all screws must therefore be the same.

What other library support is there for whatever the program is to be used in or with...

There are many reasons it might not be used -- not least that there are one hell of a lot of NON-32-bit targets.

--
Paul Carpenter          | paul@pcserviceselectronics.co.uk
    PC Services
[snips throughout]

Yup. I can recall my first Pascal program to control a Data I/O PROM programmer: writing things like "printHex()" to convert a 4-bit value to a printable hex character [0-9A-F]; then using that in "printByte()" to print an 8-bit value as a pair of hex characters; then using *that* to make "printAddress()" to print a string of 4 (or 8) bytes as a set of hex characters, etc.
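The layering is simple enough to sketch. A rough equivalent (hypothetical names, written in Ada rather than the original Pascal):

   -- Convert a 4-bit value (0-15) to a printable hex character.
   function Hex_Digit (V : Natural) return Character is
      Map : constant String := "0123456789ABCDEF";
   begin
      return Map ((V mod 16) + 1);
   end Hex_Digit;

   -- Print an 8-bit value as a pair of hex characters.
   function Hex_Byte (B : Natural) return String is
   begin
      return Hex_Digit (B / 16) & Hex_Digit (B mod 16);
   end Hex_Byte;

   -- A 16-bit address is then just two bytes, and so on upward.
   function Hex_Address (A : Natural) return String is
   begin
      return Hex_Byte (A / 256) & Hex_Byte (A mod 256);
   end Hex_Address;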

I've seen the same programmers struggle as the industry has moved from a ...

Wow! My experience has been exactly the opposite! Years ago (my first design was in the 70's) "software" was just another aspect of the hardware design. And, very much in *tune* with it. The processor was just a different way of doing something that you would otherwise do with sequential logic (e.g., build an FSM out of tonka-toy logic). It was EEs that did the work (at the time, my college didn't even offer a "CS" degree; it was "Electrical Engineering with the Computer Science option").

Nowadays, tools have made designing embedded systems *so* much more like designing for a desktop system. E.g., you don't have to burn ROMs, stuff them into your target, cross your fingers and *hope*. You don't have to rely on setting up "scope loops" so you can watch your code *with* a 'scope to try to figure out what is happening. You don't have to count *bits* of memory because you only had a hundred bytes (for a *big* system). You were lucky to have *an* assembler from *one* vendor for your processor. You had no choice of HLL's (I recall PL/M being "a big deal" and it's little more than "structured assembly language").

I think the current alleged problem is due to the fact that it *is* possible, nowadays, to do an embedded system design without even having the *hardware* available to run your code! (the idea of a simulator "way back when" was just a luxury to drool over; an *emulator* was something you shared with every other developer in the company as they were too damn expensive to provide "per seat")

The observation has been made (in other fields) that we suffer from the fact that we just "inherit" knowledge and don't go through the equivalent of an apprenticeship (like folks in The Trades). As a result, we don't get a "feel" for what we have "absorbed". And, don't get to appreciate what is behind it, etc.

I suspect this is largely true. However, the flip side -- the long apprenticeships, etc. -- would slow the pace of "progress" considerably in this field (i.e., how much have *toilets* changed in the past few hundred years...? :> )

--
D Yuniskis

[snips throughout]

I just want to "solve problems" with the most available options at my disposal. Ada limits the size of the problem that you can practically solve and the hardware on which you can solve it. E.g., are you going to design a control system for a microwave oven on a little 8 (or even *4*!) bit processor in Ada? Are you going to find a compiler vendor who will support "that" processor?

Or, are you just forced to use more hardware (than necessary) in order to use that toolset?

Quite old. I recall the Ada 95 stuff being *announced*, so this predated that.

You can get all of that with C -- but, you have to *work* at it! :>

I find the bigger bang (nowadays, with larger projects) is to put the effort into the OS. Let *it* serve as a safety net for the developer, etc. And, let it enhance the programming environment so the developer can call on richer subsystems to offload some problems at run-time that he would have to solve himself, otherwise.

For example, I've started implementing a "rational, decimal, variable precision math package" (yikes! that's a mouthful!) to offload some of the pesky "fringe areas" that standard math libraries don't handle well. Sure, a savvy user can order his math operators to preserve the most precision in his answers, etc. But, it seems easier to use resources (time and space) to provide a service that the developer can call upon to do these things "perfectly" with less effort.
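As a toy illustration of the "rational" part of that idea (this is not the actual package, just a sketch of keeping exact numerator/denominator pairs instead of rounded binary floats):

   type Rational is record
      Num : Long_Integer;
      Den : Long_Integer;  -- invariant: Den > 0
   end record;

   -- 1/3 stays exactly 1/3; no binary rounding ever occurs.
   function "*" (L, R : Rational) return Rational is
   begin
      return (Num => L.Num * R.Num, Den => L.Den * R.Den);
   end "*";

   function "+" (L, R : Rational) return Rational is
   begin
      return (Num => L.Num * R.Den + R.Num * L.Den,
              Den => L.Den * R.Den);
   end "+";

A real package would also reduce by the GCD after each operation and guard against overflow.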

Yes, but note that almost all of these are larger projects [included below for context] with large development staffs on "off the shelf hardware" (i.e., where the hardware was chosen as a consequence of the software requirements and not the other way around)

I'd be more impressed if Apple had claimed that they had developed the iPhone using Ada (etc.)

--
D Yuniskis

Ah, why can't you just define them as static, file scope?

It was probably up late the night before! ;-)

--
D Yuniskis

Exactly! I routinely do things through pointers (including changing program flow) as they are often an excellent efficiency hack. And, they often let you do things that would otherwise be very difficult or clumsy to do (e.g., "Here's a pointer to a filter function that takes a pointer to a struct of data specific to that filter function and returns ...").
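For what it's worth, that filter-function idiom is expressible even in Ada, via access-to-subprogram types; a minimal sketch with made-up names:

   type Filter_State is record
      Gain : Float;
      Last : Float;
   end record;
   type State_Ptr is access all Filter_State;

   -- A pointer to a filter function that takes a pointer to the
   -- state specific to that filter and returns the filtered sample.
   type Filter_Func is access
      function (S : State_Ptr; Sample : Float) return Float;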

I find many modern languages that *don't* let me use pointers (for fear that I will "hurt myself") to be very irritating. Sure, they usually provide a way to do something functionally equivalent but often at some extra *cost*.

--
D Yuniskis

For what reason? We have Matlab and Simulink for this today, don't we? :-D

--
Cesar Rabak
GNU/Linux User 52247.

I did that once. In a minicomputer Basic, with the I/O directed out the terminal Aux port to the Data I/O. That particular Basic had no HEX$ function, so I had to roll my own. I had to go look up just how it did it -- it's been quite a while since I wrote the program. :-)

--
ArarghMail908 at [drop the 'http://www.' from ->] http://www.arargh.com
BCET Basic Compiler Page: http://www.arargh.com/basic/index.html

One of the several "restriction" pragmas that you can apply to an Ada program is No_Implicit_Heap_Allocations. This tells the compiler to complain about any code that would make it allocate heap memory without an explicit "new" in the source code.
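Concretely, it is a configuration pragma -- for GNAT, typically placed in gnat.adc or at the top of a compilation unit. A minimal sketch (the procedure is hypothetical):

   pragma Restrictions (No_Implicit_Heap_Allocations);

   procedure Demo is
      Buf : String (1 .. 64);   -- fine: statically sized, on the stack
   begin
      Buf := (others => ' ');
      -- Any construct that would make the compiler allocate heap
      -- memory behind your back (without an explicit "new") is now
      -- rejected at compile time.
   end Demo;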

--
Niklas Holsti
Tidorum Ltd

I wonder what "Ada still has 440% fewer defects" means. Or perhaps this is a defect?

But if "7 times fewer" is called "700% fewer", then 440% means 4.4 times fewer, i.e., about 77% fewer errors. That doesn't sound quite as dramatic.

Programmers and those who would wield statistics should aim for clarity.

There was someone who attended an astronomy lecture. During it, he thought he heard the lecturer say the sun would expand to engulf the world in 10M years. Afterwards he checked this with the lecturer. No, the lecturer said, it was 10B years. "Ah, that's OK then."

--
Bill Davy

The size of the language does not exclude small target systems. You can enjoy the static compile-time strengths of the "large" language with no or minor impact on the size of the target code. Of course, you have to limit your use of features that require run-time support, but that's true of all languages -- libraries take space.

I have used Ada on projects for 16-bit computers (MIL-STD-1750A) with 64 kW of memory, and the run-time overhead from the "large" language was very minor.

The language also helps you with small systems. It lets you define the size of your variables and lets you pack structure components in the minimum number of bits. You can do some of that in C, but not all of it.
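For example (a minimal sketch with made-up field names), record representation clauses let you nail down both sizes and bit positions:

   type Channel_Number is range 0 .. 15;

   type Status is record
      Ready   : Boolean;
      Error   : Boolean;
      Channel : Channel_Number;
   end record;

   -- Pack the whole record into a single byte, bit by bit.
   for Status use record
      Ready   at 0 range 0 .. 0;
      Error   at 0 range 1 .. 1;
      Channel at 0 range 2 .. 5;
   end record;
   for Status'Size use 8;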

The GNU Ada compiler is available for the 8-bit AVR.


The SofCheck AdaMagic compiler emits C code, so if your processor has a good C compiler, you are OK.


Well, that can be a good approach, whatever your programming language. It seems an orthogonal thing to me.

True, except for the model railroad projects. But small projects don't generate such statistics, and I suspect that skeptics would find it even easier to dismiss small-project statistics out of hand as insignificant.

Yes, that would have been nice. But saying that "X cannot be good, because Y is not using X" is a vicious-circle argument, at best.

--
Niklas Holsti
Tidorum Ltd

Yes, some/most of the early Ada implementations were horrible, as many people have attested.

Thankfully, and as I'm sure Grant knows, the current implementations are much better. The GNU Ada compiler gnat is a bona fide gcc citizen, using ordinary source and object files and your choice of editor or IDE, and it can link to and from C/C++ code.
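Calling C from Ada, for instance, is just an import pragma; a minimal sketch (the C symbol name is hypothetical):

   with Interfaces.C;

   procedure Call_C is
      -- Bind an Ada name to an external C symbol "c_driver".
      procedure C_Driver (Value : Interfaces.C.int);
      pragma Import (C, C_Driver, "c_driver");
   begin
      C_Driver (42);
   end Call_C;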

--
Niklas Holsti
Tidorum Ltd

To be sure, it is a defect in the English of the quotation. But it is a usage of per cent that seems to be rather common.

I would say that reducing errors by three quarters is "good". Furthermore, according to Zeigler's report, the errors in the Ada code were significantly faster and cheaper to correct than the errors in the C/C++ code.

Indeed.

--
Niklas Holsti
Tidorum Ltd

And once you've chased people to a different language, they tend to stay there.

I haven't used any Ada implementations since, but I was pretty sure that was the case. Unfortunately, by the time reasonable (in both price and usability) Ada implementations came out, C had already taken over the embedded space from Pascal -- which was very popular back in the Z80 and CP/M days. I remember when Tektronix sold some _very_ nice embedded development tools (HW and SW) centered around Pascal compilers. The labs we did in my undergrad days using a high-level language on 8-bit machines used Pascal and HP in-circuit emulators.

Someday I'd really like to try an embedded project in GNU Ada (or maybe Modula-3 or Oberon), but I doubt I'm ever going to be able to convince an employer to pay for the effort. I can already hear it...

"Where are we going to hire people with Ada experience?"

--
Grant Edwards                   grante             Yow! Someone in DAYTON,
                                  at               Ohio is selling USED

Hire good programmers and train them in Ada using some of the time and money you will save by reducing error density by 75% :-)

There is some evidence that learning Ada improves programming skills in general, even if the student then uses other languages. I remember an anecdote told by a US professor of CS who found that those of his graduates who had studied with Ada as their main language were later promoted more quickly in their industrial C/C++ jobs than those who had studied using C/C++ (sorry, I don't have a reference). I know this sounds a bit like "BASIC damages your brain", and for all I know it may be explained simply by the Ada students learning one *more* language than the C/C++ students did -- every language increases your skills -- or by differences in the teachers. This was not a double-blind experiment.

--
Niklas Holsti
Tidorum Ltd

There is a watchdog, heaps/stacks are monitored, and there is the CPU exception mechanism, which can detect some of the problems; however, there is no hardware memory protection, due to the lack of an MMU in the Blackfin. So a runaway pointer can cause a lot of trouble. That kind of problem is extremely difficult to troubleshoot in a multitask system.

--
Vladimir Vassilevsky
DSP and Mixed Signal Design Consultant

