OO languages

Reply to
Niklas Holsti

Not a lot. Only 600 MHz and 64 MB. These days, that's nothing. :-) Actually, the OS, filesystem and TCP/IP stack are not very resource-consuming: depending on the number of threads, files, sockets, etc., the system needs are practically in the ~100K range. At a minimum, we can run from the Blackfin CPU's L1 memory.

Unfortunately, the Blackfin only has a rudimentary MMU. A full-featured MMU would be very useful; I will certainly consider a CPU with a real MMU for the next project like this.

You feel my pain...

The RTOS was written in C++. The multitasking and hardware-abstraction concepts fit nicely with the C++ paradigm; this was one of the arguments for using C++. I had been very frustrated with the nuisance of the C call interfaces in µC/OS-II and the ADI VDK before.
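As a rough illustration of that fit (my own sketch, not Vladimir's actual RTOS code; the class name and register layout are invented), a peripheral driver falls out naturally as a class whose constructor and destructor claim and release the hardware:

    #include <cstdint>

    // Hypothetical hardware-abstraction sketch: a UART modeled as an object.
    // Construction configures the peripheral; destruction shuts it down.
    class Uart {
    public:
        Uart(volatile std::uint32_t* base, std::uint32_t baud)
            : base_(base)
        {
            base_[DIVISOR] = clock_hz_ / (16 * baud);   // made-up register map
            base_[CONTROL] = ENABLE;
        }

        ~Uart() { base_[CONTROL] = 0; }                 // released on scope exit

        void put(char c)
        {
            while ((base_[STATUS] & TX_READY) == 0) { } // busy-wait until ready
            base_[DATA] = static_cast<std::uint8_t>(c);
        }

    private:
        enum { DATA = 0, STATUS = 1, CONTROL = 2, DIVISOR = 3 };   // word offsets
        static constexpr std::uint32_t ENABLE = 1, TX_READY = 1u << 5;
        static constexpr std::uint32_t clock_hz_ = 100000000;
        volatile std::uint32_t* base_;
    };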

A crashed application can very well screw up everything. However, if this happens, we fall into the bootloader, so we can recover.

That's the whole point: making application programming accessible to "dummies". The OO system is supposed to protect them from themselves.

I.e., the sort of intuitive

On the other hand, the bulk of a programmer's work is nothing more than legwork; it doesn't have to be done brilliantly, it just has to work somehow.

Just recently I had to fix a C project developed by "average Joes". It had the usual C problems: something not initialized, not enough memory provided for something, and array bounds overrun. So I think C++ is the better way to do things.
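For what it's worth, a minimal sketch (my illustration, not code from that project) of how the standard C++ containers take two of those failure modes off the table:

    #include <array>
    #include <cstdio>

    int main()
    {
        std::array<int, 4> samples{};     // value-initialized: nothing left uninitialized

        for (std::size_t i = 0; i < samples.size(); ++i)   // size() keeps the loop bound honest
            samples.at(i) = static_cast<int>(i) * 10;

        // samples.at(4) would throw std::out_of_range instead of silently
        // scribbling past the end of the array, as the equivalent C code did.
        std::printf("%d\n", samples.at(3));
        return 0;
    }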

Vladimir Vassilevsky DSP and Mixed Signal Design Consultant

formatting link

Reply to
Vladimir Vassilevsky

Personal experience. The language is just too "big" and clumsy. With good discipline, I think you can produce applications that are as reliable as anything developed under Ada *without* that burden.

I liken Ada to automobiles that *force* you to wear a seatbelt prior to starting. I.e., it imposes itself on the developer/application on the assumption that the developer won't Do The Right Thing (deliberately or unintentionally). This might be A Good Thing for defense contracts, very large applications developed by large teams with diverse capabilities and skill levels, etc. But it's a heavy price to pay for "everything" (and, apparently, Industry seems to concur with that).

[I wonder how long it would take to develop the iPhone if it had been written in Ada? :-/ ]

Imagine having to have a person sitting in your automobile so that it will start, just so *you* can check the level of the automatic transmission fluid. I.e., it's just an inconvenience that gets in your way.

Reply to
D Yuniskis

Not necessarily. C code-generating programs have been around since effectively forever, and so have methods that allow the compiler and other tools to refer all the way back to the real source text. That's what the #line directive was invented for.
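For readers who haven't met it, a tiny illustrative fragment of what generated code can look like; the source file name here is invented:

    /* Emitted by a code generator from "widget.dsl" (a made-up source file).  */
    /* The #line directive makes the compiler and debugger attribute anything  */
    /* that follows to widget.dsl, not to the generated file itself.           */
    #line 17 "widget.dsl"
    int widget_count = -1;   /* a diagnostic here is reported at widget.dsl:17 */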

Reply to
Hans-Bernhard Bröker

People who are not comfortable with pointers haven't done enough assembly language programming.

--
Made with Opera's revolutionary e-mail program:
http://www.opera.com/mail/
Reply to
Boudewijn Dijkstra

It is a mystery to me how recent Computer Science graduates are vaunted as experts on computers, yet haven't a clue about the actual operation of a computer at the assembly-language (or even machine-code) level.

Indeed, to understand the XOR subroutine for a PDP-8, you not only had to understand assembly language (as it was so coded), but also the operation of half and full adders. (You did an addition, and then subtracted the logical AND of the two input variables after that AND had been left-shifted.) This was much shorter than evaluating (A AND NOT B) OR (B AND NOT A).
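In other words, the routine relied on the identity A XOR B = A + B - 2*(A AND B): the shifted AND removes exactly the carries that the addition introduced. A quick illustrative check in C++:

    #include <cassert>
    #include <cstdint>

    int main()
    {
        // Verify A ^ B == A + B - ((A & B) << 1) exhaustively for small values.
        for (std::uint32_t a = 0; a < 256; ++a)
            for (std::uint32_t b = 0; b < 256; ++b)
                assert((a ^ b) == a + b - ((a & b) << 1));
        return 0;
    }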

Those of us who cut our teeth on assembly language find no difficulty in understanding the concepts inherent in any high level language, even those as arcane as LISP ("Lots of Infernal Stupid Parentheses")

Reply to
Phil O. Sopher

There have been several studies about that, though I don't have specific references.

--
Made with Opera's revolutionary e-mail program:
http://www.opera.com/mail/
Reply to
Boudewijn Dijkstra

...

Hmm... I'll grant you Lisp, which in its basics has a close correspondence to machine-level structures and algorithms, but what about Prolog? There is quite a gap between machine-level concepts and Prolog concepts.

On the other hand, few people seem to be taught Prolog, now, which is a pity. I found it mind-expanding.

--
Niklas Holsti
Tidorum Ltd
niklas holsti tidorum fi
       .      @       .
Reply to
Niklas Holsti

No it doesn't, but that's not relevant to my point, which is that by understanding the fundamental operations of a machine it is but a small step to see how those fundamental operations are used to synthesize other machines.

As for Lisp, see above.

Reply to
Phil O. Sopher

Computer science is more "fun" when you define how you wish the computer worked, and program that model.

Then someone like Symbolics comes along and makes a computer that actually does work that way.

Then someone like Sun comes along and makes a computer that doesn't work that way, but is enough faster that you can ignore that...

What was interesting was that in the other half of the lab, while the freshmen were wearing out the () keys on a collection of Apollos, juniors were hand-wiring modules into a 4-bit (or was it 8?) processor, which then software-emulated a 32-bit one at an effective clock rate of, oh, maybe a Hz or two. So in a way it did all eventually tie together, but the lack of initial pragmatism was a little shocking to someone who had grown up with small micros.

Reply to
cs_posting

It's been that way for at least 25 years. I remember dealing with CS grads back in the mid-80s who had no clue about low-level stuff. They didn't know what a "register" was, nor were they aware of how many CPUs used stacks for call/return linkage.

--
Grant Edwards                   grante             Yow! Am I in Milwaukee?
                                  at               
                               visi.com
Reply to
Grant Edwards

So you don't like it, and of course you have that right.

(By the way, was your experience with the original Ada, the 1983 form that is, or the updates from 1995 or 2005? Quite a lot of flexibility and power have been added in the newer forms of the language.)

No doubt. But I like the help I get from Ada to observe that discipline. I'm fallible -- sometimes more, sometimes less, but rarely zero. Doing it in C feels like harder work. In addition, Ada gives me tools that C just doesn't have: modularity and encapsulation with packages, strong separation of interface and implementation for operations *and* data structures, run-time error checking (if I want it).

Well, some industries use Ada, others don't. What was that saying about a billion flies :-)

We will never know. But see below for some indications...

It may feel like that sometimes, but there are several reports that indicate that in the end Ada helps programmers, in several contexts.

For example, John McCormick reports dramatically better results from students in a real-time programming course when they implemented a model-railroad system in Ada than in C. See

formatting link

Pratt & Whitney compared its production of software for military jet engines (initially mandated to use Ada) and for commercial engines (initially using whatever they wanted). The Ada side had twice the productivity and one-fourth the number of errors. See

formatting link
Ok, the commercial side used lots of assembly code. But according to this report, the government side of Pratt & Whitney is staying with Ada, although the "Ada mandate" has been lifted, so they could switch if they wanted to.

Stephen Zeigler compared the effectiveness of C and Ada programming at Rational, a compiler and SW tool business (now part of IBM), as Rational was gradually switching from C to Ada. See

formatting link
Some quotes: "Ada cost almost half of what the C code cost, and contained significantly fewer defects per 1000 SLOC by 7x the C code (700%). Even on the new C++ code, Ada still has 440% fewer defects."

In fairness, there are also studies that suggest that the languages are about equally effective. See

formatting link

YMMV, sure.

--
Niklas Holsti
Tidorum Ltd
niklas holsti tidorum fi
       .      @       .
Reply to
Niklas Holsti

By whom?

Academic CS seems quite happy to accept Dijkstra's maxim: "computer science is no more the study of computers than astronomy is the study of telescopes".

Beyond that, there is a common view that a focus on low-level details can be harmful.

In my experience, there is some justification for that view; e.g. I've seen experienced assembly-language programmers spend hours shaving clock cycles off the inner loop of an O(n^2) algorithm when a naive implementation of an O(n.log(n)) algorithm turned out to be significantly faster.
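A sketch of the kind of contrast meant here (my example, not the actual code in question): no amount of cycle-shaving in the first routine keeps it competitive with the untuned second one once n grows.

    #include <algorithm>
    #include <vector>

    // O(n^2): the inner loop can be hand-polished forever and it stays O(n^2).
    void tuned_insertion_sort(std::vector<int>& v)
    {
        for (std::size_t i = 1; i < v.size(); ++i) {
            int key = v[i];
            std::size_t j = i;
            while (j > 0 && v[j - 1] > key) { v[j] = v[j - 1]; --j; }
            v[j] = key;
        }
    }

    // O(n log n): the "naive" choice -- just call the library.
    void naive_sort(std::vector<int>& v)
    {
        std::sort(v.begin(), v.end());
    }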

I've seen the same programmers struggle as the industry has moved from a small number of large projects employing many programmers for years to millions of projects taking a single programmer a week.

Once upon a time, embedded development wasn't all that much different to developing for larger systems. These days, they're completely different fields. The skills required for embedded development are more likely to be taught on an electronic engineering course than computer science.

If you read the electronics groups, you'll find EEs complaining about current EE graduates understanding FPGAs and microcontrollers but not understanding the operation of a simple amplifier.

Reply to
Nobody

Does too :-) Even the basic functions "car" and "cdr" are named after, and originally depended on, the structure of a word or machine instruction in the computer on which the first Lisp was implemented ("car" = "contents of the address part of register", "cdr" = "contents of the decrement part of register", if I remember correctly).
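A sketch (mine, not the poster's) of why those primitives map so directly onto the machine: a cons cell is just two pointer-sized fields, and car/cdr are nothing more than a load of one field or the other.

    // Minimal cons-cell sketch in C++, purely for illustration.
    struct Cons {
        void* car;   // historically the "address" half of the machine word
        Cons* cdr;   // historically the "decrement" half
    };

    inline void* car(const Cons* c) { return c->car; }   // one memory load
    inline Cons* cdr(const Cons* c) { return c->cdr; }   // one memory load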

I agree that this understanding is important and helpful; even necessary if one wants to fully understand that there is nothing mysterious in the working of the higher-level machines such as the virtual or logical machines that define the semantics of imperative programming languages, even of high level languages. But...

No. The way a Prolog program works is so distant from the way assembly language works that the concepts of one are of no help in understanding the concepts of the other. Of course, in the end a Prolog program is executed by an assembly language (machine) program, but the connection between a given Prolog statement (rule) and the assembly instructions that implement such rules is very distant and not at all as simple as for an imperative language like C/C++/Ada/Lisp. For example, there are no assembly-level concepts that correspond to a "free logical variable" or to the "matching of logical clauses", not even roughly.

--
Niklas Holsti
Tidorum Ltd
niklas holsti tidorum fi
       .      @       .
Reply to
Niklas Holsti

Ah, I had assumed you also had magnetic media (for the file system and, by extension, VM). My bad.

I'm considerably over that limit. The OS provides lots of services -- and supports several "critical" services for applications. So, it's got a pretty heavy footprint. (e.g., HRT guarantees, VM support, tightly and loosely coupled multiprocessing, etc.)

Yes. I am a firm believer in this as it probably does more to "help" the developer than most language features! Of course, there are costs associated with its use but they are easily (IMO) outweighed by the extra tricks you can play...

*Exactly*! This is the "draw" (for me) to using C++ (or similar OO). It is just *so* much more elegant to be able to deal with OS constructs as tangible objects than as just "handles", etc. And, being able to augment their definitions with extra cruft to facilitate debugging (e.g., have a CString in the DEBUG version of each object that lets you tag the object's instantiation with something descriptive that you can later use the debugger to inspect.)
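A minimal sketch of that trick, assuming a plain-C++ setting (std::string standing in for CString; the class and tag names are invented): in DEBUG builds every OS object carries a human-readable tag the debugger can display, and in release builds the member compiles away.

    #include <string>

    class Mutex {
    public:
    #ifdef DEBUG
        explicit Mutex(const char* tag) : tag_(tag) {}   // tag visible in the debugger
    #else
        explicit Mutex(const char*) {}                   // tag compiles away in release
    #endif
        void lock()   { /* ... */ }
        void unlock() { /* ... */ }
    private:
    #ifdef DEBUG
        std::string tag_;   // e.g. "UART RX queue lock"
    #endif
    };

    // Usage:  Mutex uart_rx_lock("UART RX queue lock");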

The OS itself is object oriented but not implemented in an OO language. :(

But there is nothing that *detects* that something has gone awry? I.e., you rely on the errant application to clobber "something" that *eventually* causes the system to crash? (perhaps a watchdog brings you back to sanity?)

At one level, it *does* (e.g., uninitialized variables, walking off the end of arrays, etc. -- assuming a well-defined set of classes). But, on other levels, it can bring with it all sorts of invisible overhead that might be hard for someone not intimately familiar with the language to pick up on. (E.g., I am *constantly* startled by the presence of anonymous objects that materialize in my code -- albeit of a transitory nature.) As objects get "heavier" (e.g., adding debug support), each one of these that the compiler creates starts to hammer on memory...
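A small illustration of where those anonymous objects come from (my sketch; the function names are made up):

    #include <string>

    static std::string label(int n) { return "object#" + std::to_string(n); }

    static void log_line(std::string s) { (void)s; }   // by-value parameter: another copy

    int main()
    {
        // The '+' below constructs an unnamed temporary std::string, and the
        // by-value argument may cost one more copy.  None of these objects has
        // a name in the source, yet each one is constructed, destroyed and --
        // once the class gets "heavier" (debug members, etc.) -- hits the heap.
        log_line(label(42) + " : created");
        return 0;
    }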

I think that is true of work in a desktop environment. There are no *typical* time constraints nor resource constraints. If the application crashes, the user can "try again". You (or the OS) can provide feedback to the user in the form of dialog boxes, log messages, etc.

But, in an embedded system, the application often *must* work. It may be unattended (no one there to push the reset button) or perform some critical role, etc. Your user I/O may be seriously constrained so conversing with the user may be difficult or impractical (especially if the user isn't *there*!).

Again, I think you trade one set of problems for another. I'm just trying to figure out where the "least pain" lies :<

Reply to
D Yuniskis

Huh? 440% fewer defects? Doesn't "100% fewer defects" mean zero defects? What does 440% fewer mean?

--
Grant Edwards                   grante             Yow! I want to read my new
                                  at               poem about pork brains and
                               visi.com            outer space ...
Reply to
Grant Edwards

Dammit, Jim! I'm a computer scientist not a mathematician!

:-)

Reply to
Gil Hamilton

And there's nothing wrong with that until "academic CS" graduates start accepting jobs that require computer expertise. Not that the people hiring them don't deserve a lion's share of the blame.

--
Grant Edwards                   grante             Yow! Where's the Coke
                                  at               machine?  Tell me a joke!!
                               visi.com
Reply to
Grant Edwards

I think this is a consequence of these folks being trained as "programmers" instead of as "engineers". E.g., my background is as an EE where "logic elements" and "processors" were just building blocks like "transistors" and "rectifiers".

I.e., you can visualize how an op amp is "just" a bunch of Q's, R's, D's, etc. in a miniaturized form. OTOH, if your exposure to electronics was at the op amp level, someone had to go to *extra* lengths to show you what was "inside". Without that extra exposure, you simply were unaware of how the devices were built and, as a result, how they fundamentally operated as well as the reasons behind their limitations, etc.

However (playing devil's advocate), many people fail to "get out of the mud" and rise above these levels of detail. Or, they cling to their special knowledge of these low-level intricacies at the expense of benefiting from higher-level abstractions.

E.g., writing an OS in ASM nowadays (for all but trivial processors) is a self-indulgent waste of time. Comparable to debugging in *hex* (instead of using a symbolic debugger).
Reply to
D Yuniskis

Try designing with *analog* computers (a project from my high school days) :-/

Reply to
D Yuniskis
