OO languages

Hi,

I've revisited this subject (and question) far too many times in my career -- yet, I keep *hoping* the answer might change (sure sign of insanity? :> )

I have a largish application that *really* would benefit (from the standpoint of maintenance) from the use of something like C++. Currently, I use an OO programming *style* but write the code entirely in C. I just find it too damn difficult in C++ to keep track of all the magic that goes on behind the scenes, so getting deterministic behavior (RT application) is much easier in C (i.e., I can more accurately visualize what the machine is doing each time I write a statement -- without having to worry about whether an anonymous object is being created as a side-effect of that statement).

It also makes it much easier for me to keep track of physical resources (it seems like I'd be constantly overloading new() in a C++ implementation just to make sure I can corral each wayward instance into a known part of memory, etc.)
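
A minimal sketch of that kind of overload (the class name and pool size are invented for illustration, and alignment handling is omitted) -- a class-specific operator new that bump-allocates from a fixed, known region:

    #include <cstddef>

    class Sample {                     // hypothetical class, for illustration
    public:
        static void* operator new(std::size_t size) {
            // bump-allocate from a fixed, known region -- never the heap
            if (used + size > sizeof pool)
                for (;;) {}            // exhausted: trap (policy is system-specific)
            void* p = pool + used;
            used += size;
            return p;
        }
        static void operator delete(void*) {}  // instances are never freed
    private:
        static unsigned char pool[4096];       // the "known part of memory"
        static std::size_t used;
        int payload;                           // whatever the object holds
    };

    unsigned char Sample::pool[4096];
    std::size_t Sample::used = 0;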

But, doing things in such a "low level" language (C) will ultimately make the code harder to maintain for those that follow me. It really would be nice if I could leverage some OO features to relax some of the discipline that I have had to impose on the code doing "OO under C".

I am sorely tempted to go back and rewrite the OS itself, as a first exercise, just to see how damaging this venture might become (in terms of code size, runtime performance and reliability). But, that's a fair bit of work (the image is a bit over 500K currently) and I'd hate to undertake it naively.

Are there any folks who have successfully deployed larger applications in an OO language? No, I'm not talking about desktop apps where the user can reboot when the system gets munged. I'm working in a 365/24/7 environment so things "just HAVE to work".

Any tips you can share that can help me get C-like behavior from a C++-like implementation? (besides the obvious: "use only the C subset of C++" :> )

Thanks!

Reply to
D Yuniskis
[...]

We have a relatively big embedded system (several megabytes of source code: filesystem, TCP/IP, RTOS, etc.) developed by a team of five programmers. This system works in the field, 24/7, unattended. At the very beginning, there was the usual trivial argument about C vs. C++, and it was decided to use C++. Now I can say that was a wise choice; it would be difficult to tackle the complexity in C.

"There is no and there can't be any substitute for the intelligence, experience, common sense and good taste" (Stroustrup).

Vladimir Vassilevsky DSP and Mixed Signal Design Consultant

formatting link

Reply to
Vladimir Vassilevsky

A thought: Have you considered who is likely to follow you, and how they will be recruited? Consider that the job posting will (if you move to C++) likely say "Mandatory 5+ years C++ programming experience in an embedded environment". Ponder the fact that the qualifier "in an embedded environment" might just mean writing trivial UI code for some product that's a PC-in-a-box.

Reply to
larwe

Personally, what I dislike is having to mention "objects" in multiple places. What I want is a system where I can instantiate things inline as I need them ("make me a menu with these choices"), but have all the allocations be predetermined during compilation (no runtime memory management surprises) and summarized for me.
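
One way to get close to that in C++ (a sketch; the Menu/Choice names and handlers are invented) is to let the inline declaration *be* the allocation -- a file-scope const aggregate, so every entry is sized and placed at compile/link time and nothing happens at run time:

    struct Choice {
        const char* label;
        void (*action)();       // handler invoked when the choice is picked
    };

    void start();               // hypothetical handlers
    void stop();

    // "make me a menu with these choices" -- storage fixed at link time:
    static const Choice main_menu[] = {
        { "Start", start },
        { "Stop",  stop  },
    };
    static const unsigned main_menu_len = sizeof main_menu / sizeof main_menu[0];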

The android phone I've been playing with comes close... pseudo-java with an active garbage collector...

Reply to
Chris Stratton

I was software lead for a series of projects that brought my then-employer into using C++ for largish code bases. They have been quite successful, although there have been rough edges, too.

Tips for success?

  • You're already aware of the heap problem. I solve this not so much by avoiding 'new' as by _never_ allocating off of the heap inside a task loop. I'll allocate off the heap in start-up code (i.e. use 'new', but never 'delete'), I'll allocate off the heap in 'tweak and tune' code (i.e. code that'll be used by engineering, service and manufacturing personnel, but never by a customer in normal operation), and if pressed I'll make a pool of blocks and overload 'new' -- but I do the latter only rarely. (A sketch of the start-up-only pattern follows this list.) Of course, you can only do this if you can allocate all required memory statically -- but if you can't, you probably don't have enough memory anyway.

  • Don't reuse code too early. The company has a great big pile of very useful reusable code, all of which was written the second or third time around in its current form. Reusable code is an art that requires you to capture just the right feature sets in your interfaces, and put deep thought into just what belongs where. It's not for everyone, but if you can do it you can _really_ leverage it to your advantage.

  • Most embedded C++ compilers have gotten pretty efficient with core language constructs (I'm told that even exception handling isn't so bad anymore, but I haven't tried it). _Don't_ use the standard libraries with anything but extreme caution, however -- if you want to find yourself pulling in everything including at least ten kitchen sinks, start using the STL in your embedded code.

  • Beware C programmers masquerading as C++ programmers, and beware desktop programmers masquerading as embedded programmers. When we were doing this we were embedded folks teaching ourselves C++, and we had much better success taking good, 'virgin' embedded C programmers and tossing them at C++ -- the "it looks like C but compiles when you name it .cpp" crowd is bad, and the "what do you mean I don't have virtual memory?" crowd is worse.

  • If you can, start small. Just use the basic OO features of C++, possibly in just a portion of your code. Grow from there. Trying to do a giant project at the same time that you learn C++ will be awkward -- although if you're already using OO design techniques it may not be as bad as it sounds.
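
As promised above, a sketch of the start-up-only allocation pattern (all the type and function names are invented): 'new' appears exactly once per object, before the task loop starts, and 'delete' never appears at all.

    // invented types, just to show the shape of the pattern
    struct UartDriver {
        void put(char) { /* write to the hardware FIFO */ }
    };
    struct Logger {
        explicit Logger(UartDriver& u) : uart(u) {}
        void poll() { /* drain queued messages via 'uart' */ }
        UartDriver& uart;
    };

    static UartDriver* uart;
    static Logger*     logger;

    void init() {                  // runs once, at system start-up
        uart   = new UartDriver;
        logger = new Logger(*uart);
        // no further 'new' beyond this point; 'delete' is never used
    }

    void task_loop() {
        for (;;)
            logger->poll();        // steady-state code allocates nothing
    }
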
--
www.wescottdesign.com
Reply to
Tim Wescott

The Linux kernel also uses an OO programming style in C, and it works quite well. Maybe you can look at some of the design ideas (if you're not already familiar with them).
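
For anyone who hasn't seen it, the core of that kernel-style pattern looks roughly like this (a sketch with invented names, not actual kernel code): a const table of function pointers plays the role of the vtable, and each "subclass" supplies its own table:

    struct device;                           /* forward declaration */

    struct device_ops {                      /* hand-rolled vtable */
        int  (*read)(struct device*, char* buf, int len);
        void (*close)(struct device*);
    };

    struct device {                          /* the "base class" */
        const struct device_ops* ops;        /* set once, at init time */
    };

    /* "virtual" dispatch, spelled out by hand: */
    static int device_read(struct device* d, char* buf, int len) {
        return d->ops->read(d, buf, len);
    }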

Reply to
Arlet

Beware, here be dragons. One of the more successfully disguised ones is: how do you make sure you don't run out of stack at run-time, if basically all function calls go through run-time evaluated function pointers?

Now that'll need some explaining. If you've used static stack size determination tools before, you'll have noticed that basically none of them can follow calls made via function pointers or recursive calls (and ultimately, it's provably impossible to do so anyway). But in bona-fide OO code every other function call gets dispatched via some object's method table (a.k.a. "vtable"), i.e. deep down it's a function pointer.

So from the point-of-view of static analysis, your whole OO program falls apart into many disconnected shreds of call tree, and there's no way left to put it all back together.

Reply to
Hans-Bernhard Bröker


Theoretically, the problem is not nearly as bad as unbounded function pointers. A virtually dispatched function will come from a precisely defined set of possibilities -- the classes derived from the base in question -- and thus is no worse than a switch statement selecting a particular function to call (which obviously is amenable to worst-case stack depth analysis).

Obviously the ability of a particular tool to follow that construct through the entire code base is an issue.

And it's only an issue with (base) classes with virtual functions, which does limit the scope somewhat.
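
To put that in code (a sketch; the shape classes are invented): if the program only ever derives Square and Circle from Shape, the call below has exactly two possible targets, and worst-case stack depth is the larger of the two:

    struct Shape {
        virtual ~Shape() {}
        virtual double area() const = 0;
    };
    struct Square : Shape {
        double s;
        double area() const { return s * s; }
    };
    struct Circle : Shape {
        double r;
        double area() const { return 3.14159265 * r * r; }
    };

    double total(const Shape* const* shapes, int n) {
        double sum = 0.0;
        for (int i = 0; i < n; i++)
            sum += shapes[i]->area();   // target is one of a closed, known set
        return sum;
    }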

Reply to
robertwessel2

The answer *will* eventually change, because the environment (the number and popularity of languages) is changing.

I like to keep a mental distinction between design and implementation, and I don't find myself too limited by designing in an object-oriented style and coding in C. Well, that is, there was a time when I didn't. More recently I've been doing some higher-level projects and using higher-level, more dynamic languages, and I have to say, they certainly have some appeal. Have you completely discarded the possibility of using more than one language for your project? Just as in days of yore it was common to code the high-level parts in C and fall back to assembly language for the tricky parts, I find it pretty effective to code the high-level parts in C++/Java/scheme/whatever and fall back to C for the pieces that I really have to be certain about. All of the "higher level", OO languages have good ways to interact with C code.
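
The usual glue for that split is an extern "C" boundary; a sketch (the DMA function names are invented), with the timing-critical parts compiled as plain C and called from the C++ upper layers:

    // C++ application layer
    extern "C" {
        // implemented in plain C, compiled with the C compiler
        int  dma_start(unsigned channel, const void* src, unsigned len);
        void dma_abort(unsigned channel);
    }

    void send_frame(const void* frame, unsigned len) {
        if (dma_start(0, frame, len) != 0)
            dma_abort(0);   // error policy is system-specific
    }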

Consider that if your "OO under C" is making your C too complicated, then you are probably going into one of those areas where your C++ or whatever would also be difficult to reason about or understand.

I know of at least two RTOSes that have been written in C++ (OK-L4 and eCos), and there have been a few in the past in Modula-3 and Java, so even automatic garbage collection can be managed for some systems.

When a similar discussion came up on Comp.arch last week, there was quite a bit of vocal support for both contemporary Ada and D, both of which are on my list to try. I suspect that I'll be getting to D first, because I've been reading about a project on FreeBSD that will be using it...

Like Tim said: watch out for allocation and avoid deallocation, and be wary of exceptions. To that I'd add: be gentle with the operator overloading: "cool" is the enemy of understandable. That still leaves you with first class inheritance and polymorphism, which are IMO the big check-boxes that you're looking for, to support object oriented design (and which aren't easy to do "nicely" in C).

Cheers,

--
Andrew
Reply to
Andrew Reilly

"Minos - The design and implementation of an embedded real-time operating system with a perspective of fault tolerance" (IMCSIT 2008):

formatting link

Minos was written in Oberon-07. Oberon-07 doesn't have the full set of OO features of its cousin Oberon-2. However, it gives you the same level of control as C without the associated risks.

-- Chris Burrows CFB Software Armaide v2.1: ARM Oberon-07 Development System

formatting link

Reply to
Chris Burrows

-- snip --

I _only_ use operator overloading for well-defined arithmetic types (i.e. matrices & vectors), as already done in the STL (i.e. the '>>' operator for input and the '<<' operator for output).
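
I.e., roughly this kind of thing (a sketch), where the overload mirrors the blackboard notation and does nothing else:

    struct Vec3 {
        double x, y, z;
    };

    // '+' means exactly what it means in the math -- no hidden surprises
    inline Vec3 operator+(const Vec3& a, const Vec3& b) {
        Vec3 r = { a.x + b.x, a.y + b.y, a.z + b.z };
        return r;
    }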

Reply to
Tim Wescott

With a filesystem, I am assuming (not a "given") that you have lots of resources available (?). E.g., does the OS support VM? Or, are all of the tasks (processes) running on it known to have bounded resource requirements?

Was the OS written in C++ or just the applications?

Does the OS have provisions to detect (and recover from) crashed applications? Or, does a crashed application bring the system to its knees?

Agreed. But, the problem (IMO) with C++ (or other 4G languages) is that it is often hard to find folks who *really* know what's happening "under the hood". I.e., the sort of intuitive understanding of exactly what the compiler will generate for *any* arbitrary code fragment.

E.g., I have been writing in C++ for many years now and I am constantly surprised by things that happen "unexpectedly". There are just way too many little things going on that catch me off guard. If I am writing for a desktop environment, I can usually shrug and deal with it. But, when I have a fixed, tightly constrained set of resources to work within (TEXT, DATA and "time"), these "judgment lapses" quickly get out of hand. :<

(sigh) But said by someone working for a firm with lots of re$ource$ to devote to staff, etc. Things seem to be considerably different in the "real world" (I am continually disappointed with the caliber of the "programmers" I meet... "just get it done" seems to be their mantra -- note that "right" is not part of that! :< )

It is exactly this problem that has me vacillating about whether a "highly structured C approach" would be better or worse than doing it in C++ (or other 4G HLL). I.e., which are "average Joes" least likely to screw up? :-/

Reply to
D Yuniskis

Exactly. But, the same can apply to a C implementation. People seem to think (or, perhaps, just *claim*) they know more than they do about language particulars. I am inherently leery of anyone who doesn't have a hardware background and hasn't spent a fair bit of time writing assembly language code -- just so they have a good feel for what the "machine" really is, can do, etc. But, hiring often gives way to expediencies under the assumption that "what they don't know, they can *learn*"...

Reply to
D Yuniskis

Well, sometimes that just isn't possible. Or, at least not easily automated (especially if you are resource constrained). Memory management is always a tricky subject. At upper application levels, you can afford (somewhat) to let the application deal with memory problems (running out of heap, etc.). At lower levels (e.g., within the OS), you often have to "guarantee" that "/* Can't Happen */" REALLY CAN'T HAPPEN!

I think anything that relies on GC will bite me as it is hard to get deterministic performance when you don't know how/when the GC will come along. Hence my suggestion that new() be heavily overloaded (probably relying on lots of "buffer pools" for the corresponding objects) just to make sure "automatic allocation" can work in a reasonably deterministic fashion (like purely static allocation)
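
A buffer pool of that sort might look like the following sketch (sizes and names invented; BLKSIZE must be at least sizeof(void*), and alignment is glossed over). Both alloc() and free() are O(1), so the timing is as predictable as static allocation:

    #include <cstddef>

    template <std::size_t BLKSIZE, std::size_t NBLOCKS>
    class Pool {
    public:
        Pool() : head(0) {
            // thread every block onto the free list once, at start-up
            for (std::size_t i = 0; i < NBLOCKS; i++) {
                *reinterpret_cast<void**>(store + i * BLKSIZE) = head;
                head = store + i * BLKSIZE;
            }
        }
        void* alloc() {              // O(1): pop the free list
            if (!head) return 0;     // exhausted: caller's policy
            void* p = head;
            head = *reinterpret_cast<void**>(head);
            return p;
        }
        void free(void* p) {         // O(1): push back onto the free list
            *reinterpret_cast<void**>(p) = head;
            head = p;
        }
    private:
        unsigned char store[BLKSIZE * NBLOCKS];
        void* head;
    };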

But, I've had other folks mention android to me so it is probably worth poking under the hood...

Reply to
D Yuniskis

The problem that I see with C++ is that it really *likes* to create anonymous objects, etc. So, it goes to the heap often (unless you are very careful and manually create each object that will be used in a computation/operation and later destroy it; but, this just makes the allocation more obvious -- it doesn't prevent it!).
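
For instance (a sketch with a toy Matrix type): the first form below manufactures an anonymous temporary to hold a + b before the assignment; the second computes in place and creates none:

    struct Matrix {
        double m[3][3];
        Matrix& operator+=(const Matrix& o) {
            for (int i = 0; i < 3; i++)
                for (int j = 0; j < 3; j++)
                    m[i][j] += o.m[i][j];
            return *this;
        }
    };
    inline Matrix operator+(Matrix a, const Matrix& b) { return a += b; }

    void demo(const Matrix& a, const Matrix& b) {
        Matrix c = a + b;   // '+' manufactures an anonymous temporary
        Matrix d = a;       // copy...
        d += b;             // ...then accumulate in place: no temporary
    }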

Since resources are bounded, I have to think carefully about what memory needs will be, *who* will need them and *when*. Often, this lets me reuse memory regions among mutually exclusive "tasks" (generic sense of the word) for an overall economy.

E.g., I cheat and redirect file system operations directly to the actual (solid state) media (e.g., SD card, CF, etc.) instead of pulling the data into "real" memory. It takes a performance hit in some cases but eliminates that otherwise redundant copy of the "file data".

I don't worry about reusing code. I spend more time concentrating on reusing *algorithms*. It's hard to reuse code when you may be writing for different languages, different machines, etc. OTOH, (re)using an *approach* to a problem saves you all the engineering time (coding a known algorithm usually is pretty trivial).

Yes, I have found the STL to be incredibly *un*useful. It's too generic and too "fat" for most applications (perhaps desktop applications can afford this?)

Exactly! And, without meaning any disrespect to any of those folks, often they simply don't *know* the differences (i.e., what they *don't* know)

Yes. There is a very different mindset between the different camps. Desktop programmers tend to do things "easier" (for themselves) and run through lots of resources that just aren't available in an embedded environment. And, they probably have never actually looked at how well their code performs (e.g., what is your maximum stack penetration? where are your timing bottlenecks? how many page-ins/outs are happening under light load vs. heavy load? etc.)

You can often catch these folks with trivial "tests" that highlight some aspect of a language (or "operating environment") that they might not be aware of. For example:

    for (i = 0; i < MAXI; i++)
        for (j = 0; j < MAXJ; j++)
            foo[i][j] = ...;

vs. the "identical":

    for (j = 0; j < MAXJ; j++)
        for (i = 0; i < MAXI; i++)
            foo[i][j] = ...;
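
(The trap, spelled out: C stores foo in row-major order, so the first form walks memory sequentially while the second strides MAXJ elements per step and thrashes the cache. A complete version of the test might look like:)

    #define MAXI 1000
    #define MAXJ 1000
    static int foo[MAXI][MAXJ];

    void fill_fast(void) {           /* row-major order: sequential access */
        for (int i = 0; i < MAXI; i++)
            for (int j = 0; j < MAXJ; j++)
                foo[i][j] = 0;
    }

    void fill_slow(void) {           /* strides of MAXJ ints: cache-hostile */
        for (int j = 0; j < MAXJ; j++)
            for (int i = 0; i < MAXI; i++)
                foo[i][j] = 0;
    }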

*Learning* C++ isn't the issue. The issue is deciding whether the risks (of subsequent development/maintenance efforts) of using *it* are greater than the risks of using highly structured C.

There are many cases where I have found that C++ *looks* like the right way to approach a problem, but the performance issues just make it incredibly WRONG for that problem. Just looking at the C++ code makes that solution very tempting -- until you actually profile it and compare it to an equivalent algorithm in C.

E.g., I wrote a gesture recognizer in C++. It was *slick*. But, almost twice the size and three times slower than the identical algorithm implemented in C. End result: figure out how to gain the structure advantages of C++ without dragging in all of its *cost*. :-(

Reply to
D Yuniskis

Ah! I wasn't aware of that! Thanks! I *think* I have a good approach (in my "highly structured C" approach). Things *work* well. I'm just afraid others won't be able to keep up that discipline or understand fully what is going on (e.g., lots of function pointers -- people seem to not like pointers... let alone pointers to functions! :< )

Reply to
D Yuniskis

I only have to "worry" about those issues within the OS. An application can run out of stack and the OS will compensate. It will allocate additional stack space as required. And, if it can't do this, it will stop the offending application and notify it of the problem. It can then be restarted with a more generous resource request (if *that* is too much for the system to accommodate, then the application isn't started, etc.)

I am particularly fond of recursive algorithms so I am used to having to manually analyze stack penetration (since there is no way to really tell the "application" what types of input it *will* encounter). In fact, I have another post pending that tries to address bounding what would otherwise be an automatic variable's allocation (to get deterministic performance from the algorithm in the face of unbounded input)
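
One way to get that bound (a sketch; the tree type and depth limit are invented): trade the call stack for an explicit, fixed-depth stack, so the worst case becomes a compile-time constant and overflow becomes a reportable event instead of a crash:

    struct Node { struct Node *left, *right; };

    enum { MAX_DEPTH = 32 };   /* chosen bound; must cover worst-case input */

    int visit_all(struct Node* root) {
        struct Node* stack[MAX_DEPTH];
        int sp = 0;
        struct Node* n = root;
        while (n || sp > 0) {       /* iterative in-order traversal */
            if (n) {
                if (sp == MAX_DEPTH)
                    return -1;      /* bound exceeded: report, don't crash */
                stack[sp++] = n;
                n = n->left;
            } else {
                n = stack[--sp];
                /* ... process n ... */
                n = n->right;
            }
        }
        return 0;
    }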

Reply to
D Yuniskis

Exactly. But, I find most C programmers find this difficult to grasp. I.e., it *looks* (to them) like lots of extra complexity and "machinery" -- especially because *they* must do all of the things that the C++ compiler would have done. You get this, "why bother?" look from them...

There are several languages at play in the product. At the highest levels, I use much more modern approaches to problems -- but, the problems are usually much "simpler" (e.g., the OS and the services that are directly bundled with it is, by far, the most complicated piece of code). E.g., I can compute pi to any arbitrary precision in a few lines of code. OTOH, guaranteeing that a TCP/IP connection is serviced at the right relative priority wrt other active connections takes considerably more code! :>

Exactly. I think a C++ guru *might* be willing to instinctively code "appropriately" for this environment. But, I suspect the garden variety "C++ coder" is likely to be oblivious to his errors. Possibly *through* development, production and *deployment*! :<

I wonder if I'm just stuck with a "you really MUST hire talented people" problem (in which case, does it really *matter* which approach you take?)

eCos is tiny. I have no familiarity with OK-L4 (is it the successor to "L3"? I don't know what the "OK" means...)

Stay away from Ada. It *really* doesn't seem worth the effort! :<

Inheritance is *way* too heavy a burden on the developer (under C). You just have to remember to do too much "manually".

In looking at the OS itself (my first candidate for rewrite), I don't think I would gain much/anything using inheritance. The objects are too "orthogonal". While there might be *one* common base class that applied to many/all of them, this could easily be implemented through "discipline".

Reply to
D Yuniskis

Thanks! I will look into that!

Reply to
D Yuniskis


Actually what I want to do is not very complicated. The end result I want can be achieved by manually typing the right things in the right places in the source files. What I dislike is that when I add, say, a menu to my user interface, then I have to go back to a different place in the file and list its choices. I've been considering writing a sort of pre-preprocessor to find these "as needed" mentions and move them to a place the compiler will tolerate... but then compiler error locations would be off, and source-level debugging would not reflect the editable sources...

Not sure if I can blame that or not, but it crashed when my alarm clock went off this morning after about a week and a half of uptime... fortunately the "alarm" that continued to sound was a decent audio track

Reply to
Chris Stratton
