"modern" C++ and microcontrollers: match made in heaven?

"Conventional wisdom" says that one should always stick to languages like C and ASM when writing code for "resource constrained" processors like the AVR and el cheapo ARMs.

My somewhat-educated opinion is that it's not true. At least, not anymore.

Instead of just bloating the code, the constructs available in the recent variants of C++ compilers (C++11 and later) allow you to unleash the full fury of these little processors.

"Modern" C++ offers high levels of abstraction and straightforward resource management, while remaining a performance beast.

Reply to
bitrex

Well, if you admire abstraction. In an embedded application, I'd rather have everything in plain sight.

--

John Larkin         Highland Technology, Inc 

lunatic fringe electronics
Reply to
John Larkin

I have to wonder... is bitrex as unskilled at programming as he is at small-signal AC analysis ?>:-} ...Jim Thompson

--
| James E.Thompson                                 |    mens     | 
| Analog Innovations                               |     et      | 
| Analog/Mixed-Signal ASIC's and Discrete Systems  |    manus    | 
| STV, Queen Creek, AZ 85142    Skype: skypeanalog |             | 
| Voice:(480)460-2350  Fax: Available upon request |  Brass Rat  | 
| E-mail Icon at http://www.analog-innovations.com |    1962     |
Reply to
Jim Thompson

Jim, stop being a jerk.

--

John Larkin         Highland Technology, Inc 
picosecond timing   precision measurement  

jlarkin att highlandtechnology dott com 
http://www.highlandtechnology.com
Reply to
John Larkin

You apparently haven't been following the "Interesting negative capacitance circuit" thread ?? ...Jim Thompson

Reply to
Jim Thompson

His prose is a bit overblown, but if what I've read recently is any indication then yes, C++ is getting slim and trim enough at least for 8K/1K flash/RAM processors.

A single application that small wouldn't drive me to using C++, but if I had a product line with a lot of code that I wanted to re-use I'd certainly consider C++ in such a situation.

--
Tim Wescott 
Control systems, embedded software and circuit design 
I'm looking for work!  See my website if you're interested 
http://www.wescottdesign.com
Reply to
Tim Wescott

It depends. I just don't think you need C++ in a really small processor -- C++ is a good tool to manage large, complicated programs; by definition anything that fits into 4K or 8K of flash isn't large.

Moreover, C++ makes it really easy to add functionality, and functionality means more flash used. So you'd be building a slippery slope into your development process.

Having said that, if you had a lot of code that you wanted to easily reuse across products, then C++ on small processors might make sense.

Reply to
Tim Wescott

For such a small MCU application, I'd have thought that C++ wouldn't offer sufficient advantages over C.

Reply to
Tom Gardner

The "Tiny" series of AVRs have a max of 8k or so, but I think it's getting to be that the only advantage of those little devices with no hardware multiply, etc. is in applications that require ultra-low power.

A MegaAVR usually has like 32k of program memory, and even the dumpiest $2 Cortexes often have 128k.

GCC is pretty tight, you can fit a lot of stuff in there!

Reply to
bitrex

Enter test-driven development: if the code that forms the basis of your abstraction passes all the required unit tests, and can be verified "correct" from the perspective of your chosen static analysis tools, then it's unlikely to be the source of your problems higher up the chain.

Of course I'm not talking about life-critical applications here; I'm sure they have an entirely different set of standards (which Toyota's engineers for the 2000-something Camry apparently didn't care about).

Ever taken a look at the codebase for the humble std::cout? Jeez!

Reply to
bitrex

That's kind of my feeling. I go in a progression: bare assembly is just fine if there's 256 bytes or less of flash, C starts looking mandatory at around 1K, and you don't really NEED C++ until you get up to 32K or so.

--

Tim Wescott 
Wescott Design Services 
http://www.wescottdesign.com 

I'm looking for work -- see my website!
Reply to
Tim Wescott

The last time I looked was about a year ago. At that point the sub-$1 Cortexes had around 8K of flash or less.

And, if you only need to do a few things it doesn't matter how much flash there is -- you don't have to fill it.

Reply to
Tim Wescott

It would be an interesting exercise to write a spec for some fairly straightforward but non-trivial requirement for an 8k 8-bit uP, implement it in both C99 and "modern C++", and see what the differences are.

Reply to
bitrex

The Arduino development environment makes it very attractive to write C programs, rather than use assembly language, etc. But even if one's controller program grows to thousands of lines, the full capability of C may be barely put to use.

I wonder what role cout has in an environment where there's no conventional screen -- just a 128x64 bit-mapped OLED display, a font library limiting me to 14 chars x 5 lines, and a few other display modes, etc.

--
 Thanks, 
    - Win
Reply to
Winfield Hill

The problem isn't the *language* but, rather, the meatware *using* the language.

With the (potentially) greater levels of abstraction afforded by a language tends to come an equivalent dissociation from the underlying machine and *its* capabilities -- that's the whole point of greater abstraction!

Folks who step into the problem domain at that higher level of abstraction are at risk of doing things that have significant, often unforeseen (by *them*!) consequences. With the speed at which you can "turn the development crank" now, you can see *some* of the consequences of those decisions ("Gee, I just added one line of code and my binary grew by 20KB!").

But, not always *all*.

A good designer can put a foundation in place that fits the design criteria perfectly -- yet an "average" *programmer* can come along and be lulled by the high levels of abstraction into thinking it's a benign development environment; that he *knows* the changes requested (by Manglement) *will* be easy to implement -- until he realizes the subtleties of the original design that make *his* solution a lot less obvious!

E.g., I use recursive algorithms pretty heavily. They're typically more elegant and conceptually more efficient -- less prone to implementation errors.

In a resource constrained environment, these can bite you in the ass when you least expect it. And, potentially only in certain test cases (will The-Next-Guy know how to choose those test cases, appropriately?).

So, I design the algorithms such that the application data inherently limits the depth of recursion and *document* this in the accompanying design documents (with "Here, There be Dragons" in the code commentary).

I can't force The-Next-Guy to be competent... or, diligent... Nor should I have to cripple myself by using a design approach fit for "Joe Average PROGRAMMER" to ensure the product's continued viability. But, Joe Hopefully Competent (though not "stellar") should be cognizant enough of the issues to understand my warning and rationale in the documentation.

Joe ITT Programmer is probably full of himself and thinks he knows all there is to know about C++ (or any other higher level abstraction language) and wouldn't understand the warning if it was tattooed across his knuckles (and thus visible AS he was writing code -- ANY code)!

And, when things didn't work as expected, he'd most probably fall into the "let's try this" method of debugging -- ACTING instead of THINKING ("sooner or later I'll stumble on something that works...").

Reply to
Don Y

You may have looked in the wrong place.

The cheapest Cortex M0 processors with 32kB of Flash go for less than 40 cents in volume.

Cheers

Klaus

Reply to
klaus.kragelund

For a small MCU application, C++ needs a more competent programmer than plain old C, as seemingly innocent abstractions can create surprising bloat in the code.

Besides, there should be some kind of heap handling to use objects in a natural way, but IMHO, heap handling is a disaster waiting to happen in a small embedded system. It is much better to find out that the thing does not fit during the development phase, not in the field.

--

-TV
Reply to
Tauno Voipio

Of course. But, that's true of every language/platform. How many times have you stumbled across strtok() in some code? How many times have you seen errno examined?? :>

Rewrite the support functions so they write to your "display device" and make whatever "sense" of the character stream that seems most appropriate (e.g., ^G almost always rings an annunciator in my designs -- even if there is no AUDIBLE annunciator!)

Or, use it to "plumb" between processes, etc.

E.g., I've been using a UNIX-ish process model for 35 years wherein each "task/thread" (as I don't often have the luxury of full-fledged processes) is spawned with its own stdin/stdout/stderr descriptors (the parent task mapping them to whatever is appropriate FOR THAT CHILD!).

So, stderr tends to be routed to a black box ("flight recorder") in production and "The Emulator's Console" during development (so I can see a list of diagnostic/error messages on my PC's display while the code is executing). Stdin may be wired to "/dev/keyboard" (even though there's no formal filesystem, I can choose to name the keyboard device something that is suggestive of its function). Stdout might be wired to a UART's Tx channel. etc.

I designed a product with a 7x95 (dot) LCD module. Basically, just a single line of text. And, if you used a boring 7x6 typeface, you were limited to 16 characters.

OTOH, implement a proportional width font and now you can choose your prompts to maximize display effectiveness (lots of lowercase L's and I's take up less space than A's and O's!). Let the display device (i.e., the code that "catches" the character stream sent out stdout) decode control sequences and you can display graphics, make individual "pixels" blink, etc.

And, you can do it with very little code and WITHOUT resorting to cryptic function calls (begin_dot_mode(), blink_on(), move_to(x,y), etc.)

You just need to identify the proper *abstractions* for your application.

Reply to
Don Y

For writing text to a display, why not abstract the OLED display such that you can pipe text to it from an output stream in an equivalent fashion as you would a terminal?

In the fashion of the Arduino environment:

MyDisplay* display;

void setup() {
    // (width, height, lines, chars_per_line)
    display = new MyDisplay(128, 64, 5, 14);
}

void loop() {
    // stream text to it just as you would to a terminal
    *display << "Hello, world\n";
}

Reply to
bitrex

Yes. One must learn to accept generic algorithm techniques, and template metaprogramming.

It used to scare me too. One must experience the power of the Dark Side...

Reply to
bitrex
