Embedded platforms for beginners?

You might look at

formatting link

I've seen the 2011 book and it is great.

Hans Boehm also has some good GC pages.

Reply to
Paul Rubin

I'm not ready to think of the Lisp pioneers who developed and used most of today's GC techniques as "script kiddies". Here's what E. W. Dijkstra said about Lisp in his Turing award lecture, all the way back in 1972:

With a few very basic principles at its foundation, it [LISP] has shown a remarkable stability. Besides that, LISP has been the carrier for a considerable number of in a sense our most sophisticated computer applications. LISP has jokingly been described as "the most intelligent way to misuse a computer". I think that description a great compliment because it transmits the full flavour of liberation: it has assisted a number of our most gifted fellow humans in thinking previously impossible thoughts.

There are certainly times when GC imposes unacceptable costs, and maybe those times happen more in embedded programming than in other kinds of programming. But if the hardware can accommodate it and it helps do the job faster, why would anyone not use it?

I worked on a thing a few years ago that had around $50K of software NRE (Python) and used $300 worth of hardware including two ARM processors. By writing it in C maybe we could have spent $10 less per hardware unit, but at least 3x more on programming, plus the time-to-market delay, loss of feature flexibility, etc. We'd have to ship quite a lot of units to even think of switching to that approach. And I think it would have been silly to use C from the beginning, instead of using it for an eventual cost reduction once we had gotten experience from multiple generations of fielded units programmed in Python.

Obviously not all situations are like that. But I think it is getting more and more typical.

Reply to
Paul Rubin

Exactly, when you are talking to people who already know what you are talking about.

--

Rick C 

Viewed the eclipse at Wintercrest Farms, 
on the centerline of totality since 1998
Reply to
rickman

Most people would just Google "memory gc" to educate themselves, rather than bleating about how ignorant they are. "Garbage Collection" is the very first hit on Google. But instead of you spending 30 seconds learning something, a hundred or more people have had to spend 30 seconds reading unnecessary posts.

Clifford Heath.

Reply to
Clifford Heath

This isn't a thread about "most projects" but about how a beginner can learn to program embedded devices. It's pretty important to learn *what memory is* before you learn to let the tools manage it for you.

I've seen smart graduate software engineers who've only ever used IDEs and GC languages, and who *don't really know* what memory is. I find that absolutely alarming. A good feel for the underlying hardware is essential to good results.

Clifford Heath.

Reply to
Clifford Heath

I suspect that type of application uses mostly static memory, so GC wouldn't help that much even if it's available and you trust it. It's invaluable when you have lots of dynamically allocated objects moving around through the system. And yes, programs like that can have very high reliability, such as with phone switches programmed in Erlang with 40 year uptimes.

Reply to
Paul Rubin

Erlang is designed to fault without failing, with close to zero overhead. It's an entire world apart from most other kinds of programming. If it works for you, great!

Reply to
Clifford Heath

I'm just a dabbler in embedded devices, but on bigger computers it stopped being that way years ago. I work with smart productive programmers every day who do all their stuff in Python and Ruby and have never seen C code. Do I find it alarming? Well, it took me a little while to get used to. But they deliver working products that do what the customer wanted, and what else in engineering is supposed to matter?

What I'm coming to believe (your judgment may well be better than mine, at least if you've done it both ways) is that it's gotten the same way with embedded devices. A coin-size MCU board that appears as a USB flash drive where you drag and drop a Python program sounds about as beginner-friendly as it gets. Another approach (recommended by Rickman) is Forth, which in some ways is similar in spirit. But the value of a self-contained system needing no extra boards and stuff is very high.
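For a feel of that workflow, here is roughly what the dropped-in program can look like (a minimal sketch assuming a MicroPython-style board; the pin number and the machine module details vary by board):

# main.py - copied onto the board's USB drive, runs after reset
from machine import Pin
import time

led = Pin(25, Pin.OUT)     # pin 25 is just an example; use your board's LED pin

while True:
    led.value(1)
    time.sleep(0.5)
    led.value(0)
    time.sleep(0.5)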

Reply to
Paul Rubin

Exactly. I see dozens of unhelpful abbreviations in newsgroups every day. They waste people's time, and in this case it wasn't just mine. The usual rule in documents is to spell out an abbreviation the first time it is used; it would be a good rule for threads, too.

--

Rick C 

Viewed the eclipse at Wintercrest Farms, 
on the centerline of totality since 1998
Reply to
rickman

This industry is characterized by lots of jargon and acronyms. The most desirable quality in selecting a new hire is that they show a propensity and enthusiasm for finding things out by themselves, rather than asking other people.

By the way, how's the job hunt going? Keep going, you're sure to make it some year.

Reply to
Clifford Heath

A big-ass phone switch is _not_ my idea of an embedded system.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
ElectroOptical Innovations LLC 
Optics, Electro-optics, Photonics, Analog Electronics 

160 North State Road #203 
Briarcliff Manor NY 10510 

hobbs at electrooptical dot net 
http://electrooptical.net
Reply to
Phil Hobbs

Obviously you don't understand newsgroups if you think this is a job interview. This is a discussion group. Many people come from different backgrounds, with different levels of familiarity with the jargon of any particular micro-area of knowledge. If you want to be able to communicate, you won't express every thought with the minimum number of keystrokes. The term is garbage collection. It's not that hard to type. Like *many* other abbreviations, it is often not worth using, since it just isn't that much harder to type the name.

--

Rick C 

Viewed the eclipse at Wintercrest Farms, 
on the centerline of totality since 1998
Reply to
rickman

No disagreement there, but the boring stuff is becoming more commonplace. By "boring" I'm thinking of fancy bitmapped GUIs and USB/networking/etc., and GC is more likely to creep in through that kind of bling.

As for MCU/memory not being a cost driver, that means they no longer preclude GC.

Reply to
Tom Gardner

Typical "definitions" of "embedded system" run along the lines of "An embedded system is a computer system with a dedicated function within a larger mechanical or electrical system, often with real-time computing constraints."

So it is reasonable to argue that telco stuff /is/ an embedded system. Doubly so when you look at how bits of software from different companies are cobbled and patch-panelled together.

Reply to
Tom Gardner

Garbage collection certainly has its advantages - but equally it is not a magic wand that solves all memory problems, nor is it the right tool in all situations.

True.

And we also tend to avoid dynamic memory as much as possible!

On PC's, I usually write in Python - and the automatic memory management is one of the nice things about the language. In that context, I absolutely agree that automatic memory management and garbage collection is the right way to go.

My point is just that there are other situations, other types of programming - and other solutions. The website there was written like a marketing campaign rather than an unbiased reference. As well as dismissing C-style memory management out of hand, it described C++'s methods as "even harder" with no basis whatsoever (RAII and synchronous destructors make memory management far easier) and pretty much ignored smart pointers altogether.

It also ignores the problems of garbage collection. In particular, it is very easy to get circular references in complex structures, and it is easy to get memory leaks in garbage collected systems. Garbage collection makes the issues of memory management different - it is not a magic fix-all solution. It lets you avoid the micro-managing and write simpler code, but it does not fix everything. It just teaches a generation of programmers that it does not matter if your Python program leaks 100K of memory an hour - that's only a GB a year, and everyone has gigabytes to spare.
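As a sketch of the kind of leak GC does not prevent (an illustrative example, not from the original post - the names are made up): the collector only reclaims objects that are unreachable, so anything a forgotten reference still points at stays alive for the life of the process.

# Hypothetical example of a leak in a garbage-collected language.
# Nothing here is "lost" to the collector - the entries are simply still reachable.
_cache = {}

def handle_request(request_id, payload):
    result = payload.upper()        # stand-in for real work
    _cache[request_id] = result     # cached "for later", but never evicted
    return result

# Every call adds an entry that remains reachable forever, so memory
# grows steadily even though the GC is working exactly as designed.
for i in range(100_000):
    handle_request(i, "some payload")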

They do indeed - but that is a process that is mostly independent of garbage collection. The only direct connection is that by making your clean-up happen asynchronously and behind the scenes, you have lost control of where your tidy-up is done. With C++, your file will be closed by the file object's destructor. That means your code that /uses/ the file object does not need to have the close file logic, but you know the file will be properly closed when the using function exits by return or exception. With a garbage collected language, you have the choice of manually putting in the "close" call in your using function, carefully catering for try/except/finally blocks to make sure it is handled in exceptions, or you put it in the object's finaliser to be run eventually, at an unknown time in an unknown order.

For Python (as the gc'ed language I use most), you just assume it will all be tidied up sooner or later. It works because garbage collection runs regularly so it won't be /too/ long before the destructor is called.
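To make those options concrete (a minimal Python sketch, assuming a plain file on disk; not from the original post): the deterministic choice is an explicit with-block, which plays roughly the role the C++ destructor does.

# Relying on the garbage collector: the file is closed "eventually",
# whenever the object happens to be finalised - timing and order are not guaranteed.
def log_lazily(msg):
    f = open("app.log", "a")
    f.write(msg + "\n")
    # no close() here - cleanup is left to the finaliser

# Deterministic cleanup: the with-block closes the file on exit,
# whether the block ends normally or via an exception - much like RAII.
def log_deterministically(msg):
    with open("app.log", "a") as f:
        f.write(msg + "\n")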

I certainly think that most people who program in C should be using other languages, and most programs that are written in C would be better written in other languages. (Embedded development is a major exception, though even here there are alternatives.) And the memory management in C is a big reason for that opinion.

But telling people that garbage collection is always the best choice and that it solves all memory and resource problems is not the answer. It's like saying that if you buy a car with automatic gears, collision detection with automatic braking, etc., then you don't need to learn to drive properly.

Reply to
David Brown

Agreed.

Anyone working in C with malloc/free for code that does a lot of string work is working in the wrong language. Even though you can hide the mallocs and frees in wrappers that make them easier to track, it is still immensely painful - at best, you are using function-call syntax instead of infix operators and writing code that is hard to follow.

Garbage collection is /not/ needed for this - though it is certainly one way to do it. C++'s model works fine:

#include <iostream>
#include <string>

using namespace std::string_literals;

int main(void)
{
    auto s = "Equation: "s;        // a std::string, via the string literal suffix
    s += "E = m*c^2";              // infix concatenation, no manual malloc/free
    std::cout << s << '\n';        // s releases its memory when it goes out of scope
}

Reply to
David Brown

The primary thing that precludes GC in my embedded projects is me.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
ElectroOptical Innovations LLC 
Optics, Electro-optics, Photonics, Analog Electronics 

160 North State Road #203 
Briarcliff Manor NY 10510 

hobbs at electrooptical dot net 
http://electrooptical.net
Reply to
Phil Hobbs

And that's an incontestable reason :)

Personally I've never included GC in my embedded projects (although I use it everywhere else). But my reason is different: I'm interested in the /hard/ realtime behaviour.

That frequently makes it difficult for me to include things like caches and interrupts. Fortunately there /are/ alternatives.

Reply to
Tom Gardner

My main reason for not using garbage collection is that my code does not generate garbage :-)

Usually in that kind of project, you are avoiding dynamic memory and not using any kind of malloc/free equivalent.
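A sketch of what that looks like in practice (illustrative MicroPython-flavoured Python, assuming a UART-like object with readinto(); the names are made up): everything is allocated once at startup, and the steady-state loop only reuses it.

# All buffers are allocated up front; nothing is created in the main loop,
# so there is no garbage to collect and no free() to forget.
RX_BUF = bytearray(256)     # fixed-size receive buffer, filled in place
SAMPLES = [0] * 128         # fixed-size sample store

def poll(uart, index):
    # readinto() fills the existing buffer instead of allocating a new one
    n = uart.readinto(RX_BUF)
    if n:
        SAMPLES[index % len(SAMPLES)] = RX_BUF[0]
    return index + 1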

Reply to
David Brown

Lucky you; usually my code /is/ garbage :)

Agreed, although I don't object to using malloc during startup.

I've always liked (and usually been able to achieve) almost complete separation between the hard realtime code and the "other stuff". Frequently the "other stuff" is coded in whatever is convenient, which usually includes GC.

Reply to
Tom Gardner
