Embedded platforms for beginners?

Well, given that dynamic allocation has no place in small systems either, GC is a bit of a non-issue. ;)

Until it's been running long enough that the heap is fragmented into tiny, tiny bits.
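To make that failure mode concrete, here is a toy first-fit allocator simulation (all names and sizes invented for illustration; this is not how any particular malloc works): after alternating allocations and frees, half the heap is free, yet no large request can be satisfied because no free run is contiguous.

```c
#include <assert.h>

/* Toy heap: HEAP_UNITS cells, each holding the id of the block that
 * occupies it, or 0 if free.  First-fit allocation, no compaction. */
#define HEAP_UNITS 64
static int heap[HEAP_UNITS];

static int toy_alloc(int id, int units) {      /* returns start index, or -1 */
    for (int i = 0; i + units <= HEAP_UNITS; i++) {
        int j = 0;
        while (j < units && heap[i + j] == 0) j++;
        if (j == units) {                      /* found a free run big enough */
            for (j = 0; j < units; j++) heap[i + j] = id;
            return i;
        }
        i += j;                                /* skip past the occupied cell */
    }
    return -1;                                 /* no contiguous run available */
}

static void toy_free(int id) {
    for (int i = 0; i < HEAP_UNITS; i++)
        if (heap[i] == id) heap[i] = 0;
}

static int free_units(void) {
    int n = 0;
    for (int i = 0; i < HEAP_UNITS; i++) n += (heap[i] == 0);
    return n;
}
```

Fill the heap with eight 8-unit blocks, free every other one, and a 16-unit request fails even though 32 units are free.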

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
Reply to
Phil Hobbs

Modern generational GC compacts the heap quite effectively. The only real difficulties are (a) doing that unobtrusively and (b) being certain that you won't ever run out of memory due to leaks.

Reply to
Clifford Heath

I assume that you haven't got a lot of aliased pointers running round your code!

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
Reply to
Phil Hobbs

Aren't #2-#4 the same with Arduino and Mbed? Does that super low level stuff really matter for most embedded programming any more? Does OP have reasonable experience at desktop-level programming? I'd say if not, then embedded in general isn't a good place to start. Everything is easier on desktop systems.

Reply to
Paul Rubin

Generational GC relies on being able to move things between heaps, so object handles cannot be pointers (or rather, must be indirect). Languages that do it don't allow you to see the pointer as a number. ... another reason why you can't get the most out of the hardware.

Reply to
Clifford Heath

At least among language geeks, it's definitely commonly used.

Reply to
Paul Rubin

The handles are usually still pointers. First you copy the object to the other heap, then later you scan it, copy the objects that the first objects points to, and update the pointers in the first object to point to the new locations. This is described pretty well in the SICP chapter on metacircular evaluators (mitpress.mit.edu/sicp).
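A minimal sketch of that scheme: a toy two-space stop-and-copy (Cheney-style) collector over cons cells. Names are invented for illustration, and a real collector would also handle tagging, heap overflow, and so on; the point is just the copy-then-scan-and-update order described above.

```c
#include <assert.h>
#include <stddef.h>

#define CELLS 128

typedef struct cell {
    struct cell *car, *cdr;    /* NULL = end of list in this toy */
    struct cell *forward;      /* non-NULL: already copied to to-space */
    int value;                 /* payload, so we can check survival */
} cell;

static cell space_a[CELLS], space_b[CELLS];
static cell *from_space = space_a, *to_space = space_b;
static size_t next_free = 0;

/* Copy one cell to to-space, leaving a forwarding pointer behind. */
static cell *gc_copy(cell *c, size_t *alloc_ptr) {
    if (c == NULL) return NULL;
    if (c->forward) return c->forward;          /* already moved */
    cell *n = &to_space[(*alloc_ptr)++];
    *n = *c;
    n->forward = NULL;
    c->forward = n;
    return n;
}

/* Collect with a single root; returns the root's new location. */
static cell *gc(cell *root) {
    size_t alloc = 0, scan = 0;
    cell *new_root = gc_copy(root, &alloc);
    while (scan < alloc) {                      /* Cheney scan of to-space */
        cell *c = &to_space[scan++];
        c->car = gc_copy(c->car, &alloc);       /* copy children, fix pointers */
        c->cdr = gc_copy(c->cdr, &alloc);
    }
    cell *t = from_space;                       /* flip the two spaces */
    from_space = to_space;
    to_space = t;
    next_free = alloc;
    return new_root;
}

static cell *cons(int value, cell *car, cell *cdr) {
    cell *c = &from_space[next_free++];         /* toy: no overflow check */
    c->car = car; c->cdr = cdr; c->forward = NULL; c->value = value;
    return c;
}
```

Build a three-cell list plus one unreachable cell, collect, and only the live list survives, at new addresses, with its internal pointers updated.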

Reply to
Paul Rubin

MicroPython works with some contortions on the BBC Micro:bit, which has 16k of RAM, and much more nicely on the recent Adafruit SAMD21 boards, which have 32k of RAM. I don't know how much Luanode uses.

Camelforth/4e4th (non-gc) runs ok on the MSP430G2553 which has 16k of flash and 512 bytes of ram. That might be a good way to start, or maybe Amforth on the AVR-based Arduinos. Who really cares about these any more though, except for legacy products? Everything is 32 bit now.

I bought an Adafruit Gemma M0 board last week and had hoped to try it out this weekend, but have been distracted by other things. Real soon now though. If it's what it's cracked up to be, it basically puts the nail into the Arduino.

Reply to
Paul Rubin

Lots of things could do that. None of them have. The problem is, too many could; which to choose? I'm not supporting Arduino, I'm just saying that the best solution does not necessarily win mindshare. The hardware is almost immaterial; most people go where there's community.

Reply to
Clifford Heath

Paul,

Please, why don't you copy/paste all the content from formatting link. There might be some people here who have not read it all yet. Sigh... Yes, there is a whole domain of science here. Far, far more than you have reminded us of yet.

No, it's *still* not the way to get the best from most embedded hardware. It's an acceptable compromise, some of the time, is all that can really be said.

Reply to
Clifford Heath

"Small" is getting larger all the time :)

Nowadays it isn't completely ridiculous to consider having GC on a small system - although I've never done it.

If you roll your own general purpose GC then at best you will be reinventing a wheel. In almost all cases such a wheel will be /far/ from circular. Even special purpose GCs can be a pig to get working.

C/C++ as a language is, of course, a very bad starting point for a general purpose GC. Boehm made heroic efforts, and due to having to make pessimising assumptions, only managed to get it "frequently correct".

Reply to
Tom Gardner

That is /one/ way to handle it. There are many others.

For example, when you have pools of memory that have blocks of the same size, you might use an array and have indexes into that array as your handles.

Or you might have something that is basically an address, but specifically manipulated in order to make it easily distinguishable to a garbage collector, and to avoid user code accidentally using it as a pointer. This could be as simple as adding an offset to put the value outside the valid range in the device - software bugs would then cause hard faults that can quickly be identified.
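The fixed-size-pool approach, for instance, might look like this sketch (names and sizes are hypothetical): user code only ever holds a small-integer handle, and exactly one function turns a handle into an address, so the allocator stays free to know where everything lives.

```c
#include <assert.h>
#include <stdint.h>

#define POOL_BLOCKS 16

typedef uint8_t handle_t;          /* 0 = invalid, 1..POOL_BLOCKS = blocks */
#define H_NULL ((handle_t)0)

static struct { int payload; uint8_t in_use; } pool[POOL_BLOCKS];

static handle_t pool_alloc(void) {             /* first-fit over the pool */
    for (int i = 0; i < POOL_BLOCKS; i++)
        if (!pool[i].in_use) { pool[i].in_use = 1; return (handle_t)(i + 1); }
    return H_NULL;
}

static void pool_free(handle_t h) {
    if (h != H_NULL) pool[h - 1].in_use = 0;
}

/* The only place a raw pointer is ever manufactured from a handle. */
static int *pool_deref(handle_t h) {
    return h == H_NULL ? 0 : &pool[h - 1].payload;
}
```

Because a handle is never a valid address, accidentally using one as a pointer faults immediately rather than corrupting memory silently.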

Reply to
David Brown

I had a little look at that site. It had some useful explanations, but is /seriously/ biased. It gives the impression that the only reason you would /not/ want garbage collection is if you want your program to be bloated, slow, inefficient, waste memory - and leak resources, have buffer overflows and other nasal daemons.

Reply to
David Brown

Phil,

I agree, but the question is whether this is really such an issue for a beginner.

Let's keep in mind that the Arduino project started out at an art university, to allow people to create interactive art (like a painting that changes color when you move your hand over it).

For that purpose -as this is a tool to be used by artists, not engineers- its main goal is to hide the complexity of low-level hardware interfacing in libraries, ... and it does that very well.

I agree, once you start writing code that uses interrupt service routines, timers, DMA channels, etc., debugging with just print commands soon hits its limits, ... but that's not what a beginner does, is it?

My advice usually is: start with one of these Arduino starter kits to get your feet wet, and -once you have done that- you can then decide on what your next step will be: native AVR + gcc, mbed, STM32 + libopencm3, LPCxpresso (indeed, also very nice), PIC, MSP430, ...

Or you can go "up" in the hardware stack and opt for (say) MicroPython on an STM32F4 (I think the ideal tool for fast development when talking to some new SPI/I2C/CAN-bus chip), or go down in the hardware stack: FPGAs/VHDL/Verilog/MyHDL/...

Cheerio! Kr. Bonne.

Reply to
kristoff

Unfortunately, the pro-GC bias is one that is firmly rooted in reality.

In this forum we get a distorted view because the general skill level of embedded developers is relatively high.

In the broader programming world, the average skill level is just slightly above "script kiddy". The headlong march to "managed" languages such as Java, C#, Python, etc. is industry acknowledging that the average programmer simply can't write a complex program without hand-holding.

And it isn't just memory management ... the runtimes of these languages also - in whole or in part - manage files, network connections, threads, synchronize object access, etc.

C isn't even being taught in a lot of schools now, so even those who have a (relatively) recent CS/IS degree may have no experience of programming that requires manual resource management.

All in all, general software development is a sorry state of affairs that is getting worse with every passing year. I have stated in the past my belief that the majority of software "developers" would be doing the world a favor by finding another occupation.

YMMV, George

Reply to
George Neuner

I used to write a lot of assembly. Then I wrote a lot of C. These days I write a lot of Python. And you know what? I get more done faster with fewer bugs.

Just the same as going from ASM to C, there are things that the machine is just better at doing than you are. Here and there, yeah, I can sit there and hand-tweak better ASM than the C compiler will give me. But for an entire project of any scope? Better to let the optimizing compiler do 90% of the job over 100% of the project than for me to do 100% of the job over only 20% of the project.

GC is the same thing. Think how complicated the simple function "create a new string from the concatenation of two existing strings" becomes when you start having to worry about malloc/free. Either: A) the function has to accept a pre-allocated buffer, in which case you've moved all the complexity of adding strlens and not getting off by one out of the function, making the function pointless, or B) the function does its own malloc, which you've now hidden under the hood while still forcing the caller to remember to free it at some point.
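Both options, sketched in C (helper names are made up for illustration):

```c
#include <stdlib.h>
#include <string.h>

/* Option A: caller supplies the buffer and must size it correctly. */
static int concat_into(char *dst, size_t dstlen, const char *a, const char *b) {
    size_t need = strlen(a) + strlen(b) + 1;   /* the off-by-one trap: +1 */
    if (need > dstlen) return -1;              /* caller sized it wrong */
    strcpy(dst, a);
    strcat(dst, b);
    return 0;
}

/* Option B: the function mallocs; the caller must remember to free. */
static char *concat_alloc(const char *a, const char *b) {
    char *s = malloc(strlen(a) + strlen(b) + 1);
    if (s) { strcpy(s, a); strcat(s, b); }
    return s;
}
```

Either way, the sizing arithmetic or the ownership obligation leaks out to every call site; with GC both disappear.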

With GC if you need the new string, you magically create it on the heap, and then you walk away, comfortable in the knowledge that it'll get taken care of.

Having tools to make the routine parts of the job easier so that you can concentrate on the larger problem is a good thing.

--
Rob Gaddi, Highland Technology -- www.highlandtechnology.com 
Email address domain is currently out of order.  See above to fix.
Reply to
Rob Gaddi

...

Yes indeed. I don't see your statements as being contentious.

Does GC solve all problems? No. (But it does solve many)

Does GC have disadvantages? Yes. (But not as many as some people like to believe)

Are there situations where GC is contra-indicated? Yes. (But that's boring, since it is true of all technologies)

Reply to
Tom Gardner

Well, if nanny won't let you play with sharp objects, I guess it's hard to carve the beef. ;)

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
Reply to
Phil Hobbs

I've never felt the need for GC in an embedded system myself, and I'm sufficiently averse to midnight phone calls that I don't use dynamic allocation there either.

There are probably situations where it's helpful, but I'm an instrument builder, and the MCU and memory are very rarely the cost driver. Field failures--even transient ones--are what cost money, not hardware. Having said that, I don't gold-plate the hardware either. It usually wouldn't matter to the customer, but one does have one's professional standards. ;)

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
Reply to
Phil Hobbs

Sure, that's absolutely true, for big hardware as well as embedded, but we live in an era where "getting the best from the hardware" isn't an issue for most projects these days. It's been like that on desktops for decades, and it's now getting like that even for small embedded stuff.

Right now I'm building what amounts to a modified LED blinky. In the old days that would have been a transistor circuit or a 555 timer. Or it could straightforwardly be done with the smallest of PIC 10Fxxxx's or whatever. But instead I'm using the Gemma M0 board that I mentioned, a 32-bit ARM CPU programmed in Python, just because it's cheap enough and it seems like the easiest approach.

Sure if I was making millions of them I'd have to go down the cost reduction road, but I'm making just a handful. I'm sure the same is true of most things built by contributors here. Being able to get down to the machine level used to be the starting point of embedded programming, but now I'd consider it further down the path.

Reply to
Paul Rubin
