I wonder why some developers choose non-free compilers (Keil, IAR, Cosmic, Raisonance, etc.) when targeting architectures supported by the free Small Device C Compiler (SDCC). Answers that mention the architecture as well as the reasons would be particularly helpful.
To develop a non-trivial program you need not only a compiler but also a good debugger or simulator that allows for testing any part of the microcontroller. Most free simulators are either text-based (gdb type) or support only a few parts and a subset of the controller's peripherals. A further reason for choosing a non-free toolchain might be support for older designs created before SDCC was an alternative.
Dipl.-Inform(FH) Peter Heitzer, email@example.com
I rarely use microcontrollers that work with SDCC, but others at the same company have. There are a few reasons, I think, that lead to SDCC not being chosen despite its obvious advantages (cost being the main one). These are not in order, and I don't know how relevant they are in the grand scheme of things.
Manufacturers often recommend Keil or IAR, rarely SDCC.
Demo code is often for Keil or IAR. With these kinds of devices, there are invariably non-standard constructs or extensions, so code can't easily be ported between compilers.
Pre-written code - either within the company, or from outside - is hard to port to SDCC. You usually have to stick with the previous tools.
New developers get familiar with Keil or IAR from university.
There is a perception (I can't say if it is fair or not) that SDCC gives less efficient results than the big name compilers.
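The portability problem described above often comes down to vendor-specific memory-space keywords. As a rough sketch (the macro names `XDATA` and `CODE` are made up for illustration; `__SDCC` and `__C51__` are the predefined macros I believe those compilers set), a compatibility layer can paper over the difference:

```c
/* Compatibility layer for 8051 memory-space keywords.
   Keil C51 uses bare keywords (xdata, code); SDCC uses __xdata, __code.
   On a host compiler the qualifiers simply disappear (CODE maps to
   const so read-only data stays read-only). */
#if defined(__SDCC)
#  define XDATA __xdata
#  define CODE  __code
#elif defined(__C51__)            /* Keil C51 */
#  define XDATA xdata
#  define CODE  code
#else                             /* host compiler: plain C */
#  define XDATA
#  define CODE  const
#endif

XDATA unsigned char rx_buffer[64];              /* external RAM on a real 8051 */
CODE unsigned char crc_table[4] = { 0x00, 0x07, 0x0e, 0x09 };  /* in flash */
```

Demo code written without such a layer, using the bare Keil keywords directly, has to be edited line by line before SDCC will accept it.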
My experience with big commercial toolchains is that this does not always help - often the support people have very little technical experience or knowledge. Maybe I just don't ask stupid enough questions. But I have heard (reliably) of toolchain support departments where the people dedicated to helping with dongles and software license locking problems outnumber the technical toolchain support staff by a factor of 3. Of course there are exceptions - some big name toolchains have excellent support staff.
My experience with open source toolchains is that you don't have a number to call, but you can find good help fast from forums, mailing lists, etc. And you can quickly get in contact with people involved in the development of the toolchains, not just a support monkey that won't listen to your question until you have installed all the Windows service packs and rebooted your PC.
I don't know about SDCC, but for gcc there are several ways to get commercial support - including from companies heavily involved in the development of the toolchains. Since you are paying for the support, not the software, it always has to be good quality.
But none of that contradicts the fact that "there is someone on the phone to help and/or yell at" being a significant reason for people to choose big name commercial toolchains over free and open source solutions.
As pointed out below, "answering your phone call" and "fixing your problem" are two very different things. I don't care about the former. I do care about the latter.
Same here. Over the decades, my experiences with getting questions answered and bugs fixed have been far, far better with open-source tools than with commercial tools. However, that won't stop the anti-open-source people from using "there's no tech support phone number" as a reason to ignore open source tools.
It is indeed a popular reason. It's just not a good reason.
My experience is different, though it wasn't with Keil et al. For instance, one time long ago, I found a fairly horrible linker bug in Microchip C17--it was loading segments misaligned by one byte IIRC. I sent in a support email at lunchtime, and the debugged linker executable was in my email the following morning.
Of course I've had the same sort of thing happen with open source, e.g. the late lamented ZeroBugs debugger for Linux, written by the estimable Christian Vlasceanu. Nice piece of code, that, but he ran out of gas and/or money and got a day job. A pity--it was very nearly as good as the Visualage C++ 3.08 debugger (of song and story).
I expect that the distinction is mainly the size of the teams and the user bases.
I haven't used Keil or IAR, but comparing GCC to SDCC, it seems to me that SDCC is a more primitive compiler. A number of features are absent, and the error diagnostics were often unhelpful. I've used SDCC twice. Once was starting a small project from scratch, which wasn't too bad. I just dealt with issues as they arose. The other was attempting to port a midsized project (around 5K SLOC) from GCC to SDCC. It became clear pretty quickly that getting an SDCC version working at all would be considerable effort, and the resulting binary probably wouldn't fit on the target CPUs I was thinking of.
I'm not at all trying to bag on SDCC since it is obviously useful, but I can understand why people sometimes look for more featureful compilers.
Finally, although both programs mentioned were written in C, GCC can also compile C++, which has some attractions. I don't know if IAR or Keil compile C++. I also remember thinking that it would be interesting to write embedded applications in Ada, and GCC compiles that too. Right now I think there are no non-GCC compilers for Ada 2012 or later.
Both Keil and IAR support both C and C++, according to their webpages. But perhaps you have to pay separately for a C compiler and for a C++ compiler, and probably separately per target architecture, too.
There are some:
The GNAT Ada compiler from AdaCore, which initially used the GCC back-end, will have a variant based on the LLVM back-end. Currently this is still experimental, I believe. This compiler is the most up to date in terms of language features.
ObjectAda from PTC is an Ada 2012 compiler, but I believe it supports only x86 and x86_64 platforms, so not comparable to SDCC.
Janus/Ada from RR Software "supports the complete syntax and selected features of" Ada 2012. However, it lacks a few Ada 95 features, and is currently only available on and for x86 Windows. In the past, it supported also embedded targets such as the Z80.
There may be others; I have not made a thorough survey.
One key point here is that both IAR and Keil have toolchains targeting "big" processors, such as ARM. These are more advanced toolchains, and support C++ (I don't know what standard versions - but I'd be surprised if they were fully up-to-date).
So when comparing features, SDCC features should be compared to those of Keil or IAR for the same target - and I doubt if anyone is using much C++ for the 8051 or 68HC05 processors.
As for Ada, the only "big name" commercial toolchain I know of for Ada is Green Hills, and it is only for ARM, PowerPC, and perhaps a few other
32-bit devices. There is GCC Ada for the 8-bit AVR, though I believe the run-time and library are somewhat incomplete.
There is no doubt that GCC is a vastly more feature-filled project than SDCC. It is a world apart in terms of the languages it supports, the standards it follows, the static error checking, the diagnostics, the optimisations, etc. But while they are both free (and open) compilers, the targets they support are very different.
Until Microchip bought the company, I think basically any substantive question about Hitech C went directly to the founder (whose interesting name I can't recall). They were based at 45 Colebard Street, West Acacia Ridge QLD 4110, next to the Archerfield airport in the south of Brisbane.
There's something about small teams that ensures high-quality responses
- if you can get a response at all. Perhaps the opposite is true for large teams.
I generally dislike proprietary tools, but back in the day, say for the 8051 series, the architecture was so dire that it was hard work to program anything other than trivial projects in assembler. Any vendor that could deliver a reasonably functional C compiler and ICE adapter was on to a winner. Also, for the 8051 series, the Keil toolchain had support for code banking, essential for the limited address space. You just had to hold your nose at the code produced and never look at it, but it did at least work. Later 8051 parts from Silicon Labs and the free toolchain were actually pretty good, but again, just don't look at the asm output.
Now we have the luxury of options, IDEs and debug options, but I still prefer the simplicity of a Makefile environment, with gdb for the odd times when debugging by module testing and inspection isn't enough...
C17 was actually their previous in-house effort, which was so buggy that they bought Hitech and then quietly killed off their own compiler.
I bit the bullet and ported the code over to Hitech a few months later. The C17 series never sold well, I don't think, but you can still get the PIC17C456, twenty-odd years later. Microchip really rocks if you want product longevity.
Wasn't Hitech the small company (that one guy?) from Australia or NZ?
I remember around 2007, plus or minus a year or two, when there was a big Microchip conference up here in the Seattle-Everett area, he came by my workplace and, sitting there in our lab, made some changes to his compiler or helped us with some issue. This was before Hitech was sold, of course. VERY good person, and I can't remember his name either.
Hitech (sp?) here in the UK were agents for Keil compilers, including for the 8-bit 8051 series. We used that for a project around 1999 and it produced consistently working code. Something like 6 x 32K banks, and we never found a serious compiler bug.
You would not want to look at the asm source from it though, typically pages of impenetrable asm just for a hello world...
I have never used Keil's 8051 compiler myself, but I did once help someone who was using it. It turned out to be a compiler issue - the compiler was not correctly handling integer promotion rules. I don't know if it was a bug as such, or an intentional non-conformance aimed at giving users more useful code generation. (After all, the integer promotion rules are often a PITA for 8-bit devices - on a device like the 8051, when 8-bit arithmetic is all you need for a calculation, using
16-bit can take 5 to 10 times as long.)
I've known a number of commercial compilers for embedded systems that break the normal working of the C language in order to give more efficient results or simpler coding for users. That's not necessarily a bad thing - compilers don't have to be conforming - but it's a serious pain when it is the default behaviour and the documentation is poor.
Examples of this include skipping the zeroing of implicitly initialised statically allocated data (i.e., the ".bss" segment) in the name of faster startup, and abusing "const" to mean "put this in the flash address space rather than the ram address space".
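The integer-promotion issue discussed above can be shown in standard C. In the conforming case both `uint8_t` operands are promoted to `int` before the addition; the second function spells out what a hypothetical non-conforming compiler that keeps the arithmetic in 8 bits would effectively compute:

```c
#include <stdint.h>

/* Conforming C: both uint8_t operands of + are promoted to int,
   so 200 + 100 is computed as 300 and the average is 150. */
uint8_t average(uint8_t a, uint8_t b)
{
    return (uint8_t)((a + b) / 2);
}

/* What a non-conforming compiler keeping the arithmetic in 8 bits
   would effectively do: the sum wraps to 44, giving an "average" of 22. */
uint8_t average_8bit(uint8_t a, uint8_t b)
{
    uint8_t sum = (uint8_t)(a + b);   /* 300 wraps modulo 256 to 44 */
    return sum / 2;                   /* 44 / 2 = 22 */
}
```

The two results differ whenever the 8-bit sum overflows, which is exactly the kind of silent divergence that makes poorly documented non-conformance so painful.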
Inertia and "expectations" of support -- can I call someone, today, and get my problem serviced (cuz I can't sit on my hands waiting for someone to "make spare time" to address my needs).
Note that embedded devices differ from desktop applications in that there are often hooks to hardware, interfaces to ASM "helpers", etc. A company may have developed a set of these from other products and wants to just "drop them in" -- without worrying about keywords, pragmas, calling/return conventions, crt0.s, etc.
Finally, one often needs/wants a debugger that "knows about" the rest of the toolchain and any quirks it may have. E.g., I typically hook the "debugger console" with a DEBUG() macro in my code. So, I can see messages like:
Task05: Starting with arguments '123' and 'hello bob'
Task09: Waiting for memory
Task02: Opening output device 'tty03'
Task01: Waiting for user input
Task05: Initialization complete
without having to watch a "memory buffer"
(and not have to reinvent these mechanisms for the next project!)
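A minimal sketch of that DEBUG() idea, under the assumption that the "debugger console" is reachable through a single output routine. Here the messages go into a buffer so they can be inspected; on a real target `debug_printf()` would write to a UART or an ICE/semihosting channel instead:

```c
#include <stdarg.h>
#include <stdio.h>
#include <string.h>

/* Accumulate formatted debug messages; the sink is the only
   target-specific part. */
static char debug_log[512];

static void debug_printf(const char *fmt, ...)
{
    size_t used = strlen(debug_log);
    va_list ap;
    va_start(ap, fmt);
    vsnprintf(debug_log + used, sizeof debug_log - used, fmt, ap);
    va_end(ap);
}

#ifdef NDEBUG
#  define DEBUG(...) ((void)0)              /* compiled out of release builds */
#else
#  define DEBUG(...) debug_printf(__VA_ARGS__)
#endif

/* Usage, mirroring the messages quoted above: */
static void demo(void)
{
    DEBUG("Task%02d: Starting with arguments '%d' and '%s'\n", 5, 123, "hello bob");
    DEBUG("Task%02d: Waiting for memory\n", 9);
}
```

Because the macro collapses to nothing under NDEBUG, the mechanism costs nothing in production builds and carries over unchanged to the next project.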
For 8051, Keil seems to generate better code than SDCC. I am currently doing some work on an old TI CC2511 (8051-core) chip, and tend to run into data size issues because SDCC statically allocates variables for all function parameters. Keil does have better optimizations for that (and probably also a better code generator, but I don't have much experience with Keil).
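The static allocation mentioned above is SDCC's non-reentrant default on mcs51; declaring a function `__reentrant` makes SDCC pass its parameters on the stack instead, trading speed for data space. A sketch (the `REENTRANT` macro is my own, just to keep the same source compilable on a host compiler, which has no such keyword):

```c
/* SDCC's mcs51 default is non-reentrant functions: parameters and
   locals are statically allocated, fast on the 8051 but consuming
   data space for every function.  __reentrant switches a function
   to stack-based parameters. */
#ifdef __SDCC
#  define REENTRANT __reentrant
#else
#  define REENTRANT             /* host compiler: no-op */
#endif

/* Stack-based parameters: slower on the 8051, but recursion-safe
   and with no per-function static data cost. */
int scale(int x, int factor) REENTRANT
{
    return x * factor;
}
```

Marking only the functions that genuinely need reentrancy is a common compromise when data space runs out.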
Also, at work, we have used IAR because TI only supplied binary libraries for the CC2541 for that compiler (we had to get the correct compiler version, the latest-and-greatest would not do).
If I can choose the chip, I tend to choose something that has working gcc support if possible.
Thanks for all the replies, here and elsewhere. Since further replies are now arriving only slowly, I'd like to give a quick summary.
I'll quote just one reply in full, since in just a few lines it illustrates the main points:
"In my case the customer requested SDCC based project but it failed to compile into the small flash size. Debugging was quite difficult. Using the Simplicity Studio and Keil Compiler pairing made the code small enough to fit into the device and made debugging much easier."
The three most-cited reasons not to use SDCC were:
Lack of efficiency of the code generated by SDCC.
Better debug support and integration in non-free toolchains.
Availability of paid support for non-free compilers.
In my opinion, the best way forward from here to make SDCC more competitive vs. non-free compilers is:
0) Improve machine-independent optimizations
1) Improve machine-dependent optimizations for mcs51
2) Improve debug support and integration
3) Find and fix bugs
I'd estimate the total effort at a full-time position for slightly more than a year, though even less effort should allow some improvements.