Well, I use Cygnal - so I guess I am just lucky then :-)
Somehow
Ok, but do you really enjoy writing all that code? I can live with some errors that are not my fault from time to time, but I am certainly not going back to ASM because that would delay all my schedules.
I think the point is to weigh the time spent debugging compiler-originated C language errors against the added time writing ASM code. I think I know the answer, but you may well disagree...
I can say that I know no-one who writes in ASM anymore except for _very_ time-critical routines such as interrupts. Moreover, ASM is platform dependent. What are you going to do if, for example, your manager decides to switch from say 8051 to AVR? I'm just going to rewrite a couple of headers and re-fit some details. You will have to start from scratch, won't you?
Well, I use Keil and that one is ANSI C compliant. And if you need to switch, then choose a compiler that is also ANSI C compliant, or close to it.
"Klaus Vestergaard Kragelund" wrote in news:3ffaaf20$0$9719$ snipped-for-privacy@dread14.news.tele.dk:
Keil C51 has a number of 8051 specific extensions, which will require rework when migrating to a different processor.
I'm aware of that, but isn't it still better than rewriting all the ASM code? If you write your C code in a structured manner, you need to change very little...
How amusing. C, like any other development tool, is just that - a tool. Choosing the right tool for the job is obviously key, and obviously somewhat subjective.
Re optimiser: yeah, yeah (yawn). That's part of the process of choosing the right tool for the job. I routinely read the assembler generated by any compiler. I've frequently made changes to optimisation as a consequence. However, with say the H8 family (less so the 8051) I find that the available compilers do a pretty damn fine job.
As to "larger platforms" - huh? That's just silly. Tell that to the guys I know who use C on the PIC family ;).
Various answers: - Using third-party *anything* leaves you open to dangers. I've had some nasty experiences with 3rd-party libraries. I've occasionally had to work around toolchain misfeatures. But I basically don't proceed until I do trust the tools. - Haven't used a disassembler in >20 years. Most compilers will compile to assembler. You're clearly not so familiar with such tools. - Haven't used an ICE in >15 years. I write defensively and validate rather than debug. I don't "do" bugs.
Your mileage may, and obviously does, vary. If the interpreted approach works for you, lovely. But don't diss the rest of us - I typically work on mission-critical high-reliability high-volume embedded apps, and after
25-odd years of doing so, primarily on small micros, I've looked at and used
*many* approaches. C works for me, on *any* platform.
I'm not the one you asked, but I didn't see where anyone really answered this question...
C is totally, completely, and utterly unsuited for the 8051. Or any other 8-bit micro. And vice versa. (Well, the AVR isn't too bad, but that's a special case: it was developed with the help of a compiler vendor specifically to work nicely with C.)
1) C assumes 16 (or more) bit calculations are efficient. The "int" type is supposed to have the width "natural" to the processor, but must be at least 16 bits wide. Because of the integer promotions, there is _no_ way in C to write an uncast expression containing at least one operator whose result is [un]signed short or [un]signed char.
2) C assumes minimum resources are available. For example, there must be a minimum of 511 identifiers allowed within a block scope. And 127 parameters to a function. And 4095 characters in a string literal. And 1023 members in a structure. Try that with 128 bytes of data RAM.
3) Functions must be able to be called recursively, including main.
4) C has no concept of separate memory spaces, let alone separate RAM spaces (data, idata, xdata). This is bad enough on a von Neumann architecture, but on a Harvard micro (like the 8051) it's devastating.
This is off the top of my head. I'm sure there are others.
The 8051 seems to have been fiendishly designed to make programming in C difficult, inefficient or just impossible. Some of the worst problems are the lack of stack space, access of xdata through dptr only, and of course, the Harvard architecture. The only chip that seems to be worse (disregarding 4-bitters) is the 12- and 14-bit PIC cores (I haven't looked closely at the 16-bit PIC core or the dsPIC yet).
OTOH, Keil (and some of the other vendors and perhaps SDCC as well) has generally done an excellent job with its NQC (Not Quite C) development systems. They're very effective and easy to use, and make efficient use of the 8051 resources.
You're missing an important point, there. Just because Keil *has* those extensions doesn't mean you have to use them all over the place, thus making the code unportable. There'll obviously be some parts of a project where that will be unavoidable, but those would be the ones that are 100% platform-specific by definition, regardless of how you do them.
And even where you do use those extensions, you can still hide them nicely behind some macros, so all it takes to render the code entirely portable is to change the definition of those.
--
Hans-Bernhard Broeker (broeker@physik.rwth-aachen.de)
Even if all the snow were burnt, ashes would remain.
Thanks for adding some more recent experience, Dave, mine is woefully out of date and I'd forgotten a few of those glitches. C was written for Unix, not for 8-bit micros. Although time has extended C to other venues and added new features it remains unsuited for the 8051.
One point, I know you were joking but the 8051 was first designed about 1975 with no thought to C. I think the languages of the day were first Assembler (duh), PL/I, then Basic.
-- Regards, Albert
---------------------------------------------------------------------- AM Research, Inc. The Embedded Systems Experts
Modern compiler technology overcomes the majority of the objections to C running on 8-bit micros and other small processors, or processors with unusual architectures.
It is unfortunate that many of the C compilers that are currently offered for embedded systems were re-targeted from public domain compilers.
C99 brought size-specific integer declarations (<stdint.h>). The as-if implementation rules allow very efficient expression implementations (including 8-bit) on small embedded-system processors.
Those are translation limits for the compiler, not requirements on the target hardware. There is no place in the standards that I know of that defines hardware limits.
There is nothing in embedded micros that prevents recursive calls. Probably what is more important is that there is nothing in the standard that prevents compilers from optimizing out recursive calls if they are not needed. In the same way, there is no reason that every function be called in the same way.
Since the release of C99, ISO has focussed on C standards for embedded systems, to address the specific issues of using C to program them. The work defines support for multiple address spaces, register access, fixed-point data types, and asymmetrical I/O devices. The result is a Technical Report (ISO/IEC WDTR 18037, "Programming languages, their environments and system software interfaces - Extensions for the programming language C to support embedded processors") that is expected to become part of the language in future standard releases.
There are many extensions in Keil C51 which are not the same in other 8051 C compilers. You will have to do some porting between ANY two 8051 compilers, let alone between the 8051 and another target. However, it is MUCH easier than porting assembler, and C is much faster to write and test.
--
Chris Hills, Staffs, England
snipped-for-privacy@phaedsys.org
In 1975 nearly all embedded programming was being done in assembly language, with FORTRAN and PL/M to follow. There were some fairly compact BASIC interpreters; but their performance was generally 'poor' to 'intolerable' for time-critical applications.
I used C on (first) I8080 and (later) Z80 CPUs and had no serious problems - though I was constrained to integer math. Even the primitive 8080 could be programmed to produce 32-bit (or larger) results when needed.
I've never needed to produce code for an 8051 (nearly all of my more recent work has been on 16- and 32-bit SOCs) but I'm amazed to hear that the 8051 has less capability than Intel's old I8080 or Zilog's Z80!
--
Morris Dovey
West Des Moines, Iowa USA
C links at http://www.iedu.com/c
Read my lips: The apple doesn't fall far from the tree.
Why? The 8051 (and its predecessor the 8048) was aimed at much smaller, simpler applications than the 8080/Z80 family.
IIRC, the 8048/8051 were built for Ford. They were intended for very small, simple automotive control applications with a few KB of code and maybe 50-100 bytes of RAM. The 8080/Z80 was a much more "powerful" general-purpose CPU with a stack pointer, 16-bit operations, and everything.
--
Grant Edwards, grante at visi.com
Yow! What I want to find out is -- do parrots know much about Astro-Turf?
Thanks. I'd never really thought of the 8080 or Z80 as being "powerful" engines - and I can't help but grin at your "and everything" comment.
I like C (and consider it to be one of the most powerful development tools ever) but have to admit that for control applications maxing out at a few KB of code and 100 bytes of RAM, I probably wouldn't consider anything other than Verilog or assembly-language solutions.
--
Morris Dovey
West Des Moines, Iowa USA
C links at http://www.iedu.com/c