LPC900/80C51 Compiler Toolchain

No, I didn't.

I bet it doesn't do it that well

Absolutely not. Many commercial companies have looked at both, so they know how far *behind* GCC and SDCC are, particularly in the case of SDCC for the 8051.

Yes. Which includes all the targets you have mentioned so far. Are you suggesting that GCC and SDCC are written for MCUs that have no market?

Usually because the particular MCU has been discontinued.

Yes, but compilers aren't one of those niches.

--
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills  Staffs  England     /\/\/\/\/
/\/\/ chris@phaedsys.org      www.phaedsys.org \/\/\
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Reply to
Chris Hills

Yes it is! It's just that the business model of OSS is totally different. With proprietary tools, you usually purchase the compiler (anywhere from cheap to expensive) and then pay a relatively low cost for support and annual upgrades. In the open source world, you get the compiler free, but support is paid for at the going rate. Richard Stallman charges in dollars per minute, according to a page I once browsed.

Other models include the software rental model, which one of our most successful clients uses. This model brings in continuous development money and permits "small and often" upgrades. Another model, e.g. the one used by Rowley and others for ARM, is to take the gcc compiler and then provide added value in terms of IDEs and libraries focussed on embedded systems, not to mention a real installer.

There are lots of software business models, and use of OSS in business is just based on one that the conventional commercial toolmakers do not use. Once you realise that all software developers have to earn a living, it's just a question of looking for the business model they use.

Stephen

--
Stephen Pelc, stephenXXX@mpeforth.com
MicroProcessor Engineering Ltd - More Real, Less Time
133 Hill Lane, Southampton SO15 5AF, England
tel: +44 (0)23 8063 1441, fax: +44 (0)23 8033 9691
web: http://www.mpeforth.com - free VFX Forth downloads
Reply to
Stephen Pelc

And you know this...how?

No offense intended, but you have a long posting history of claiming that Open Source Software is inferior to commercial software, and your website appears to be that of a vendor selling commercial software; thus reading your opinions is a lot like asking a realtor whether this is a good time to buy a house. Do you have any references or examples that support the claim you made above? I am not saying you are wrong; I just want to test the claim myself as best I can.

(Disclaimer: I personally have no use for anything called a "Small Device C Compiler for 8051." I program 8051s in Assembly and Forth, as God intended us to do when he gave the 8051 design spec to Moses. I have nothing against those folks who like PDP-8 assembly lang... er, I mean C, but I personally prefer Forth and Assembly.)

Reply to
Guy Macon

1. I have a copy of it here.
2. Benchmarks (which can of course be used to (dis)prove anything :-)
3. I know what it doesn't do that some of the commercial compilers do.

Yes, I sell tools. Therefore my opinion is as good as that of any user of FOSS who promotes FOSS. Why is it assumed that any person who sells commercial SW is unduly biased, but anyone who promotes FOSS is not?

I have generally found the reverse to be more true. I have to work in engineering reality, not some utopian belief. I got cynical early.

However, over many years I have discussed this in private with many commercial compiler developers and code-analysis developers, and seen the results of their tests, so I have been able to form a good picture. Most of what I have seen or been able to examine is under NDA.

The problem is that FOSS software is just that: open. Any development is seen by all the commercial tool developers, but the reverse is not true. So anything FOSS has, the commercial people also have, and therefore FOSS cannot be more advanced. At best it can only be equal.

On top of that, the commercial developers get access to the silicon companies and to sophisticated development tools and test suites. How many GCC or SDCC builds have been run against Plum Hall or Perennial?

Rigorous test suites can cost more resources to develop than the compiler itself.

There are techniques used in the commercial tools that are not used in GCC or SDCC and the like. To maintain their advantage, the commercial tool companies are not going to tell FOSS developers what these are. Though in some cases, like DATA overlaying on the 8051, knowing what is done is of little help: there is a lot of effort involved in data overlaying and other forms of optimisation.

There is mention above that SDCC handles the architecture of the 8051... WHICH architecture? There are several. Having worked for an ICE manufacturer, I know there are over 40 different cores and, I think, about 10 different timing models spread over the 600-odd variants.

Just compare the variants SDCC supports with, say, the Keil PK51... Incidentally, the Keil suite uses, under the IDE, two different compilers and linkers to support the full range of 8051s.

Some parts use more than one memory model. I recall the fun when Philips brought out the MX range... an 8051 with 256K contiguous linear memory (internal, external, data, etc.). This caused compiler companies a lot of..... opportunities. However, they worked directly with Philips for many months before the part was launched, so they could discuss the mechanisms inside the chip, and how they would work, in a way the FOSS developers can't.

So add together what the commercial compilers get:
- a full view of the SDCC and/or GCC compilers and how they work
- full testing with industry-standard test suites
- full co-operation of the silicon vendors
- resources for sophisticated development and test tools
- additional in-house techniques
- a lot of in-house knowledge

The FOSS compilers get nothing.

Recently someone challenged Byte Craft on one of their claims, saying it was not possible. They showed it was possible but did not explain exactly how they did it. The basic books on compiler theory take a lot of reading and understanding. The problem is that the commercial tools companies have invested heavily in advancing the science; they just have not published. Only FOSS does that. So any advances FOSS makes, the commercial guys get, but not vice versa.

Then there are the customers. When you deal with the safety-critical world, you find the customers run their own tests. This also gives a good indication of what compilers can do. However, again, these are usually under NDAs.

FOSS devotees will have to get over it. ALL their stuff is open in a world where no one else plays by those rules. It is like playing bridge as a team of one against three, where you always have the disclosed hand.

Reply to
Chris Hills

... snip ...

That is not so, because most open-source is released under the GPL or an equivalent license. Copying such is outright theft. Using it, and releasing source, is not.

... snip ...

I suspect you are referring to my quizzing Andrew on a (mis)statement he made on compiler checking. He cleared that up. It was NOT an explanation of how he did it. Your claim above is a gross distortion.

--
cbfalconer at maineline dot net
Reply to
CBFalconer

Looking at the source, you understand what it is doing. You can then use the same idea in your own system... I didn't mean you literally copy the source. In any event, SW is not patentable.

Andrew who?

OK then, can someone PROVE that FOSS compilers are more advanced than, or *at least* as good as, commercial ones?

Reply to
Chris Hills

I want to remain an observer of this debate; however, I am curious... what is this advancement buying you? Considering the law of diminishing returns, what does last year's advancement get me? If it's an extra few microseconds per control loop, and this is important to me, then I chose the wrong processor, not the wrong compiler.

--
Regards,
Richard.

+ http://www.FreeRTOS.org
A free real time kernel for 8, 16 and 32bit systems.

+ http://www.SafeRTOS.com
An IEC 61508 certified real time kernel for safety related systems.
Reply to
FreeRTOS.org

You should tell the USPTO that. And the EPO too, while you're at it. And the hundreds of developers who found themselves threatened with lawsuits over the LZW patent.

Software is explicitly declared non-patentable in the documents establishing the European Patent Office. Yet the GIF patent prevailed.

Not unless we all agree on a quantifiable measure of "advanced" or "good" first. Which, of course, is the core of the problem.

Proprietary tools cause certain risks to their users, primarily by tying your project's long-term survival to that of the tool vendor, possibly extending to a hostage-taking kind of situation. This is a risk that, once expressed as a financial risk factor, may forbid using the tool.

It's practically impossible for GCC to ever stop working. Proprietary compilers can do that, some did, and some will do it in the future.

Reply to
Hans-Bernhard Bröker

In article , FreeRTOS.org writes

Yes, much better code density and execution speed. I have seen several programs that the Keil 2/4/8K-limited compilers could get to run on a '51 that the unlimited SDCC could not.

The problem is when they want to add "just one more feature" without changing the whole design. For example, smart cards, mobile SIMs and many other things. Especially when, by law, you need to add something to an old system, or to change some IO because there are new sensors.

Ideally you would scrap the whole system and start again rather than add a small change to an end-of-life product.

The other problem is that there are new extended 8051 family members: the ones with 8MB address space in both code and data areas, internal (on-chip) and external data space, and all sorts of things in the new parts that the better commercial compilers will cope with but the FOSS ones don't.

I do find it strange that people argue so strongly for using second-rate tools in their profession. What would you think of a doctor, dentist or aeronautical engineer who argued the same?

Reply to
Chris Hills

I've not seen it yet for compilers but I have seen it for PCB CAD software.

Case 1 - Product (PCB CAD) had a HW dongle and refused to work when the HW was updated. The product had been abandoned at that point (I seem to remember takeovers being involved). A fix for new HW/OS was not available.

Case 2 - This one a compiler. Abandoned by the original company. Thankfully there was no HW or SW dongle, so it could continue to be used through multiple OS and HW updates.

Unfortunately, compiler companies have moved to SW dongles (particularly FlexLm (spit! Ack!)), so case 2 looks like an increasingly doubtful precedent to rely on.

Bottom line: micros with a GCC port will get used before micros without. If a micro has no open-source compiler of reasonable (best not necessary) quality and no undongled commercial SW, then it has to be a lot better/cheaper to be worth considering. Inexpensive ARM7s provide a large hurdle for any proprietary micro with no GCC port to clear.

Since I've seen compilers add dongling after purchase, even undongled compilers get second billing.

Robert

--
Posted via a free Usenet account from http://www.teranews.com
Reply to
Robert Adsett

Umm, how about the Keil ARM compiler, now defunct? I'm just being pedantic here; I don't actually see this as relevant. Switching from one compiler to another is usually a very simple exercise.

I don't have too much experience in the 8051 market, but I am asked a lot by customers about ARM compilers. Once I go through the various pros and cons of each, there is one thing that normally swings the decision, and this one thing is not actually a technical attribute. Officially I'm compiler-neutral, so I'm not going to say what the one thing is :o)

I know of cases of this in the OS market. Could not comment on the compiler market.

Reply to
FreeRTOS.org

I would normally expect a doctor, dentist or aeronautical engineer to use "second rate" tools when appropriate. I expect a doctor to use a very different level of quality in the tools used for brain surgery and the tools used to sew up a small cut in a finger.

What people here are arguing for is not that we should prefer low quality tools, but that we should prefer *appropriate* quality tools. We need tools that are good enough for the job - beyond that, you can compare on things like price, availability, support, additional features, and whatever else interests you.

As Richard says, what do these advancements give you? If I have an 8051 with 8K flash, and SDCC compiles my program to 4K, then what benefit is a Keil compiler that compiles it to 2.5K? It's a different matter if the figures are 10K for SDCC and 7K for Keil. In the first case SDCC is good enough, and therefore a perfectly viable choice; in the second case, it is *not* good enough.

Reply to
David Brown

Lower EMI and power consumption to name two benefits.

Walter..

Reply to
Walter Banks

In article , Walter Banks writes

Also, the program may still not run with the SDCC compiler: code size is only one component on a Harvard architecture.

The usual problem with SDCC is running out of DATA space long before the CODE space limit of the Keil compiler becomes a problem. I keep saying this. It is important.

So your SDCC compiler produces 4K of CODE but still cannot get the DATA to fit. What then?

If Keil fits twice as much code into the space as SDCC, you will run out of CODE space very much faster using SDCC. (Code ALWAYS expands :-) What then?

SDCC is OK if you are using a standard 8051 with a small program, with no data, power or EMI limitations, that will not expand or need to be ported to another 8051. The program has to be simple... what are you going to debug it with? AFAIK SDCC has no simulator/debugger.

Chris

Reply to
Chris Hills

Yes, I know those are other benefits of having faster programs, but that's irrelevant here. My point is that if SDCC (or some other free or low-cost tool) does a good enough job for what you need, then is there any reason for buying something more expensive and more advanced? The answer is, of course, no. It seems, in this thread, that Chris Hills is having a great deal of difficulty in understanding the concept of "good enough", and fails to understand why anybody would ever be happy with a tool that is not the absolute best available solution.

I *know* there are many advantages in a compiler that produces smaller and faster code, but often that is irrelevant. I *know* there are advantages in having a compiler that has been through all sorts of industry-standard torture tests, but that too can be irrelevant. I *know* there are advantages in a compiler that is easy to install, or easy to work with, but again, that may not matter (and it's a subjective issue anyway).

"The best is the enemy of the good" (Voltaire, if google got it right.)

Reply to
David Brown

I'm with you on this one. In 2006 the latest GCC compiler was not able to match the code size or performance of an ARM compiler that was released in 1995. The only target GCC is good at is x86, partly because it gets a lot more attention than any other target, partly because most of the optimizations happen in hardware anyway.

I don't believe commercial companies look closely at GCC source code. Many companies have strict policies that stop people from even looking at open source code to avoid accidentally copying stuff. However any good compiler expert only needs to look at the generated code in order to "borrow" optimizations.

Full access to silicon/FPGA and specs is indeed essential for producing a well-tuned compiler. I don't think that is a barrier to open source; there are many well-respected open source companies that would be given access to these things (and I know that happens). I wouldn't call the commercial test suites very good; most compiler teams develop their own test suites, as none of the commercial ones are good enough to test a compiler.

Quite possibly.

As I said above, one doesn't need to know what the techniques are; it's relatively easy to deduce them from the generated code. Interestingly, open source compilers use more modern techniques than commercial compilers (which were created in the 1980s). However, technology alone doesn't make a good compiler; you need to know how to use it. The main difference between a good compiler and a mediocre one is the amount of effort that has gone into fine-tuning it.

And this is where I think commercial development has the advantage. Compiler vendors find and pay the best people to do whatever it takes to make the compiler as good as possible (with the idea that this large investment will pay itself off later). I don't see this kind of dedication in open source compilers, as developers usually don't get paid for their work and so don't attract the best people. The ARM port of GCC for example was neglected for years (with Thumb not working at all) until ARM paid for it to be fixed and brought up to date with the latest architectures.

That said, I don't believe one or the other is inherently better. I would be happy to put a few years into turning GCC into a really good compiler that beats most commercial ones. However, who is going to pay me my consulting rate? Big companies with their own compilers... I'm currently in Seattle :-)

Wilco

Reply to
wilco.dijkstra

This is nonsense. I think you'll find that most of the GCC developers are being paid to work on GCC.

Just look at the GCC steering committee:

Next, you may want to look at the list of contributors:

While GCC work *may* be done by anyone, serious development and maintenance of this cornerstone of Free Software is mostly done by paid, skilled professionals whose employers understand the value of GCC.

Demand drives GCC just like demand drives the commercial compilers.

--
Michael N. Moran           (h) 770 516 7918
5009 Old Field Ct.         (c) 678 521 5460
Kennesaw, GA, USA 30144    http://mnmoran.org

"So often times it happens, that we live our lives in chains
  and we never even know we have the key."
The Eagles, "Already Gone"

The Beatles were wrong: 1 & 1 & 1 is 1
Reply to
Michael N. Moran

Thanks. How come, if you and I (and Walter) can see it, no one else can? This is my argument: FOSS devotees are blinded by their "religion".

Now the FOSS people seem to argue the opposite. They say that with closed-source commercial compilers no one really cares about code standards, because the outside world can't see the code, but that because open source can be seen by all, far more care is taken....

However they brought it up to date and left it there... which is not the same as actively supporting it.

I know that several companies have put an engineer onto GCC for 5-10 days "for fun"* to see what they could do with it, and all managed to get huge increases in code density and speed (not on x86, though).

* Actually, in all cases it was to see how competitive GCC *might* be if it were properly developed.

However, much FOSS is effectively becoming commercial now anyway. The only difference is that the core programmers don't get paid, yet the FOSS devotees love it.

Turkeys voting for Christmas.

Reply to
Chris Hills

Like Linux, GCC is now commercial.

The problem is that the FOSS devotees think that everyone who does any work on GCC is a devotee. There are many cynical non-believers who are not helping in order to help the faith... they have other objectives and don't care if FOSS sinks or swims.


Reply to
Chris Hills

I was under the impression - possibly mistakenly - that CodeSourcery were the official guardians of ARM GCC, amongst other ports. Cortex-M3 support has been added very recently.

Reply to
FreeRTOS.org
