LPC900/80C51 Compiler Toolchain

Many RISC architectures *do* support push and pop, through register indirect with automatic pre/post increment/decrement modes, though they are not too fussed about which register is called "SP". But pipeline stalls are likely if you have several "push" or "pop" instructions in a row.
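
In C terms the idea is easy to show - push and pop fall straight out of pre-decrement and post-increment addressing on whatever register you nominate as the stack pointer. A minimal sketch (the array, its size and the helper names are made up for illustration):

/* push/pop synthesized from auto-decrement/increment addressing;
   "*--sp" and "*sp++" are exactly what such machines provide.      */
static unsigned int stack_area[64];
static unsigned int *sp = &stack_area[64];   /* full, descending stack */

static inline void push(unsigned int x) { *--sp = x; }    /* store, pre-decrement  */
static inline unsigned int pop(void)    { return *sp++; } /* load, post-increment  */

/* Back-to-back push()/pop() calls all depend on the same pointer
   register, which is where the pipeline stalls come from.          */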

That all depends on whether you want to have a separate return stack and data stack, or a combined stack. There are advantages in each - a data stack (indexed by Y) gives you (Y + index) addressing modes, but effectively locks the Y register and complicates stack arrangements since every task needs two individual stacks. A combined stack avoids these issues, but requires Y to be set up as a frame pointer for functions with data on the stack (not as much of a problem as it might seem, as the AVR has plenty of registers for local data).
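
A rough illustration of the combined-stack case, written from my understanding of how avr-gcc behaves (the function and the prologue sketch below are illustrative, not verbatim compiler output):

/* A function whose locals spill out of registers, forcing a frame.
   With a combined stack, Y (r28:r29) is set up as the frame pointer
   because it is the pair with the "ldd/std Rd, Y+q" displacement mode. */
unsigned char checksum(const unsigned char *src)
{
    unsigned char buf[16];          /* too large for registers */
    unsigned char i, sum = 0;

    for (i = 0; i < 16; i++)
        buf[i] = src[i] ^ 0x5A;
    for (i = 0; i < 16; i++)
        sum += buf[i];
    return sum;
}
/* Assumed prologue shape:
     push r28 / push r29           ; save caller's Y
     in   r28,SPL  /  in r29,SPH   ; Y = SP
     sbiw r28,16                   ; reserve buf[]
     write Y back to SPL/SPH       ; commit the new stack pointer
   buf[] is then addressed as ldd/std ..., Y+1..Y+16, and everything
   else stays in the register file.                                   */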

No, "optimisation" is about getting the best possible code (faster, smaller, or whatever other definition of "best" you are using). Look up the word in a dictionary. Thus a good optimiser will use a frame pointer or not, a fixed frame size or a variable frame size, according to the combination that generates the best code for the task in hand.

Reply to
David Brown

I have been using sdcc with lpc900 microcontrollers for a few months now and it works great! You can quickly try sdcc and see if it works for you. Just get the latest snapshot and don't use the include files of commercial compilers (such as Keil), as their definition of the special function registers (SFRs) is different. Fortunately, sdcc comes with an include file for the LPC932!
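
For anyone trying the switch, the incompatibility is in how the special function registers are declared. A rough comparison, using the standard 8051 P0/P1 addresses (the LPC932-specific register names will differ - check the header sdcc ships):

/* Keil C51 style (what vendor headers typically contain):
     sfr  P0   = 0x80;
     sbit P1_0 = P1 ^ 0;                                             */

/* SDCC (mcs51 port) style for the same registers: */
__sfr  __at (0x80) P0;       /* SFR at internal address 0x80         */
__sbit __at (0x90) P1_0;     /* bit 0 of the bit-addressable P1      */

void toggle_led(void)
{
    P1_0 = !P1_0;            /* application code is identical once the
                                declarations are in SDCC form         */
}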

Reply to
sensei141

In article , sensei141 writes

Not compared to a commercial compiler.

Why would you use their (copyrighted) files anyway? Incidentally the Keil SFR names will be those used by the silicon company as Keil works with the silicon companies when they design the chips.

So it should... How many of the other 600-odd 8051s does it support? Does it also support memory models other than the standard one?

BTW, does SDCC do DATA overlaying? If not, it is not really worth using on the 8051s.
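
For readers who haven't met the term: the 8051 has only 128 or 256 bytes of internal DATA RAM and no practical C stack frames, so locals and parameters of non-reentrant functions live at fixed addresses. DATA overlaying means the linker walks the call tree and lets functions that can never be active at the same time share those bytes. A hedged sketch of the idea (function names invented for the example):

static unsigned char scale(unsigned char raw)
{
    unsigned char tmp = raw >> 2;    /* placed at some fixed DATA byte  */
    return tmp + 1;
}

static unsigned char filter(unsigned char raw)
{
    unsigned char acc = raw ^ 0x55;  /* scale() and filter() never call
                                        one another, so an overlaying
                                        linker may give 'acc' the same
                                        DATA byte as 'tmp'              */
    return acc;
}

unsigned char process(unsigned char raw)
{
    return scale(raw) + filter(raw);
}

/* Without overlaying, every function needs its own private bytes for
   its locals, and the 128-byte DATA space runs out very quickly.     */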

Then use the (FREE eval) Keil compiler

It's not, compared to the commercial ones.

--
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills  Staffs  England     /\/\/\/\/
/\/\/ chris@phaedsys.org      www.phaedsys.org \/\/\
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Reply to
Chris Hills

Have been wary about getting into this, but it irritates me that the commercial vendors are quite happy to steal new ideas from open source at no cost to improve their product, but then have the chutzpah to suggest that gcc is inferior. Neither are they willing to disclose their own improvements that might contribute generally to the state of the art. Of course, in the commercial world IP matters, but probably most of the really good new ideas in software in the last decade or so have come out of open source or academia, so what have the commercial vendors got to offer that makes them so good? After seeing how Linux has bloated over the past few years, have had a few things to say about open source myself, but the gcc suite is one exception that disproves any general criticism.

It's no good saying they use advanced techniques without at least giving some idea of why I should pay $k's over something that works and that I can download and build for free. For example, I still use (old) gcc 2.7.2 for 68k work. An unpretentious compiler that I'm familiar with; it produces acceptable code and I have yet to catch it out, even with the most complex data structures and constructs etc. Have not found a show-stopper bug in several years of use, which is more than I can say about some commercial products. I think the commercial vendors are at a very serious disadvantage now. Tool development is such a labour-intensive and skilled process, with a shrinking market, that they have to charge high prices just to stay in business, never mind keep up with the state of the art. A highly unstable market, with thousands of devices and variants to support and guessing which will be successful and sell volume. Not a business I would want to be in at all.

It's not just the compiler either - the gcc suite has an excellent set of binary utilities, linker, make, and archiver/librarian etc., which blend together seamlessly in a unix environment to produce a really top-class development suite. Again, much more than can be said for some of the commercial vendors. If you want a good laugh, have a look at the utilities associated with earlier versions of the IAR H8 compiler and the command line options. Even Keil C wasn't that brilliant in terms of added value, and that's a compiler that I've used extensively and respect quite a lot, even if the code it produces looks horrendous at times.

So Chris, what's the unique selling point? Have failed to be convinced thus far...

Chris

--

----------------------
Greenfield Designs Ltd
Electronic and Embedded System Design
Oxford, England
(44) 1865 750 681
Reply to
ChrisQuayle

FUD of course. If you buy a $xxxx product, when the project crashes, you have your arse covered. If you recommended cutting costs, it will be seized on by everyone else as an excuse for failure, and you will find yourself climbing the pyramid steps to face the jade knife.

It's a problem endemic in British industry. Every project gets gold plated, layer upon layer, so much so that a relatively simple north-to-south cross-London rail line renewal, costed at £625 million 10 years ago, is now the flagship of the government's new rail "policy" at £5.5 BILLION. In the end, everything is too expensive and nothing gets done.

Paul Burke

Reply to
Paul Burke

Many Linux *distributions* have bloated - Linux itself has not bloated (it's got bigger and better, but not bloated). There are plenty of open source projects that have suffered from "bloat", and plenty of distros that have grown huge, but the great thing about open source systems is that you mix and match what you need, and can get big, fancy, powerful software ("bloated" software) or small, lean and dedicated software.

(If you want to try a newer version, go to formatting link. They maintain the gcc m68k port, and there have been a fair number of improvements over the years, especially for ColdFire parts.)

There are some situations where commercial developers have advantages over open source developers - it is often easier to get restricted information (advance information, or extra technical details) from the microcontroller manufacturers. Even for those manufacturers which directly support gcc ports, there can be restrictions with some PHB wanting to keep details secret, which therefore cannot end up in open source code.

There is also currently only one serious, powerful open source compiler system - gcc. So for architectures that don't fit gcc's models, there are currently no top-quality open source alternatives (sdcc is, as far as I understand it, a perfectly reasonable compiler - but it is not a top-ranking 8-bit compiler in the same way that gcc is for many 32-bit targets).

The other thing that commercial developers can do that open source developers typically do not is spend time and money on everything *around* the compiler - documentation, installation, IDEs, debuggers, dedicated support teams to help you out when the software license locking system has lost its keys, etc. These sorts of things take time and effort, and are often not the most interesting parts of the job - it's fine if you are paid to do it, but for many open source developers and their employers, margins are smaller and it's harder to find the time and money to spend on less essential parts.

As I see it, there is plenty of place for both commercial and open source development tools (and both high-price and "cheap and cheerful" price commercial tools) at the moment. What is interesting is the changes for newer architectures - steadily more manufacturers are going straight for a gcc port for newer 32-bit architectures, rather than the more traditional approach of working closely with a commercial developer. The development process for these ports is often not very open (they work behind closed doors, and release binaries and corresponding source files in lumps), making it somewhat of a middle road.

Many commercial toolsets are now using open source utilities, as well they should - for most parts not directly related to code generation, common open source versions of the tools are far better than anything anyone else makes. A good example is the Quartus design suite for Altera FPGAs - the compilation and routing parts are all closed source from Altera, but the glue that holds it together is cygwin, tcl, and perl - all open source versions. Their soft cpu is generated with closed-source programs, using tcl and perl, with Eclipse for the IDE and gcc for compilation.

Reply to
David Brown

This is true.... there is a lot of NDA-type work that goes on for months before the release of parts. I have been involved in one or two.

The Silicon company was working with the compiler company before even the registers were fully named (or mapped out)

However it is not really suited to the smaller end of the market.

It is missing some important features that make it unsuitable for a lot of development

Yes. Absolutely

What license locking? You assume much. I sell several commercial compilers that have no locking on them at all. Most SW these days has activation/locking.

So they only do the bits they like or make money on?

I agree.

This is incorrect. They do work with the commercial compiler vendors. However they usually do a port of gcc so there is a free tool so there is a low risk/cost entry point to trying out the new silicon. It is nothing to do with a belief in FOSS.

Absolutely. Also the gcc port is usually created but not supported. It is just there, as-is, so there is a free/low-risk way of evaluating the new silicon.

Some

Why? Many used to make a nice side line income doing utilities.

This is not true... I have a customer whose use of a FOSS utility cost them over 10K GBP in time and effort sorting out the problem.

Seems a sensible mix but note Altera is a HW company... to them software is simply a marketing tool.

All the people who you mention who have "embraced" open source are HW companies who are selling HW and value SW as a free giveaway...

Open source values the programmer at zero, and I have seen nothing to change this view. In fact it is getting worse.

--
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills  Staffs  England     /\/\/\/\/
/\/\/ chris@phaedsys.org      www.phaedsys.org \/\/\
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Reply to
Chris Hills

Put the world to rights indeed :-). If you want to have a rather offbeat take on all this and similar stuff, look at:

formatting link

and start with the Programmers Stone section. Apologies if everyone else has seen this already.

Other sections of the site are interesting as well - Brain dead, dopamine addicted society courtesy of government social engineering etc.

Deserves a wider audience imho, despite the completely off the wall parts and follows on from where Brooks ended...

Chris

--

----------------------
Greenfield Designs Ltd
Electronic and Embedded System Design
Oxford, England
(44) 1865 750 681
Reply to
ChrisQuayle

The reason for 2.7.2 was originally because it was a late enough revision for the 68k port to be stable and the whole suite was less cluttered by ports to other, later processors. Also, Tru64 unix wasn't supported as a target and some of the very late gcc and library versions wouldn't build at all, whereas earlier versions built fine. Really can't fault it though and have done some of my best ever work using those tools. Apart from the fact that the tools just work, a key advantage is that future projects using other processors will have a head start in terms of knowledge and code base. Now have a whole set of templates for make files, linker scripts, debugging, various home rolled utilities etc, that will transfer with only minor change and expense to a new target.

What gcc and open source in general have done is to make it possible for small shops, one-man companies, etc. to have access to some of the best tools available, and am grateful for that. The leverage effect gcc has had on the ability to generate large, robust software systems and on the advancement of the computing art is hard to overestimate imho. Without a good compiler and infrastructure, there is no software, and now everyone can play :-).

There may be some advantages, but didn't someone else say that ARM put a lot of cash into gcc development to make it work properly? If so, one would assume that there was quite a lot of input from ARM so that key processor attributes were properly catered for.

The way I see it, the current producer/consumer relationship for commercial tools is one of mistrust, a pita. All the flexlm and dongle crap is basically telling you that they don't trust you and they think you will steal the software to run on several machines concurrently. No mainstream PC software vendor would get away with that now, so why should we put up with it for embedded tools? We bought Keil C some years ago for a project and (the client) paid extra for an undongled version, not so we could steal it, but so that it could be installed at client and development sites. I was the only developer, so only a single copy would ever be used at once. This I would consider fair use within the license terms, though I'm sure some vendors would disagree.

I dunno; overall, commercial compilers will eventually die in the 16/32/64-bit arena without a revised business model. They may hold on in the 8-bit market, perhaps, but there's starting to be quite a bit of open source activity there as well...

Chris

--

----------------------
Greenfield Designs Ltd
Electronic and Embedded System Design
Oxford, England
(44) 1865 750 681
Reply to
ChrisQuayle

Sometimes open source programmers can get access to this sort of information, sometimes not (for example, I'm sure that the Atmel employees involved in avr-gcc development have access to non-public information, and I know that a couple of msp430-gcc developers have NDAs for access to debugger details, which are implemented in a small closed-source program). There are also situations where the open source development is at least partly behind closed doors (with the source released when it is finished) - development companies with this approach have little problem getting advance information. But in general, it can be easier to get secret information if you stick to closed source development.

Correct - it is a better match for bigger devices (although the avr-gcc port is solid despite a few mis-matches with gcc's expectations of a processor, and the msp430 port is fine).

I don't know enough details to comment here - I've never actually used an 8051.

Most software these days does not have locking or activation. Many types of software have some sort of registration or require a license key or number during installation, but few have any sort of locking (by which I mean some sort of check or validation before each run). The tools used in embedded development (both for software and hardware development) are unusual in this respect, in that locking of some sort is very common for commercial tools.

Any serious embedded developer can tell you horror stories of fights with licenses, ranging from broken hardware dongles, lost licenses after hard disk crashes or changing network cards, confusions over licensing policies resulting in wasted time and money, long waits for license codes, issues when transferring the software to another computer, and other such problems. Some commercial software suppliers do a good job of minimising these issues, others can be a real pain, and the inconvenience is part of the price you have to pay for commercial software.

It depends on who they are, and how they work. A volunteer coder working in his free time is likely to work on things that interest him. Others might contribute changes that improve the software for their own use while being unable to justify the time or money needed to make changes that don't benefit them or their employer. But there are other developers who are paid to work on open source projects - they will work on whatever their employers tell them to work on, just like any other paid programmer.

There are many different models for making money out of open source development, with some similarities and some differences from models for making money out of commercial software. That leads to different priorities for what people spend their time and money on.

I did not say it was anything to do with "belief in FOSS". As an example, the AVR32 is supported by a gcc port written by or for Atmel. I would be very surprised if this is because Atmel has altruistic beliefs about open source - it is a purely pragmatic economic decision. I presume (I have no insider information - this is all guesswork) they knew their chip was useless without a high quality compiler, so they looked at how they could get such a compiler quickly and cheaply (for them), providing a compiler that is solid and cheap for developers, and which would be a good match for the sorts of software they expect to be used on the devices. The answer is a gcc port.

That of course does not mean that Atmel did not work with commercial vendors as well - IAR apparently makes an AVR32 compiler, although Atmel's website does not mention it under "Tools and Software". Clearly it is in Atmel's interest for there to be alternative toolsets. But there is little doubt that gcc is the "official" toolset for the AVR32 in Atmel's eyes, not just a cheapo alternative for hobby users.

Ask Atmel (who employ some key avr-gcc developers, and who continually develop the AVR32 compiler, including ready-to-install versions for several Linux distributions), or Xilinx or Altera (who continually develop their gcc ports for their soft processors), whether they view it as a cheap way for customers to evaluate their processors.

Maybe they used to, but why would anyone want to pay extra for a "make" utility? Why would any commercial developer want to pay its staff to write an editor, or a scripting language, when there are perfectly good ones available freely, written by experts at that kind of programming (which is very different from the area of expertise of the compiler writers)?

Perhaps I should have added a "generally" or "usually" in there somewhere - there is always going to be someone somewhere that has problems. I had a customer who wasted over 10K GBP as a result of stupid licensing complications with a commercial embedded development toolkit - I don't expect you to suddenly think open source tools are always the better choice as a result of that single sample point.

Altera (and Xilinx, and others) spend more money on software development than hardware development. Software is the key to FPGA development - squeeze 10% more out of the chips by an improved placement algorithm, and you've gained 10% over your competition for *all* your customers, and *all* your devices, at zero cost per part. So Altera spends a huge amount of time and money getting its programmers to do what they do best - making software for FPGA development. It does not waste any time or money on making "glue" software to hold it together - they use what is freely available.

I was specifically giving examples of hardware companies which develop and support open source software.

There are some companies that make money directly from selling nothing but open source software, but most see open source as a way to make money in some other way - selling support, selling other software alongside the open source software, selling hardware, selling subscriptions for early-release versions of the software, selling pre-packaged software, and that sort of thing.

The programmers who develop open source software for a living, along with their employers, would be surprised to hear that.

There are certainly some areas where commercial software is becoming less competitive in the face of open source, and therefore some programmers will be out of work or have to change the way they make their living. But it's just like any other sort of competition - for the most part, the end-users get more choices and lower prices, while top-dog suppliers are no longer able to charge such huge margins. And just like any other sort of competition, it has its risks and casualties - too vicious competition can lead to a drop in quality due to price wars, and the middle-men often suffer or are made obsolete - that may be a good thing or a bad thing (obviously for you, as a reseller of commercial software, it's a bad thing - for end users, it can be good due to lower prices, or bad due to the loss of useful resources).

Reply to
David Brown

There are certainly differences between versions - as the compiler (and the C language) have developed, you'll always get situations where code doesn't re-compile cleanly. That's why it's seldom a good idea to change the toolset for working projects.

I too see that as a major benefit of using gcc - I use it on a half dozen different architectures.

Cost of the tools can certainly be an issue for many users. Others will tell you that the cost of even top-range commercial tools is insignificant compared to the cost of the programmer, but there are certainly exceptions. In your case, being a small one-man shop, the outlay for top-price tools would be a large chunk of your budget. In my case (we are small, but not that small), a big issue is that I use a fair number of different architectures - if I only spend 10% of my time working with a particular device, then the tools are effectively 10 times as expensive when amortized over time. The similarity between gcc ports, and thus the lower learning curve, is an additional benefit (I know some of the large commercial developers support many architectures).

Some commercial vendors I have come across do see things in this light - I've had some that have happily emailed me an extra license so that I could run the software on my home PC as well as the office one. These sorts of vendors assume you are honest - the software will work fine for a certain time without any licenses, or after license problems, giving you plenty of time to talk to them and buy a license or fix the problem. But there are others that, as you say, start with the assumption that all their customers are potential thieves - not a good way to start a business relationship.

I don't know that commercial 32-bit compiler development will die, but it will certainly change (and has been changing). It will certainly get harder to write competitive tools for new architectures - maintaining tools for existing architectures is much less demanding.

Reply to
David Brown

With countless of us programmers being paid handsomely to work with open source software, this is simply an admission that you don't understand the open source business models. The folks who write our pay cheques clearly value open source programming well above zero.

Steve

Reply to
steve_schefter

I think that most gcc development is by paid staff. (And the linux kernel too, probably). And many vendors give away limited versions of tools. So I am not sure what Chris' point is here.

--

John Devereux
Reply to
John Devereux

I think that is why companies like CodeSourcery charge so much for a commercial license for gcc. One gets support, a binary that has been checked, etc., but IMO the high price is to stop people automatically pointing fingers at gcc if something goes wrong. If one had paid big bucks, then obviously it must be good. If one had downloaded the code for free, then obviously it is suspect.

[Snipped]

Anton Erasmus

Reply to
Anton Erasmus

Bingo. It's certainly led me to strongly prefer architectures with freely available open source tools.

Commercial tools have a real hurdle to overcome: being better isn't good enough. The minimum requirement is significantly better. Generally, good enough is good enough.

MS often appears to be headed in that direction with their quality assurance checks.

Even for SW they no longer actively sell.

Robert

Reply to
Robert Adsett

In article , Robert Adsett writes

Why not move the dongle between the two computers? As you were the only user, I can't see the problem.

--
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills  Staffs  England     /\/\/\/\/
/\/\/ chris@phaedsys.org      www.phaedsys.org \/\/\
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Reply to
Chris Hills

... snip ...

It seems that making a policy of never buying anything dongle-protected is going to be much more reliable. When did you last move a dongle from an XT parallel printer port to a modern Dell, for example?

--
"Vista is finally secure from hacking. No one is going to 'hack'
 the product activation and try and steal the o/s. Anyone smart
 enough to do so is also smart enough not to want to bother."
Reply to
CBFalconer

Chris, it's not just the problems with dongles and flexlm etc. that irritate. It's the business relationship where I am being asked to shell out thousands and put my trust in the vendor to provide timely and accurate support, while at the same time the dongles etc. tell me that they don't trust me. It's for this reason that I refuse to buy into it, even if it means using a different processor and toolchain. After all, there's plenty of choice out there these days.

That's quite apart from the dongle and flexlm hassles. If your machine or flexlm server crashes, it can take a lot of time to get everything working again, with obvious impact on timescales if a team of 6 is sitting idle. Have had this experience while working on more than one client site, where flexlm files have become confused for whatever reason and the tool vendor has had a very bad attitude when called to resolve the problems. The clients were not one-man outfits either, but major names in the networking and telco fields. "Who do these people think they are?", "never use them again", etc., was the general reaction in the dev team.

The ideal situation would be no dongles, where the tool vendor really does offer added value, whether from a support or added-functionality point of view. Share their improvements with the wider software community to improve the state of the art generally, and get a lot of kudos, more sales and business goodwill in the process. Sure, you will get a few people making copies, but they wouldn't have bought the license anyway, so you could argue that it has near-zero impact on revenue.

So there's a business ethics issue here - these are modes of the past and there's no excuse for them. It's strange in life how the people who always make the most noise about being ripped off tend to be the most dishonest themselves.

Chris

--

----------------------
Greenfield Designs Ltd
Electronic and Embedded System Design
Oxford, England
(44) 1865 750 681
Reply to
ChrisQuayle

The proof is there that without a dongle their software will be stolen. As it is, there are lots of people using cracked software illegally.

How would you feel if, because some other people were stealing what your employer produces, he said he would have to cut your wages?

I am not a fan of FlexLm

And honest users..... You can't have one without the other.

Why? That is a completely naive scenario that does not work anywhere in business.

This is NOT TRUE. The main commercial tools have a lot of IP that cost them a lot to develop; they are not going to give that away.

What you are suggesting will have a MAJOR impact on revenue, and you will end up with a lot of mediocre tools.

This is not true either.....

--
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills  Staffs  England     /\/\/\/\/
/\/\/ chris@phaedsys.org      www.phaedsys.org \/\/\
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Reply to
Chris Hills

Perhaps it's not possible? I know with dongle schemes like FlexLm that's the case.

I agree it shouldn't really be a problem, but that doesn't appear to be the attitude of many merchants in the embedded tool arena.

Robert

Reply to
Robert Adsett
