IDE for Atmel ARM processor

Which environment would people here recommend? What sort of price would I expect to have to pay?

Do they all require a monitor in external flash, or can they use the on-chip ROM modules?

Many thanks in advance.

Reply to
Fred

If you use the GCC (GNU) compiler, you don't have to pay anything. ARM and various other companies sell their own compilers, which cost from $500 to $5k.

If you do go with GNU, Eclipse is a popular free IDE.

Check this article for some info:

formatting link

There is a more generic PDF written by the same guy (Lynch), but I can't find it at the moment.

H.

Reply to
Hw

In article , Hw writes

Not all ARM compilers are equal.

You need to look at: code density, speed of execution, targets and hosts supported, other tools that work with it, and support from the vendor (et al.).

The headline purchase price is not all there is to it. In some cases Linux is the most expensive option!

It is like buying automobiles. Someone may give you a Ferrari, but can you afford to run or insure it?

--
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills  Staffs  England     /\/\/\/\/
/\/\/ chris@phaedsys.org      www.phaedsys.org \/\/\
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Reply to
Chris Hills
[...]

Bad analogy, IMHO, particularly the bit about insurance. A better analogy might be purchasing a Ferrari vs. fixing up your old Chevy.

The Ferrari has higher initial cost but higher performance. Ferrari will give you a (limited) warranty, but that will run out after some amount of time, so you'll need to cough up more money later for an extended warranty (support). And if something does break, there are only a limited number of mechanics who are qualified to fix it.

The Chevy won't go as fast or handle as well, but it will be good enough for most uses (the speed limit on the freeway is still 70 MPH, no matter what vehicle you drive). It'll get most people where they need to go. It might break down once in a while, maybe even more often than the Ferrari (or maybe not), but finding a mechanic who can fix it is easier -- shoot, you might be able to do it yourself, if you're feeling plucky.

Some people need a Ferrari. Or believe they do. If you do need it, the Chevy ain't gonna cut it. But if you don't require a Ferrari, the Chevy beats walking.

Regards,

-=Dave

--
Change is inevitable, progress is not.
Reply to
Dave Hansen

Girls prefer a Ferrari, if that's a requirement ;)

Reply to
Lanarcam

In article , Dave Hansen writes

good point.

However, by coincidence, today I came across a company that has found that using GNU has cost it dearly.

They have run out of code space. Now, in mid-project, they are going to have to port all the code to a commercial compiler that is more efficient.

Reply to
Chris Hills

... snip ...

Sounds like a silly characterization. They have obviously been able to produce in the past with minimal toolchain expenses. The fact that they don't want to spend the effort building better code generators has nothing to do with that. If they wrote non-standard code and didn't keep the system-specific stuff separate, that is their own fault. For all we know, their bloat problems may have to do with the size and organization of the system library, or the unnecessary use of big modules such as scanf and printf.
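To illustrate the library-bloat point: on a small target, even one call to printf can drag in a large chunk of the C library. A hand-rolled helper (the name `utoa10` here is made up for the example) can cover the common case in a few lines:

```c
#include <string.h>

/* Hypothetical sketch: convert an unsigned int to decimal without printf.
   Linking printf can pull in tens of KB of library code on some toolchains;
   a tiny helper like this keeps the footprint down. */
char *utoa10(unsigned v, char buf[12])
{
    char *p = buf + 11;
    *p = '\0';
    do {
        *--p = (char)('0' + v % 10);  /* emit digits least-significant first */
        v /= 10;
    } while (v != 0);
    return p;  /* pointer into buf */
}
```

Whether this matters depends on the toolchain; some vendor libraries ship a cut-down printf precisely for this reason.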

--
"If you want to post a followup via groups.google.com, don't use
 the broken "Reply" link at the bottom of the article.  Click on 
 "show options" at the top of the article, then click on the 
 "Reply" at the bottom of the article headers." - Keith Thompson
Reply to
CBFalconer

Yes, or just semi-competent programmers. I picked up a firmware project recently (in C) and fully expect to be able to *halve* the code size with a bit of obvious refactoring.

Reply to
toby

In article , toby writes

It still doesn't get away from the fact that they reckon the code size was halved just by recompiling with a commercial compiler instead of GNU.

Reply to
Chris Hills
[...]

I'm sorry, but I just can't parse this. Can you try again?

Regards,

-=Dave

Reply to
Dave Hansen

That may well be. However, that has not "cost it dearly". In fact, it has cost them exactly nothing. They are way ahead of the game. If they were doing the complaining I would consider them ungrateful boneheads. I have no idea of your motives, but I am sure some exist.

Reply to
CBFalconer

If that's true (half the code size just by switching compilers), I have a hard time imagining it's only due to missed optimizations. There's a little-known fact about GCC: by default, it puts all functions in a module into the final binary, even non-referenced ones. This is due to its default sectioning of the object modules. To enable 'garbage collection', you have to specify:

-fdata-sections -ffunction-sections for the compiler, and

--gc-sections for the linker
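As a sketch of why this matters (the file and function names here are invented for illustration): with the defaults, an unused function in a translation unit still lands in .text and ships in the image; with per-function sections, the linker can drop it.

```c
/* util.c -- hypothetical module.
 * Default build:        gcc -c util.c
 *   -> both functions share one .text section; the linker keeps both.
 * Sectioned build:      gcc -ffunction-sections -fdata-sections -c util.c
 *   linked with:        gcc -Wl,--gc-sections ...
 *   -> never_called() gets its own section and is garbage-collected. */

int used_everywhere(int x)
{
    return x * 2;
}

int never_called(int x)  /* dead weight unless --gc-sections removes it */
{
    return x * 3;
}
```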

Do you know if they did that?

Regards, Andras Tantos

Reply to
Andras Tantos

Can you make up your mind? First they are facing a costly porting process, and 12 hours later they halve the code just by recompiling... Am I missing something?
Reply to
Tom

In article , snipped-for-privacy@phaedsys.org

I would like to point out that the OP's message was dated the 8th and didn't start to get any responses until mine (on the 13th).

While your points may be valid, it might have been more useful if you had attempted to answer the questions the OP asked, instead of providing a sales pitch on why GNU tools may not be the best choice.

I at least attempted to answer some of his questions: I mentioned a compiler he COULD use (and its up-front cost) and a potential IDE.

What compilers did you actually mention? Where is the associated cost? Where did you talk about this compiler's speed of execution compared to another product?

HW.

Reply to
Hw

It's certainly implausible on any architecture I'm familiar with, but I defer to Mr Hills' ARM-specific knowledge.

Reply to
toby

It's hard to believe this; all our tests show that GCC produces almost the same or slightly bigger code than the commercial compilers in most cases. There are specific situations where GCC produces the same or smaller code, and others where GCC produces bigger code - everything depends on how much time you have spent learning how your compiler assembles your code, where it puts function parameters, how it compiles "case" and "if" statements, etc., and knowing what to use where. I'm not that good an assembly writer - assembly requires you to be quite focused when you write, or you can easily shoot yourself in the foot - but a few years ago I had to write CRC code in AVR assembler, then out of curiosity re-wrote the same in C and compiled it with AVR GCC - it made the code half the size of my original assembly code(!). On top of this, with some commercial compilers, when you turn on the highest possible optimization your code can stop working or start behaving weirdly ;) so you should use optimization with extra care (!) and spend more time on testing.
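For reference, the kind of routine being described might look like this in C. The original AVR code isn't shown in the thread, so this is just an illustrative stand-in: a plain bitwise CRC-8 with polynomial 0x07 (the SMBus variant). Tight loops like this are exactly where a compiler's optimizer can beat hand-written assembly.

```c
#include <stddef.h>
#include <stdint.h>

/* Bitwise CRC-8, polynomial x^8 + x^2 + x + 1 (0x07), init 0x00,
   no reflection. Compact C a good compiler optimizes well. */
uint8_t crc8(const uint8_t *data, size_t len)
{
    uint8_t crc = 0;
    while (len--) {
        crc ^= *data++;                       /* fold in the next byte */
        for (int bit = 0; bit < 8; bit++)
            crc = (crc & 0x80) ? (uint8_t)((crc << 1) ^ 0x07)
                               : (uint8_t)(crc << 1);
    }
    return crc;
}
```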

So the bottom line is that GCC is doing just fine, but I can't say there is a decent ARM debugger on the market that works flawlessly in a Windows environment (or maybe I'm asking too much ... ;)

Best regards Tsvetan

--
PCB prototypes for $26 at http://run.to/pcb(http://www.olimex.com/pcb)
PCB any volume assembly (http://www.olimex.com/pcb/protoa.html)
Development boards for ARM, AVR, PIC, MAXQ2000 and MSP430 
(http://www.olimex.com/dev)
Reply to
Tsvetan Usunov

On which architecture? GCC is embarrassingly behind the best commercial compilers, especially on ARM, and it is questionable whether it could ever close the gap. My figures show it is at least 8 years behind the state of the art - the code the latest GCC produces for Thumb is worse than that of 8+ year old ARM compilers, and Thumb code produced by GCC is *larger* than ARM code produced by the ARM compiler... Performance is about as bad as its code size.

So I can believe halving the code is quite feasible - you can save about 30% in the compiler, and the rest is likely due to the libraries.

You should not need to change your code to suit your compiler - a good compiler can produce high quality code from any source, whether good or bad. Of course well-written source will produce good code on any compiler.

This is not true in general. In many compilers all (or most) optimizations are enabled by default - there is no point in adding optimizations if you then don't enable them! Many of the problems with optimization are caused by programmers writing illegal C/C++ and expecting the compiler not to break their program. Any optimizations that go beyond the standard (e.g. assuming no aliasing) are typically well documented, and are only for programmers who know what they are doing.
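A classic example of "illegal C" that only bites at higher optimization levels is type-punning through a pointer cast, which violates strict aliasing. A sketch of the problem and the portable fix (assuming a 32-bit IEEE-754 float, which holds on common targets):

```c
#include <stdint.h>
#include <string.h>

/* Undefined behavior: the compiler may assume a uint32_t* never aliases
   a float*, so under -O2 this can silently miscompile:
       float f = 1.0f;
       uint32_t bits = *(uint32_t *)&f;   // strict-aliasing violation
   The well-defined alternative is to copy the bytes: */
uint32_t float_bits(float f)
{
    uint32_t u;
    memcpy(&u, &f, sizeof u);  /* memcpy is the sanctioned way to type-pun */
    return u;
}
```

Code like the cast version often "works" at -O0 and then breaks when optimization is turned on, which is how optimizers get blamed for programmer bugs.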

GCC is just fine if you don't require the best compiler. You get what you pay for - it's that simple.

Wilco

Reply to
Wilco Dijkstra

Shouldn't the compiler and linker do this automatically? Removing unused functions is an essential feature that should be on by default.

It is well known that the default settings of GCC are terrible. Last year there was a paper at the GCC conference that showed how using 20+ options could reduce code size by 5% on ARM. I don't know whether they have made those the default since then, but that would be the obvious thing to do - I can't imagine anybody remembering even half of them! I also wonder whether the latest GCC has finally dropped that expensive '60s-style frame pointer...

Wilco

Reply to
Wilco Dijkstra

I'd like to see how the other compiler compares with GCC and the -Os option. Looking at the generated assembly (-S or -Wa,-ahlms), it's hard to believe that a compiler could generate 50% smaller code, if it compiles the code at all.

--

Tauno Voipio
tauno voipio (at) iki fi
Reply to
Tauno Voipio

Which default settings do you mean? Regarding optimisation, gcc "defaults" to -O0 if one forgets to specify anything else. Switching to -O2 should make at least a 5% difference, but even 5% is well within the range of variation I'd expect from different vendors' compilers.

gcc targets an enormous number of architectures, and nobody can reasonably expect it to be the best compiler for all of them. Nonetheless it remains the de facto standard on many architectures simply because it does a pretty good and reliable job. Few applications could not tolerate 5% larger code, and those that cannot should perhaps consider assembler, or finding smarter C programmers, since that factor alone can certainly account for 50%+ code bloat even with the best available compiler!

Reply to
toby
