GNU tools for ARM Cortex development

Is anyone out there doing development for the ARM Cortex (specifically the M3) with the GNU tools?

Are you using the CodeSourcery set, or are you building your own?

If so, how are things going? There seems to be a welter of "how to" pages on this, but nearly all of them seem to be as old as the hills.

My spare-time job right now is bringing up a set of tools that'll work on Linux and will let me develop on the TI LM3S811. I'm trying to keep everything 100% open source; since CodeSourcery is exceedingly coy about coughing up source code (I certainly haven't found it), and since their install scripts don't seem to be terribly compatible with my Linux installation (Ubuntu Karmic), I'm building from scratch.

Things seem to be going well, although not completely straightforward -- my current task is to write or find the obligatory startup code to establish a C++ run-time environment so that the rest of my code will work, and to verify that OpenOCD really does function on this machine.
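For reference, the skeleton I have in mind looks something like this - a minimal, untested sketch assuming newlib and a linker script that defines the usual _sidata/_sdata/_edata/_sbss/_ebss/_estack symbols and an .isr_vector section (those names are conventions borrowed from common examples, not fixed):

  /* Minimal Cortex-M3 startup sketch. */
  #include <stdint.h>

  extern uint32_t _sidata, _sdata, _edata, _sbss, _ebss, _estack;
  extern void __libc_init_array(void); /* newlib: run static C++ constructors */
  extern int main(void);

  void Reset_Handler(void)
  {
      uint32_t *src = &_sidata, *dst = &_sdata;
      while (dst < &_edata)              /* copy initialised data from flash */
          *dst++ = *src++;
      for (dst = &_sbss; dst < &_ebss;)  /* zero the BSS */
          *dst++ = 0;
      __libc_init_array();               /* C++ global constructors */
      main();
      for (;;)                           /* main() should not return */
          ;
  }

  /* First two vector table entries: initial stack pointer and the reset
     vector; the section name must match the linker script. */
  __attribute__((section(".isr_vector"), used))
  const void *const vector_table[] = { &_estack, (void *)Reset_Handler };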

Aside from "you're crazy, see a shrink!", does anyone have any useful observations on the process? Any known-fresh web pages?

TIA

--
Tim Wescott
Control system and signal processing consulting
www.wescottdesign.com

I'm working with CodeSourcery Cortex tools at the moment, but I've barely got started. I have the "personal" license version, which comes with Eclipse and some integration, and extra debugger interfaces. In practice I suspect I will use the "lite" version as much as anything else, since it is not locked to any given system, and since I prefer more manual configuration and command lines anyway.

CodeSourcery does try to encourage people towards their paid-for packages, understandably enough. But their "lite" versions are totally free in cost, and pretty much entirely open source (there may be a few utilities or debug connectors that don't come with the source code, but nothing more than that). There are also source tarballs as well as binary installers for Linux and Windows.

Even their paid-for subscription versions have source tarballs - you can re-compile them yourself with the node-locking code disabled if you want (the debug sprites don't all have source code, however).

Also, everything CodeSourcery does in the main tools (gcc, binutils, gdb) goes back to the FSF trees - they are the maintainers for various gcc ports including ARM. However, their own branches that you get directly from them are updated faster, as they don't have to keep everything in sync for all ports, and they also have more comprehensive test suites (again, because they don't have to support everything at once).

Their discussion lists are open to all who register at their site, and the CodeSourcery developers are very helpful.

For building this sort of thing, I'd recommend using virtual machines - install VirtualBox (it's free, and runs on Linux and Windows) and make your build machines as virtual installations. That way you can easily try out different distros and keep your test builds isolated and under control. Sometimes these things work better with particular versions of particular tools (though gcc should build cleanly with most tool versions), and with virtual machines for testing you can avoid messing around with different versions of gcc on your main machine.

Just remember, you're doing this for fun - and building gcc /is/ fun (if you don't want the fun, pay your $400 and get a ready-to-run binary with support from CodeSourcery). Every hurdle on the way is a learning opportunity, and you'll end up understanding your tools better after you've built them. Also, keep notes along the way - there's nothing more annoying than getting your build to work but forgetting what you did to get there...

Regards,

David


I've used CodeSourcery Lite for the ARM Cortex M3 and for the ARM9. I haven't used either one extensively, but they seem to be solid enough to me.

I do both. For building my own, I've used home-grown scripts as well as Crosstool-NG. Both work fine.

Fine.

I agree that they don't make it easy to get source code from them.

For the lite edition, installation consists entirely of unpacking a tar archive anywhere you like.

If you want to build your own with libc, crosstool-NG works well.
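A typical crosstool-NG session is only a few commands (the sample name below is illustrative - check what "ct-ng list-samples" offers in your version):

  ct-ng list-samples        # show the canned target configurations
  ct-ng arm-unknown-eabi    # start from a bare-metal ARM sample
  ct-ng menuconfig          # adjust gcc/binutils/libc versions and paths
  ct-ng build               # fetch, patch and build the whole toolchain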

If you just want a bare-metal compiler (with no libc), that's pretty trivial to do on your own. Building with newlib isn't that much more difficult.
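The bare-metal build boils down to something like this - a sketch only, with version numbers and paths as examples (newlib, C++ and libstdc++ can be added in a later pass):

  PREFIX=/opt/arm-none-eabi
  TARGET=arm-none-eabi

  # binutils first
  tar xjf binutils-2.20.tar.bz2
  mkdir binutils-build && cd binutils-build
  ../binutils-2.20/configure --target=$TARGET --prefix=$PREFIX --disable-nls
  make && make install
  cd ..

  # then a C-only compiler with no libc
  tar xjf gcc-4.4.3.tar.bz2
  mkdir gcc-build && cd gcc-build
  ../gcc-4.4.3/configure --target=$TARGET --prefix=$PREFIX \
      --enable-languages=c --without-headers --with-newlib \
      --disable-shared --disable-threads --disable-nls
  make all-gcc && make install-gcc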

--
Grant Edwards               grant.b.edwards        Yow! Will this never-ending
                                  at               series of PLEASURABLE
                              gmail.com            EVENTS never cease?
[...]

Sure? As far as I understand, you don't get the library sources in the personal version.

Oliver

--
Oliver Betz, Muenchen (oliverbetz.de)
[...]

That seems like a lot of work - I just set the PREFIX in the build script and use that path subsequently in project makefiles. E.g. install to /opt/arm-elf-4.4.0. You can also set CC before building if you want to build with a different compiler version.
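In the project makefile that amounts to little more than (same example path as above):

  TOOLDIR := /opt/arm-elf-4.4.0/bin
  CC      := $(TOOLDIR)/arm-elf-gcc
  OBJCOPY := $(TOOLDIR)/arm-elf-objcopy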

Lately I've been archiving the entire compiler along with each project's source code. That is, the stripped toolchain binaries are in a subdirectory of the project and are put under revision control along with it. Also there is a compiler build script as part of the project which can fetch the source code and rebuild the compiler if needed.

[...]
--

John Devereux

I was thinking here about the tools themselves - gcc, binutils, gdb, etc. I believe you get the sources to some parts of the library with the personal version, but not all. As with many companies that make their livings selling toolchains based on gcc, CodeSourcery provide libraries that give you something more than the traditional C library (more functions, better optimised code, etc.).

--
David Brown

Nah, VirtualBox machines are very easy once you've tried it a couple of times. Typical Linux distros install quickly and easily since you've got no complicated hardware, a fast virtual CD (i.e., an iso file on your disk), and for this sort of thing you can completely ignore most user-interactive software or configuration (no need to find yourself a theme that matches your office wallpaper). It is also very easy to take snapshots, archive your build machines, etc. And when you are following a how-to that starts "I used Fedora 10..." and you've got Ubuntu 9.04, you can just make a Fedora 10 machine and save yourself some work.

And of course, if you really screw things up, you haven't messed with your main system.

As I said, gcc typically builds cleanly with most tool versions, so setting a --prefix with ./configure is often enough. The VirtualBox setup works particularly well for more complex systems, such as buildroot setups for an entire embedded Linux system.

An alternative "lighter" solution is to use something like openvz - it's a sort of advanced chroot. There is much less overhead than with a full virtual machine, but you still get to have separate distro installations in each openvz container.
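The workflow is much like a VM - roughly (the container ID and template name are just examples; you need an OS template installed first):

  vzctl create 101 --ostemplate fedora-10-x86  # make a Fedora 10 container
  vzctl start 101
  vzctl enter 101                              # get a shell inside it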

That's a good idea - it means you always have access to the tools you used, even if you later are using a completely different system.

An alternative here is to do all your builds for a project within a virtual machine, and archive the entire virtual machine. It takes a bit more space, but is perhaps the most complete archive of the build environment.

--
David Brown

VirtualBox is great - I do Visual C++ development in it on my Debian system. It is also very good for testing Windows software releases on multiple versions of Windows, letting you quickly roll back the installation process each time.

[...]

With git doing the revision control it is very fast and compact too.

But will your copy of VirtualBox 10 years from now be able to read today's virtual machine snapshot? Aha, but you will be able to install a copy of today's VirtualBox on a new virtual machine, and use that! :)

--

John Devereux

Yep...

CodeSourcery is fine for me...

I agree.... I have just set up an environment with: CodeSourcery + Eclipse CDT + the GNU ARM Eclipse plugin

In Arch Linux (and Windows) the installation is OK...

I have an ST-LINK JTAG, so I used the ST-LINK_gdbserver, not OpenOCD.

Well.... the steps for me:

0) install CodeSourcery
1) install Eclipse
2) install Eclipse C/C++ Development Tooling - CDT
3) install GNU ARM Eclipse Plugin
4) install CMSIS and the StdPeriph Driver
5) take a standard linker script for STM32 (do not forget ENTRY_POINT)
6) configure everything :) (a sample debug session is sketched below)
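For step 6, the debugger side of my setup boils down to something like this (the ELF name is hypothetical, and the port is whatever your ST-LINK_gdbserver reports at startup - 61234 is only an example):

  arm-none-eabi-gdb firmware.elf
  (gdb) target remote localhost:61234    # attach to the running gdbserver
  (gdb) load                             # program the image via the ST-LINK
  (gdb) break main
  (gdb) continue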

Enjoy...

-- Carlo

[...]

For these, there seems to be little or no difference between the versions.

"Professional Edition also includes debuggable versions of the run-time libraries".

I will ask them when I resume my evaluation of the differences. When I tried last time, other urgent work prevented me from finishing my tests.

Oliver

--
Oliver Betz, Munich
despammed.com might be broken, use Reply-To:

There are three differences that I know about between the paid-for versions of these tools and the entirely free versions. One is that the paid-for versions include the node-lock and license validity checks (which can be removed if you are recompiling them, and feel you want to "get around" the licensing). Another is that the paid-for binaries go through a more thorough testing and validation process. But perhaps most importantly, CodeSourcery updates their paid-for versions faster than their fully free versions.

That always seems to happen just after you've registered for your 30-day trial...

--
David Brown

Tim Wescott wrote:

I would be interested to see some performance data vs. commercial compilers. Some tests I found on the internet (ST's forum) indicate that older gcc versions really suck for the Cortex-M3 (~0.4 Dhrystone MIPS/MHz, IIRC).

Which gcc version do you need to get decent performance?

--
Best Regards
Ulf Samuelsson
These are my own personal opinions, which may
or may not be shared by my employer Atmel Nordic AB

As with any other compiler, early versions of gcc for a particular target have often been poor. One of the differences between gcc and commercial compilers is that gcc releases are often available even in their earliest versions, while a commercial company will probably not release their tools until they are happy with the performance. There are also many different versions of gcc around - for various reasons, people will sometimes choose old versions of the tools.

Another difference between gcc and commercial tools is that commercial tools often have EULAs restricting you from publishing any sort of benchmark information (this is understandable from the supplier's viewpoint).

Finally, commercial tool vendors have a strong need for competitive marketing, and therefore to tell users how much better code their compiler produces. gcc suppliers, even commercial companies, are not in the same situation - their main "competitor" is older versions of gcc.

All this adds up to it being very common to see "benchmarks" showing that brand X compiler generates faster code than brand Y or gcc. When you look at the details (if details are even shown), brand X is probably the latest version with the fastest choice of compiler flags, while brand Y and gcc are often older versions with poorer flag choices. The source code used for the tests is typically meaningless (who really wants to calculate lists of primes on a microcontroller? And why does "printf" turn up so often in a /compiler/ test?) and chosen to fit the results the tester wants.

If you want to know which compiler does a better job, the only way to find out is to get some evaluation copies and do the comparison yourself. It would be nice if there were such information available on a website, but it would take a lot of time and effort (and therefore money), especially to keep it updated, and would break the vendors' licensing agreements.

--
David Brown

David Brown writes:

I agree... however, it does not stop benchmarks being done. Most compiler companies test all the competitors' compilers they can get their hands on. I have seen some of these tests (I am under NDAs with various companies and have done compiler testing). They are all better than the GCC equivalents.

That is, unless you spend a LOT of time (and time == money) improving the GCC compiler setup. Then it gets better, but rarely as good. It is essentially a generic compiler system; it is not going to get anywhere close to the targeted commercial compilers.

What most commercial compiler companies do is tell their users how to get the best out of the tools. That is true.

Or other suppliers of the same(ish) version.

They always say that. However, the internal tests and benchmarking use the current and mainstream GCC compilers, with suitable flags set for all. There is no point in doing otherwise for internal testing and benchmarks you can't publish.

There are many benchmarks. Each tests different things. Apart from the obvious Whetstones and Dhrystones, sieves and primes etc., there are a lot of other benchmarks used, certainly internally - quite apart from language conformance tests.

Very true. I once saw someone who upgraded because the new version of a compiler said it could give *on average* a 10% reduction in code size. He complained because he got a 1% reduction.... When we looked into it, 90% of his code was look-up tables!

So try the compiler on YOUR code. That is what eval versions are for. Most do a size-limited version and/or a time-limited unrestricted version.

There is... but

Not a chance because.....

Time == money

Yes.

Incidentally, I had a chat with a company whose legal department went through the licenses for some Open Source they wanted to use. Apparently it was "far too restrictive", and they refused to permit any Open Source in the company!

They did not say which Open Source License(s) it was.

--
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills  Staffs  England     /\/\/\/\/
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/

I'm not going to argue here about how gcc /actually/ compares to other vendors - we've heard each other's opinions on that before. And I haven't done any comparisons myself on the ARM platform, so I don't have any facts on hand.

What I am saying is that you cannot place much trust in a compiler vendor's benchmark publications.

I am not trying to accuse them of dishonesty or anything (unless you consider all marketing dishonest) - it's just that it's somewhere between very hard and impossible to do a good job of such benchmarks, and there are too many conflicts of interest involved.

I'm sure that /internally/ vendors take gcc a lot more seriously. I'd be very surprised if they don't do close comparisons on all sorts of generated code using their own tools and gcc, and I'm sure there are times when they study the gcc source code for ideas (they won't copy anything, of course - it would not fit in their code structure. And it would be illegal).

But that's different from published benchmarks. Internal benchmarks are a development tool written and used by engineers. Published benchmarks are a marketing tool for salespeople.

I've often heard you say that this sort of benchmark proves that gcc can't compare to commercial compilers. But I'm sure you'll agree that a statement like that on its own is worthless - and your NDAs prevent you giving anything more. Even if you are able to provide your customers or potential customers with more details "off the record", anyone interested in checking the performance of a compiler has to do the tests themselves.

Some open source licenses have a lot of restrictions, but these are mostly on use of the source code. There are few which have restrictions on /use/ of the program. In fact, to get OSI approval of a license as "open source" (tm), you are not allowed to have restrictions on who can use the software - see the OSI's rules.

Of course, there are legal departments, managers, etc., who come up with all sorts of bizarre rules based on their understanding or misunderstanding of things. I've heard of companies refusing to use free software (open source or otherwise) because if they haven't paid for it, there is no one to sue if it goes wrong!

If this particular story is referring to software development, then it's a different matter. Trying to make use of existing open source software in the development of your own products can be a legal minefield, especially if you want to mix and match code with different licenses. And in this context, people often consider the GPL to be very restrictive, especially compared to BSD licenses.

--
David Brown
[...]

Does this apply to well-maintained 32-bit targets such as m68k/Coldfire and ARM?

General code generation problems or library quality?

Oliver

--
Oliver Betz, Muenchen (oliverbetz.de)

In my experience, gcc produces very good code for general C (and C++, Ada, Fortran, etc.) for the main 32-bit targets, such as m68k/Coldfire, ARM, MIPS, x86, and PPC as well as the 64-bit targets PPC, MIPS, and amd64.

There are some aspects where top-rank commercial compilers will do a better job - their support for "cpu accelerators" such as extra DSP or vector units is often better, especially if there are few chips with these units. In general, the more specialised an add-on is, the less likely it is that gcc will fully support it. Support for that sort of thing takes time and money - commercial tool vendors will often get support from the chip vendors, and they have customers happy to pay money for exactly this sort of thing. With gcc, such features depend more on popularity and on the number of paying customers at commercial gcc developers like CodeSourcery. Hardware vendors may also sponsor such features.

On the other hand, more development is done on the front end of gcc than for most other compilers. The front end is shared across all ports, and is thus probably more used than all other compilers put together. So there is a lot in terms of language support and extensions, as well as front and middle end optimisations, in which gcc leads many commercial compilers.

One area in which gcc has been poor compared to commercial compilers is whole-program optimisation (a.k.a. link-time optimisation, inter-module optimisation, omniscient code generation, etc.). For several versions, gcc has supported this to a limited extent - basically, you compile all your C files at once with a few compiler flags. But you couldn't split the compilation up, you couldn't mix in C++, you couldn't use it on libraries, and it didn't scale well (that's less of a problem with many embedded systems, but limits the scope for development and testing since it is of little use on "big system" software).

With gcc 4.5, there is now proper link-time optimisation. It remains to be seen just how good this will be in practice, and it will probably take time to mature, but the potential is huge, and it could lead to many changes to the way C code and modules are organised.
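Concretely, the difference looks like this (a sketch, with the flags as documented for the respective gcc versions):

  # pre-4.5 "whole program" mode: all C files in one invocation, C only
  gcc -O2 -combine -fwhole-program main.c uart.c -o app

  # gcc 4.5 LTO: normal separate compilation, optimisation redone at link time
  gcc -O2 -flto -c main.c
  gcc -O2 -flto -c uart.c
  gcc -O2 -flto main.o uart.o -o app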

Libraries also vary a lot in quality. There are also balances to be made - libraries aimed at desktop use will put more effort into flexibility and standards compliance (such as full IEEE floating point support), while those aimed at embedded systems emphasise size and speed. This is an area where the various commercial gcc vendors differentiate their products.

--
David Brown

I agree. The published ones are not of any real use.

I agree. And that applies to the non-commercial tools as well. BTW, whilst the marketing from some companies is "close to the line" (and sometimes on the wrong side :-), Open Source devotees can be just as bad and often far worse in their claims and arguments. They make religious zealots look sane.

Not technically. It is way behind most commercial compilers, and for many targets there are no open source or free compilers. One compiler designer I know was complaining last year that all GCC is doing is rearranging the deck chairs on the Titanic when it comes to compiler technology.

They do.... but not "and gcc". Gcc is just one of many compilers a compiler company will test, starting with their main competitors.

There is no need except for amusement. GCC is a LONG way behind the main commercial compilers.

Yes and no. All benchmarks are benchmarks. The published ones tend to use well-known benchmarks where the source is public. Internal benchmarks use all sorts of code. I know one company that uses several very large projects from customers, as well as the sources for their own tools.

It does.

Yes to a point.

Yes. But who does? Apart from the standard published benchmarks which are very narrow.

--
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills  Staffs  England     /\/\/\/\/
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/

In my experience that's true. The technical issue with gcc is that the fundamental design is very old, and it tries to support many targets with as much common code as possible. These are primary weaknesses when gcc is competing with commercial compilers designed for a specific target.

I don't think that many commercial compiler companies would be looking through gcc source code for ideas anymore. The approaches that commercial compiler developers are now using are so fundamentally different that gcc would be of little help.

Regards,

w..

--
Walter Banks
Byte Craft Limited

[...]

What I have seen in my tests so far looked good, apart from a strange re-ordering of instructions that makes the generated code no faster, but unreadable (e.g. in the debugger).

And it could be that the re-ordering affects performance when accessing slow Coldfire V2 peripherals (consecutive accesses to peripherals cost more wait states), but I haven't investigated this yet.
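One thing worth checking there is which accesses are actually volatile - gcc keeps volatile accesses in order relative to each other, but may schedule everything else around them. A contrived example (the register addresses are made up):

  #include <stdint.h>

  /* made-up Coldfire peripheral registers */
  #define UART_CTRL (*(volatile uint8_t *)0x40000014)
  #define UART_DATA (*(volatile uint8_t *)0x40000010)

  void uart_send(uint8_t c)
  {
      /* gcc must keep these two volatile accesses in program order and
         may not merge or drop them, but it may move unrelated
         (non-volatile) instructions between them - which changes how
         closely the peripheral accesses follow each other. */
      UART_CTRL = 0x01;  /* enable transmitter */
      UART_DATA = c;     /* write the data byte */
  }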

[...]

Since this affects mainly code size, it is no problem for me. My applications are small and time-critical, so I only need speed.

[...]

newlib, uClibc? IMO still bloated for small applications.

At least CodeSourcery doesn't say much about specific advantages of their libraries.

And since the libraries have to cover a broad range of applications, it might be necessary to compile them with specific settings - who provides sources?

Oliver

--
Oliver Betz, Munich
despammed.com might be broken, use Reply-To:
