Code Red for ARM Cortex-M3 development - any good?

On 7/10/2012 12:11 PM, Walter Banks wrote: [...]

the automotive sector is definitely showing interest in Open Source Software (even though they do not mention 'free' in the sense of 'libre'):

formatting link

I do not quite understand what 'whose code generation is weak' means. I compile my code with 20+ year old technology as well as the newest technology. Except for minor warnings that are no longer flagged by new compilers - and that I fix nevertheless - I don't see any problem.

Reply to
alb

What about the situation where it is impossible to find out who actually owns the rights to the software? (I was once in that situation.) I ended up scrapping the C compiler and retaining the libraries. However, I had to adapt gcc to change its calling convention, so that the libraries could still be used. (Another poster said that "very skilled" people can do that; thanks for the compliment. I want to say just one thing: don't be too intimidated.)
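
Just to illustrate the idea (this is my own minimal sketch, not Albert's actual change - his target and convention are not named above): on 32-bit x86, gcc can already switch the calling convention of individual declarations with the regparm attribute, so matching a foreign library convention does not always require patching the compiler. Build with gcc -m32:

    #include <stdio.h>

    /* Pass the first two arguments in registers (EAX, EDX on i386),
     * mimicking a library built with a register-based convention. */
    #define LIBCALL __attribute__((regparm(2)))

    LIBCALL int lib_checksum(const char *buf, int len)
    {
        int sum = 0;
        while (len-- > 0)
            sum += (unsigned char)*buf++;
        return sum;
    }

    int main(void)
    {
        printf("%d\n", lib_checksum("abc", 3));
        return 0;
    }

When the library's convention differs more radically (different register sets, callee-cleanup rules), adapting gcc's machine description, as Albert did, is the remaining option.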

In that same company, they allowed me to buy a new SUN to run my compilers on. I ordered a C compiler to go with the system. Installing gcc from source was a snap; getting the new C compiler to work with the license server over the network was not. After several calls to SUN's service department I gave up. The SUN-hosted gcc did a good job of compiling the above embedded gcc anyway - who would expect otherwise? The license was never used.

Groetjes Albert

--
Albert van der Horst, UTRECHT,THE NETHERLANDS
Economic growth -- being exponential -- ultimately falters.
albert@spe&ar&c.xs4all.nl &=n http://home.hccnet.nl/a.w.m.van.der.horst
Reply to
Albert van der Horst

Weak in the optimization and creation sense. FOSS tools have a long way to go to create truly modern tools.

To name a few issues: most are still using technology that was designed to work around development systems with limited resources, like separate linking and asm code generation instead of compiling directly to machine code.

Most FOSS tools I have used are shockingly slow, and lack application-wide strategy passes before code generation.

Their optimization and code creation are primitive, using only a subset of the resources on the target processor.

w..

Reply to
Walter Banks

I had a few years to learn them...

What adds to my file count is that being a low-level guy, I get to debug the low-level problems the other developers have. Which means, over time, kernel and driver sources accumulate in my environment :-)

It loads everything into RAM and then releases the file handles. It processes the files on first use (syntax coloring etc.).
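
For what it's worth, here is a minimal C sketch of that load-then-release pattern (my own illustration, with error handling kept deliberately thin): read the whole file into a heap buffer, close the handle at once, and leave parsing/colouring for whenever the buffer is first touched.

    #include <stdio.h>
    #include <stdlib.h>

    /* Read a whole file into a heap buffer and close the handle
     * immediately; the caller processes the buffer on first use. */
    static char *slurp(const char *path, size_t *size_out)
    {
        FILE *f = fopen(path, "rb");
        if (!f)
            return NULL;
        fseek(f, 0, SEEK_END);
        long size = ftell(f);          /* sketch: assumes ftell succeeds */
        fseek(f, 0, SEEK_SET);
        char *buf = malloc(size + 1);
        if (buf && fread(buf, 1, size, f) == (size_t)size) {
            buf[size] = '\0';          /* NUL-terminate for text processing */
            *size_out = (size_t)size;
        } else {
            free(buf);
            buf = NULL;
        }
        fclose(f);                     /* file handle released right away */
        return buf;
    }

The point of closing early is that the editor never keeps handles pinned on files that other tools may want to rewrite.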

Stefan

Reply to
Stefan Reuther

The Austin Group lists SuSE, RedHat, NetBSD, FreeBSD, and the Linux Standard Base as members. And if I look at C99, I find quite a number of gcc extensions in there.

Commercial tools put sentences into their manuals like "The compiler does not support C99", or, for C++: "The header and its functions are not included in the library. Two-phase name binding in templates, as described in [temp.res] and [temp.dep] of the standard, is not implemented. A typedef of a function type cannot include member function cv-qualifiers. A partial specialization of a class member template cannot be added outside of the class definition." (From document SPNU151G.) And we're talking about C++98 only, in the year 2012! I would have expected 14 years to be enough to get conformant.

gcc's standards conformance is much better than that compiler's, and not worse than any other compiler's I've used.

The world is not as black and white as you draw it.

For _formal_ conformance testing. This costs real money, and gives you a sticker for glossy brochures. FOSS has/needs neither.

I don't have the impression that gcc is particularly untested. My current count is about fifteen bugs found in commercial tools vs. two in gcc, even though I've been using gcc much longer.

Stefan

Reply to
Stefan Reuther

FOSS groups did not actively participate in C99 / C11. I would say there are some GCC extensions and some extensions that GCC adopted from other development work.

Conformance testing is one way to establish with confidence what a tool set actually implements. My surprise is that the FOSS community has not developed conformance-testing tools, in part as a response to their cost. FOSS advocates have been making the argument for free tools for years.

w..

Reply to
Walter Banks

gcc comes with an extensive test suite; why does that one not count?

I am actually quite confident in it, simply because it has failed me less often than "the others".

Stefan

Reply to
Stefan Reuther

The GCC test suite is ad hoc: it tests some things and misses others. What it lacks is tests that can be directly linked back to specific requirements - tests whose names, for example, refer back to standards reference numbers (section and paragraph numbering).

The GCC test suite is like the collection of regression tests that we have built up over the years as part of our internal testing, from customer support, or just from interesting code fragments that get debated in newsgroups.

Conformance tests should be written by someone who knows the requirements but has no knowledge of the implementation.
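
As a rough illustration of the naming idea (the file name and clause reference are mine, not drawn from any actual suite): a conformance-style test whose identity points straight at the requirement, here C99 6.7.8 on initialization:

    /* c99_6_7_8_designated_init.c
     * Conformance-style test for C99 6.7.8 (Initialization):
     * a designated initializer sets the named member, and all
     * unnamed members are zero-initialized.
     */
    #include <assert.h>

    struct point { int x, y, z; };

    int main(void)
    {
        struct point p = { .y = 2 };
        assert(p.x == 0 && p.y == 2 && p.z == 0);
        return 0;
    }

When such a test fails, the name alone tells you which section and paragraph of the standard to re-read.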

Regards,

Walter Banks

Reply to
Walter Banks

I feel your pain. Luckily, I generally only have to solve my own problems.

That's the way it should be done.

That mirrors a question I had a few years ago. I asked why a data reduction program was reading bytes and words from a file instead of reading the whole file (a few hundred megabytes) into memory and processing the data through a pointer. A PC will give you hundreds of megabytes of data memory if you ask nicely - why should we be content to process data a few bytes at a time using a file handle?

Having gigabytes of RAM to handle tasks has certainly changed the paradigm from the time when we had tens of megabytes of disk storage and memory usage was measured in blocks of 64 KBytes. Alas, many tools and operating systems still think of processing data via file handles and KBytes of memory.
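
One minimal POSIX sketch of that whole-file-via-pointer approach (my own example; mmap is just one way to get there, and error handling is kept thin):

    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/mman.h>
    #include <sys/stat.h>
    #include <unistd.h>

    int main(int argc, char **argv)
    {
        if (argc != 2)
            return 1;
        int fd = open(argv[1], O_RDONLY);
        struct stat st;
        if (fd < 0 || fstat(fd, &st) < 0)
            return 1;

        /* Map the whole file; the OS pages it in on demand. */
        const unsigned char *data = mmap(NULL, st.st_size, PROT_READ,
                                         MAP_PRIVATE, fd, 0);
        if (data == MAP_FAILED)
            return 1;
        close(fd);                 /* mapping stays valid after close */

        /* Walk the data through a pointer instead of per-byte reads. */
        unsigned long sum = 0;
        for (off_t i = 0; i < st.st_size; i++)
            sum += data[i];
        printf("%lu bytes, checksum %lu\n",
               (unsigned long)st.st_size, sum);

        munmap((void *)data, st.st_size);
        return 0;
    }

With the mapping in place the file handle can be closed immediately, and you get pointer access to hundreds of megabytes without a single explicit read call.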

Mark Borgerson

Reply to
Mark Borgerson

The fact that commercial compilers have bugs indicates that commercial conformance test suites also just test some things and miss others. So what's the point?

Recent compiler bugs I've encountered include one tool claiming that

    char foo[] = { "bar" };

is not valid C++. It is, but probably the test linked to §8.5.2(1) did not catch it. Another one is a compiler crashing on

    class a { };
    class b { };
    namespace {
        class c : public a, public b { };
    }

Probably they've tested multiple inheritance, and anonymous namespaces, but it seems they didn't test the combination of both.

A systematic test suite is no silver bullet, and a suite that tests for previous problems is pretty useful. Whether it's more or less useful remains open to debate; I consider it good enough.

Stefan

Reply to
Stefan Reuther

Eclipse works perfectly well with "makefile" projects. That way you get the advantages of Eclipse (its code navigation and highlighting tools, jumping from compiler error messages to the code, debugging tools, subversion support, etc. - there are lots of features according to taste), while still having proper control of your project's build system. Eclipse's project management and build support is not bad compared to most IDEs, but you can always do better with your own makefiles.
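
For concreteness, a hypothetical minimal makefile for such a project (the toolchain name, flags, file names and linker script are my assumptions, chosen to match the Cortex-M3 theme of the thread):

    # Hypothetical Cortex-M3 makefile; recipe lines must start with a tab.
    CC     := arm-none-eabi-gcc
    CFLAGS := -mcpu=cortex-m3 -mthumb -Os -Wall -g
    OBJS   := main.o uart.o

    app.elf: $(OBJS)
    	$(CC) $(CFLAGS) -T linker.ld -o $@ $^

    %.o: %.c
    	$(CC) $(CFLAGS) -c -o $@ $<

    clean:
    	rm -f app.elf $(OBJS)

Eclipse then simply invokes make on the project and parses the gcc error messages into editor markers.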

Reply to
David Brown

Different types of test suite are useful for different purposes. And /no/ test suite is complete. So if a commercial compiler developer says they use Plum Hall (or whatever) to test standards conformance, all you know is that they have run the test suite. It doesn't tell you how much of the standards are really tested by Plum Hall (I'm sure they are good - but they cannot ever be perfect). It might not tell you how much of Plum Hall the compiler passed or failed. It won't tell you anything about the bugs in the compiler, or the quality of the generated code or the libraries. It will tell you absolutely /nothing/ about the correctness of the compiler (or libraries, or code delivered with the toolchain) when you use different sets of compiler flags or options.

Conformance tests tell you a bit about the toolchain's standards conformance. That can certainly be useful - but it is not a major part of being sure your tools generate correct object code from your source code. And if you are particularly concerned about conformance tests, then you should probably invest in Plum Hall yourself and run the tests (or get a consultancy firm to do so) rather than just relying on the toolchain manufacturer.

Of course, if you want conformance testing for gcc, and don't want to buy the tools yourself, you can always buy "commercial open source" toolchains such as those provided by CodeSourcery / Mentor Graphics - they run their gcc builds through conformance testing tools, including Plum Hall.

Reply to
David Brown

Hey David,

Have you looked at ARM's distribution, perhaps as an alternative to CodeSourcery now that it has been taken over:

(Or is it the same thing?)

John

--

John Devereux
Reply to
John Devereux

That is an interesting link, and I'll look more into it later. At the moment, it looks mostly like yet another source of pre-built gcc binaries - but having ARM behind it makes it stand out a bit. There doesn't seem to be much activity there as yet. And their testing appears to be just standard gcc testing, along with running some code on actual boards, in comparison to the additional testing (including conformance testing) done by Code Sourcery. Still, it's good that ARM are doing this, and I know that ARM make a lot of contributions directly to gcc development.

There are several factors that make Code Sourcery different. One obvious one is that they are a commercial company - they aim to get paid for their work, either directly (by subscriptions or consultancy fees) or indirectly (by making gcc better, more people will pay for subscriptions). That means top-level professional support, and extra services such as extra testing.

They also have a long history of working with gcc - if you look at the gcc mailing list archives or patch lists, you will see Code Sourcery email addresses all over the place stretching back for many years. To the extent that gcc has such a thing, they have long been the "official" maintainers of a number of gcc ports including ARM, m68k, PPC and MIPS.

Like many gcc and Code Sourcery fans, I was a bit concerned about how things would change once Mentor Graphics bought them. Mentor is a much larger company, and one might fear that they would try to close off the tools, forcibly integrate them into large, expensive toolsets, and discourage Code Sourcery employees from doing "free" work such as contributing to gcc or helping people on mailing lists. However, I don't think there has been any of this - Code Sourcery have been doing as much or more of their "pure" gcc work, the zero-cost "lite" versions of the tools have a clear place on the website along with the paid-for subscriptions, and there are new, cheaper versions of the toolchains to reduce the entry price. It looks to me like a very good match for everyone - Mentor provides stability and a "big" name, so that Code Sourcery can concentrate more on the technical stuff they are so good at, and all gcc users benefit.

Reply to
David Brown

That is what drew my attention too - interesting (and good) that they do this, considering they sell their own compiler as well.

They do seem to include some bug fixes on top of the FSF standard build (as does CS, I believe).

Absolutely, I have been aware of them in that role for a long time.

Apart from "making" me sign up to their mailing list and receiving spam for PCB CAD, I have seen no averse effects yet either.

--

John Devereux
Reply to
John Devereux
