Richard Stallman is responsible for the shrinking economy

[...]

There are at least two answers to this:

1) You don't need the resulting compiler to be bitwise-identical, just the output.

2) If you are paranoid you could even compile a bitwise-identical compiler from its source, by archiving the build system and/or using virtualisation to run the ancient system.

But in fact this is all irrelevant since you are comparing two completely different things - an end user generally cannot compile a commercial compiler *at all*.

A *fair* comparison would be to compare the difficulty of running a dongled binary copy of an old commercial compiler with that of running a binary copy of an old gcc compiler (on a new machine).

I have done this several times, and I find that gcc always works, and the commercial compiler will very likely not. The copy protection always seems to rely on some tricky mechanism that breaks if everything is not exactly right (hardware, OS), whereas running a gcc binary installation only requires very basic standardized OS support. At the end of the day, gcc is designed to work everywhere it can, whereas many commercial compilers are *deliberately designed* to fail if copied!

No doubt you will say that the vendor can supply an old unprotected copy? All I can say is that is not my experience. With IAR my only option was upgrading to the new compiler version - when I did it would not even compile my code, let alone be "the same compiler". And this was only a couple of years later, not 18.

--

John Devereux
Reply to
John Devereux

I'm Scottish (which explains our general agreement on most topics that don't involve compilers and/or open source), but I live and work in Norway.

That is, of course, a complicating issue.

I'm thinking about countries that have freedom of speech (and freedom

*after* speech). In your list above, I think that applies to South Korea, but I'm not too sure.
Reply to
David Brown

But if no consumers are paid a disposable income to buy such things, then the idea fails, as the irreducible, un-lowerable fixed costs of essential staples such as water and housing become relatively greater.

You don't see much rent reduction happening in them there ghost towns.

cheers jacko

Reply to
Jacko

Economically essential does not cover the situation of executive-toy manufacture at the lowest price. It seems like the only market which will be left, apart from the staple-price-rocketing food market. If the free market's task is to make people unemployed, rather than employed in real work, then the situation of 1% rich, 99% poor is inevitable. The 99% poor will not be able to afford to start any enterprise, so are doomed to bad health and death in system poverty. If this is progress, then I'd hate to see the revolutions that retrograde economics would produce.

And pay by the hour will kill all in time.

cheers jacko

Reply to
Jacko

Shrink wrap is dead for environmental reasons. The service industry model neglects to take into account that something has to be served, and all that is served has to be paid for, by somebody, with something.

The trickle down model suffers from the lack of understanding of the size of the glasses of greed.

cheers jacko

Reply to
Jacko

And why should it? To protect the dolearitariate of the cities?

cheers jacko

Reply to
Jacko

If you are talking about a safety-critical system (or other high-reliability system), and end up in court because of a software failure, then of course you will have to justify the decisions made during development. That applies whatever development tools you use - commercial or open source. The procedures you use, and the level of your own testing, verification and documentation will depend somewhat on the tools you use.

Of course, if you are up in court because your shareholders are suing you, you may be asked to justify paying tens of thousands per seat for a compiler and licenses for a validation suite for use in making a pocket calculator. Not every project has the same requirements of validation. As always, you must choose appropriate tools for the job.

But be clear that open source software, open source compilers, and open source operating systems are very much in use in safety-critical, high-reliability, high-security and mission-critical systems. It is certainly true that open source software (both for development tools, and for the software itself) is more prevalent on larger systems than smaller ones.

Hardware companies like Intel, Atmel and Altera are not going to compromise their reputation for robust and reliable devices by supporting and encouraging poor quality compilers and other software - yet they all strongly support Linux (both as a development platform and as a native platform) and gcc - in some cases as their *only* compiler. Software companies like Metrowerks and Wind River are not going to compromise their reputations, yet they strongly support open source with development tools and Linux systems. Wind River happily recommends its real-time enhanced Linux for aerospace, defence, and automotive applications.

You might not like it, but the world today relies on open source software, and gcc, just as much as it relies on closed source software.

I note that your website runs an open source webserver on an open source operating system compiled with an open source compiler. It's not safety critical, but I'd guess it's fairly important to your business. I also note that on that site you sell embedded products running Linux, and development tools that run on Linux. I also note that you have a compiler validation service - perhaps you could make some money by offering to run validation checks on gcc rather than just bemoaning the lack of such checks.

Reply to
David Brown

Shup totalitarianism.

cheers jacko

Reply to
Jacko

"satisfied users".

One who gets paid instead of having to buy ;-)

Reply to
Jacko

All my opinions are my own as well as the opinions of K Ring Technologies. It does help being the only charitable director, i.e. the only worker for...

formatting link

cheers jacko

Reply to
Jacko

Actually you have to justify them at the time of certification.

You assume that GCC is cheaper than buying tools. Compilers are less than $4000 per seat, so where is this emotive "tens of thousands per seat" coming from?

However, as you point out, validation suites are not cheap, and you will need one to run on your GCC compiler to see that it even works.

Commercial compilers costing 3K have already been tested.

Such as?

There are a couple. They also run on Windows.

OK... I will validate GCC... however the cost will be something like 50 times more than the cost of validating an IAR or Keil compiler. You pay, we will validate it. Send in an order. And your compiler. NOTE the validation will only apply to THAT particular GCC binary.

--
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills  Staffs  England     /\/\/\/\/
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Reply to
Chris H

That sentence read "for a compiler and licenses for a validation suite"

I must admit, the fact that all the C/C++ in Debian Stable, which I am running, was compiled with the current version of gcc shipped in Stable, leads me to think the compiler works. But I'm not used to compilers where, having installed them, you have to prove they work before you can use them.

I am used to a compiler where I get the test suite containing tests for the bugs that were known in previous versions of the compiler, though.

ADA?

--
rich walker         |  Shadow Robot Company | rw@shadowrobot.com
technical director     251 Liverpool Road   | skype: rich_at_shadow
need a Hand?           London  N1 1LX       | +44 20 7700 2487
http://www.shadowrobot.com/hand/            | +44 7866 561014
Reply to
Rich Walker

Why not? It happens all the time in open source software. There were still patches to the Linux 2.0 kernel when 2.6 was long released. And it's not uncommon for commercial software either: Windows 7 will have been out for a long time while there are still fixes for XP.

Using a version 1a that fixes just one bug makes perfect sense. Maybe you do not want version 2's fixes, because those break your program. Or you don't want version 2 because that no longer supports the chip variant you're using.

Of course, everyone but you is incompetent. I forgot. Still, I was quite certain that the compiler I use was in error when it decremented the 'for' induction variable before the loop body. I was also quite certain that it was in error when it removed a boolean loop guard variable which I did in fact modify in the loop.

This is the very same situation I get when I fix a bug in my embedded program. I might have fixed it, or I might have made it worse. My boss pays me for knowing how to fix it in the predominant number of cases (he is realistic and doesn't pay me for being absolutely perfect).

Remember, when I find a bug in an "officially tested" compiler, I have just *proven* that the testsuite does not guarantee correctness. I have just proven that it is merely a stochastic set of tests checking some random invariants of the compiler, which happened to miss my case.

gcc's testsuite does the same. It tests some random invariants of the compiler.

My current score is about six bugs in commercial compilers versus one in gcc, the latter with a dubious template construct, the former with everyday code.

And the gcc folks didn't bill me for the upgrades.

You know what? I don't care about formal test results in glossy brochures. I care that the compiler works. I care that it understands my source code in the same way as I do, and generates code that works. I care that it honors the platform ABI.

Sure, Plum Hall may make this more likely, but it does not guarantee it.

The point is not doing it without reason. The point is doing it when everything fails, and when the alternative is to scrap the project completely.

If I have a big program that compiles only with gcc-1.37, I take gcc-1.37.tar.Z, try to compile it on my new machine, maybe port it, and then use it (actually, I compiled it two years ago; I didn't have to change a line to compile it on Cygwin, which didn't even exist when gcc-1.37 was written).

If I have a big program that compiles only with Turbo C 1.0 from the same era, I'm lost. Even if I manage to get a copy of Turbo C, 64-bit Vista won't run it. Okay, I can start emulating with DOSBox etc. Fortunately it doesn't have a dongle...

The same can happen with commercial software.

Stefan

Reply to
Stefan Reuther

From reality. Even Microsoft Visual Studio can go far beyond your $4000.

Commercial compilers costing $10K+ have already compiled bugs into code which gcc compiled correctly. So what?

Stefan

Reply to
Stefan Reuther

Thanks. That makes sense.

What about in cases of an evaluation copy? When no money has been exchanged, when the intent is merely to allow someone to evaluate the version, and when the product is hampered against commercial use anyway, it seems very difficult to see what the basis might be for prosecution. This might even be considered "political free speech" in the US, if one could make the case that they are working by publishing results to force/encourage political/legal change here -- which is the most protected form of speech that exists in the US.

Yes, that made abundant sense to me from the start.

In addition, I can't recall how many times I've had compiler vendor support folks tell me that something is done a certain way because that's what counts in their competition with other tools or with their customer base. So I suppose everyone seems to have excuses for deviations.

Recently, Tim Wescott posted a link to a PDF on volatile which claimed that it wasn't handled correctly on just about every compiler they tried it with. I think the paper focused more on GCC, though, so maybe I'm not doing my argument any favors by mentioning that paper.

And I happen to like the addition of binary constants. There are times when they are MORE readable, and I see no reason not to support them. I wonder why the standards process failed application developers here.

A GPL'd tool like Plum Hall could be developed, though. In fact, if the standards committees were sufficiently concerned about rogue development by non-participants in their processes, I would suppose they would be perfectly able to fund the provision of one into the public domain. And/or make a public case asking for donations to do that. I'd contribute.

I'm not familiar with CodeSourcery. I'll have to look it up.

Jon

Reply to
Jon Kirwan

I've certainly resolved issues with GCC by searching the web. Sometimes I may have felt like the only one using a particular setup (being on Cygwin didn't help :-), but there are a lot of patches out there, and earlier/later versions to try out.

Of course, if you've got a commercial compiler that has a bug, chances are you won't get a fix for it either. At least not quickly, or without porting to a later, slightly incompatible version of the compiler/library, or dealing with new bugs in the new version, etc. It's usually more cost effective to just change the source code to avoid the compiler bug, a tactic that works well with GCC also.

Reply to
Darin Johnson

... snip ...

I don't know if you are familiar with the Pascal test suite, which used to be publicly available but disappeared. I no longer have it, and am speaking from 30-year-old memory.

That was allied with the language specification, in that tests were numbered to agree with the standard, followed by a serial number when multiple tests were needed for an area. It was delivered as text, together with code to extract individual tests as source, compile them, and compare results. Each run could be required to fail or succeed, tied to the actual test number, so the testing code could act accordingly.

I suggest that a useful product could be generated from the existing gcc suite, starting by extracting tests, numbering them appropriately, injecting them all into a master text file, and preparing the auxiliary code as above. There should be no need for non-standard code outside of the execution set, and this could be limited to a few suitable functions. Once the elements are ready, extraction and application of further tests should be relatively easy.

I want to restrict this to testing for standard compliance. Further aspects can be added later, provided suitable facilities (mainly for selection) are present in the auxiliary code.

So far provisions only need be made for C90, C95, and C99. Another will be due sooner or later. C95 may not be needed.

--
 [mail]: Chuck F (cbfalconer at maineline dot net) 
 [page]: 
            Try the download section.
Reply to
CBFalconer

... snip ...

I don't think Mr. Brown really is recommending that they do such work. The point is that, if the source code for the original compiler is (or was) available, and the firm had taken the appropriate precautions, they could handle the problem (or find someone to do so for them).

--
 [mail]: Chuck F (cbfalconer at maineline dot net) 
 [page]: 
            Try the download section.
Reply to
CBFalconer

That has nothing to do with the problem. If they don't understand, they probably cannot fix. However, without source they certainly can't fix.

And who are you to arbitrarily consign somebody to the 'fools' bin? And this is one more reason to have a GPL'd version of a testsuite that checks standard compliance only.

--
 [mail]: Chuck F (cbfalconer at maineline dot net) 
 [page]: 
            Try the download section.
Reply to
CBFalconer

In message , Jon Kirwan writes

It did. The paper basically said that on all the [gcc] compilers they tested, volatile was handled incorrectly; the errors varied from minor version to minor version of the same compiler for the same target.

Before you ask the commercial compilers I know of do test volatile for correct behaviour.

But it hasn't been. The closest anyone has got to that is the splint tool, and that has apparently been dead for a couple of years.

This has caused a couple of FOSS devotees to point out that it proves static analysis is of little use, or it would have been developed further. Interesting, as all the serious testing and case studies show the opposite.

Nope... None of it is paid work. You get to do it for FREE. (Just like FOSS)

Go on then start one

You will need some 65,000 plus tests and show exactly which standards you are testing against.

You will also need to have it accepted by people like TÜV, NIST etc.

--
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills  Staffs  England     /\/\/\/\/
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Reply to
Chris H
