Richard Stallman is responsible for the shrinking economy

In another post I told about modifying a 68K cross-compiler. It actually ran on SPARCs, and I had the company I worked for order another SPARC and, while we were at it, another C compiler as well, so that my work wouldn't backfire on other production work.

It was so problematic to get the license server to allow me to use that C compiler that we threw the license in the bin and used GNU instead. The cross-compiler was GNU anyway.

Horror stories like this abound. I simply find it not credible when a vendor says that clients in general don't worry about dongles.

--
Albert van der Horst, UTRECHT,THE NETHERLANDS
Economic growth -- like all pyramid schemes -- ultimately falters.
albert@spe&ar&c.xs4all.nl &=n http://home.hccnet.nl/a.w.m.van.der.horst
Reply to
Albert van der Horst

As in economic value on a per-copy basis? Yes.

Let us have some IP in a free market. We have Madonna sing a song. She is a good singer, but there are more like her. It takes an hour. Pay her 10,000 bucks; I'd say that's well paid, about as much as a first-class escort.

Now distribute the song, say over the Internet, to about 10,000,000 people. They pay 0.1 cent each plus distribution, which is virtually nil. It could be 0.01 cent for 100M people. So IP does not have zero economic value, but it is vanishingly small, especially in the long run and for a wide distribution. (That is why it has to be beefed up a bit - not too much, please! - by copyright law.)

There is a free (as in near-zero-cost) alternative: Ubuntu. So the answer is: "If MS wants to stay in business? Yes." Let them earn their money, for once, by giving support.

Simple free market logic.

Greetings, Albert

Reply to
Albert van der Horst

But Walter, I can't count the number of times I've seen at least one compiler vendor (one I'm thinking of now, whom you know well, I suspect, and who is also a writer/owner for whom I personally have a lot of respect, as I do for you) publicly state that C99 is a mess/disaster and that he won't support it. I don't know all other compiler writers' opinions, since I haven't done a survey of such people, but I at least grant that he is making a point born of solid experience. I know he isn't the only compiler vendor who has made such a comment, either. So the leadership of more than one commercial company is saying so.

Are you improperly laying this burden entirely upon a group that doesn't deserve to be singled out?

That would leave me out, obviously. I can only read what is published. In science, it's "publish or perish" for a very important reason.

That seems to come from the perspective of a commercial vendor, though. At a university, the motivation is different: you either publish .... or else. I think of basic research as the kind of thing that universities are there for, and although some corporations do indeed do basic research, where and when it profits them to do so, it's not their primary purpose.

Jon

Reply to
Jon Kirwan


You miss the point. It is a service industry, dude. Shrink wrap is dead.

Greetings, Albert

Reply to
Albert van der Horst

Successful memes do. And Stallman's vision helped to create such a successful meme that has propagated and replicated well. You are right that it isn't going away any time soon.

Jon

Reply to
Jon Kirwan

It exists. gcc can be told to comply with K&R I, C90, C95, C99, and 'GNU C', and various levels in between. Almost everything can be switched on and off from the command line.

--
 [mail]: Chuck F (cbfalconer at maineline dot net) 
 [page]: 
            Try the download section.
Reply to
CBFalconer

... snip ...

You have a funny definition of 'rude'. I remain polite most of the time, give references, and say please. I guess to you 'rude' means any disagreement.

Reply to
CBFalconer

If it was GNU, it was under the GNU license, and there was no need for you to pay for, or have, a license. Assuming I have the correct situation in mind, I think you should have reported it to GNU and let them deal with the misuse, possibly through lawsuits.

Reply to
CBFalconer

So how many times do you need to be told that someone disagrees with you? You knew (you have known it for many years) that this guy *knows* how you want him to post, and he happens to disagree with that sometimes. But accepting someone's disagreement appears to be too novel an idea for you, so you copied your message to him yet again.

And you claim not to understand that this is rude. Yeah. So you have never been taught this: it is OK to ask a person for something *ONCE*; if you don't get what you asked for, just *SHUT UP*. A simple algorithm; most people learn it before they reach 7.

Dimiter

Reply to
didi

There are people still using commercial for-pay compilers and tools. However, the number seems to shrink as more people learn that commercial does not mean better. In the past, a lot of sites were worried about using open-source variants, with the typical concerns about lack of support or similar issues. That is, people expect a better product when they pay more, and have to learn that quality and price do not correlate.

Perhaps a bigger problem lies with the commercial vendors who do not make competitive products. If they're worried about the competition, they can make better products in response. The innovations from commercial vendors tend to be things like IDE integration or miscellaneous utility tools, because these are easy to demo, whereas users would rather see better support, better optimization, and better speed.

I kid you not, at a past place of employment, we switched from one commercial PowerPC compiler to GCC+make, and our builds went from overnight to less than a couple of hours, and we got faster code.

Later we tried to migrate to GDB, and our debugger vendor refused to let us have an unattended trial of their newer JTAG solution, and actually strongly hinted that we would have to pay (on top of the $5K/seat) for an engineer to help us while we evaluated the product, if it took more than an afternoon.

I fixed a bug in one commercial RTOS, then opened a bug report with the vendor, and listed the file and line number and fix. Their response was along the lines of "if you've got the fix already, what do you want us to do?"

Sure, it takes some of my time to support GCC and GDB, but the payoff to the company is huge in return. Consider that the vast majority of sites rarely get any help from their support contracts. Especially if you are a small site with few plans to expand, the vendors will not waste much time on you. These support contracts end up as just feel-good insurance plans that never pay out. All that most people get from these contracts is permission to upgrade to later versions; exactly what you get from open source for free.

I know of sites that would be quite willing to pay for commercial products even if they only offered marginal improvements over open source. But when the open source is better in almost every way to commercial products, is it any wonder people are looking more seriously at it?

I think the situation is a bit similar to the big 3 auto makers. They offer an inferior product, then rely on our patriotism to keep buying from them, and predict doom and gloom if the customers look elsewhere.

Reply to
Darin Johnson

Also, Walter is a very well-informed person. I would rather have him contributing and posting than not contributing at all. Besides, as he has been online longer than anyone else here, I think he has earned the right to do it any way he pleases.

--
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills  Staffs  England     /\/\/\/\/
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Reply to
Chris H

It doesn't matter how many there are. What matters is from how many pre-existing compilers those were built, recursively. Also, given a compiler design with an architecture-independent intermediate object format (like GNU BFD), Ken Thompson's back door might propagate to cross-compilers as well.

How so? Once the hack is there, code revisions don't matter much. (As long as the compiler can recognize itself.)
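For readers who haven't seen it, Thompson's construction (from "Reflections on Trusting Trust") runs roughly like this; a pseudocode outline, not the actual hack:

```
compile(source):
    if source looks like the login program:
        emit a login binary WITH a back door
    else if source looks like the compiler itself:
        emit a compiler binary WITH both of these checks   # self-propagating
    else:
        compile normally
```

Once one compiler binary carries this, recompiling a perfectly clean compiler source with it reproduces the hack, which is why later code revisions don't help.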

--
Made with Opera's revolutionary e-mail program:  
http://www.opera.com/mail/
Reply to
Boudewijn Dijkstra

In message , Boudewijn Dijkstra writes

Many compilers, but none that are vulnerable to the back-door method. In theory any commercial compiler could be, but the hack would have to be installed by the vendor, and the vendor has no idea what RTOS you may or may not be compiling. It has virtually zero effect. Even if MS did it to MSVC++ etc., it would have a very limited effect, because no one rebuilds the MS OS with it.

The Thompson back door worked because the compilers would be used to compile UNIX. In the same way, most if not all versions of Linux are compiled by GCC... So if you can infect one GCC, it will spread to other builds of the compiler and to the Linux OS. The problem is that, unlike a commercial compiler company doing it, ANYONE can introduce it into the compiler, in many places, asynchronously. There may be one back-door hack out there; there may be a couple of dozen variants.

Reply to
Chris H

In gcc 4.3 there is the "-fgnu89-inline" switch, which controls the inline compatibility: by default (from gcc 4.3 onwards) in C99 mode, you get C99-style inline. gcc has a great many compatibility switches, but obviously the developers must be conservative about changing defaults.

Reply to
David Brown

That's true, and you can see it in the change from GPL2 to GPL3 in many FSF projects. Theoretically, it also allows the FSF to release all new versions of gcc under a commercial-only license if they wanted - but previous GPL'ed versions cannot be affected.

You can see that in the Linux kernel, which is GPL2-only (rather than the usual "GPL2 or newer"). It could not, practically speaking, be changed, even if Linus and the other key developers wanted to.

Reply to
David Brown

The reason copyright assignments are required is to prevent someone from claiming ownership of the code and closing the source, i.e. what SCO tried to do with the Linux kernel. As you noted yourself, the unmodified GPL license already allows you to switch to a later version of the license. I'm afraid to even imagine the circumstances under which the FSF would switch over to a closed license.

-a

Reply to
Anders.Montonen

SCO tried to claim that they owned the copyrights to some code, which others had released under the GPL - they claimed that since they owned the code, others (IBM in particular) had no right to distribute it under the GPL. As is well known, there were two huge holes in their case - first, they didn't own the copyrights, and secondly, they distributed the code under the GPL themselves.

Being sure of the copyright ownership is, as you say, important to avoid this sort of thing. But assigning the copyrights to the FSF doesn't change the legal position - it just makes things a little easier in court, since fewer people are involved. In particular, if I write some code which you take, change the copyright notice on, and release as GPL, it doesn't really matter whether you transfer copyright to the FSF before or after it is released as GPL. In the first case, I claim you didn't have the right to release it as GPL; in the second case, I claim the FSF doesn't have that right (and also that you didn't have the right to transfer copyright ownership).

The way to protect against this sort of thing is to be very sure of where the code comes from. FSF have always been very careful, as have the Linux kernel developers (except perhaps in the early days). Having everything available in public, along with all the source code repositories, mailing list archives, etc., makes it easy to check the history of the contributions.

Of course, exactly the same thing applies to closed source licenses - it's just that disagreements about ownership are harder to discover (you can't just read your competitor's source code to see if he has copied from you), and often settled behind closed doors.

Also note that once code is released under the GPL, the owner cannot "close the source". If they release version 1.00 under the GPL, they can start using a closed license for 1.01 (or for a re-release of 1.00), but they can never take away the GPL'ed 1.00 from anyone. Others can then freely fork the project and continue it under the GPL (they can't change the license, since they don't own the copyright), with a possible name change to avoid trademark issues. This has happened with many projects.

Reply to
David Brown

I thought that this was just an MS-inspired bit of FUD to try to stop Linux?

Now that is logical and the argument many employers could probably use. However as soon as the lawyers get involved it becomes a complete quagmire... a very expensive one at that.

This is probably why it is rarely done. Also, doing it against open source would probably open up too many other problems.

Agreed.

Quite so. However, the problem in the case I was suggesting is that those actually doing the work may not be able to put it into the public domain or under the GPL without, in theory, company permission.

It's never happened, it says here on this out-of-court settlement agreement :-)

Quite so otherwise the whole thing becomes unworkable.

So the owner can fork off from V1.0 GPL and make V1.1 closed, etc., and anyone else can take V1.0 GPL and make an open V1.* GPL of their own, which will not be the same as the closed versions.

Reply to
Chris H

I don't think so

The C90 standard was written 20 years ago to meet the needs of computers as they existed at that time. There is an argument for C90 to support old applications written in the decade between C90 and C99, but those arguments don't apply to new applications written using C99 and the TRs published since its release.

I pointed out that a lot of university research uses the GCC tools as a baseline. Although there are publications coming from that work, the amount of novel basic research from this combination is low. At the same time there are serious research topics that have seen very little university-based research: instruction sets for machine-generated code, and single-source multiprocessor-based applications, to name a couple.

All of these will impact the next decade of processor designs.

Regards,

--
Walter Banks
Byte Craft Limited


Reply to
Walter Banks

... snip ...

Oh yes it does. Each compiler has to be prepared to detect and hack each other compiler, or linker.

Reply to
CBFalconer
