What's the story with the "end of XP"?

The embedded world is very big - you specialise in the high reliability and safety-conscious areas. For much of the rest of the embedded world, things like price or time-to-market are more important. And what applies to the end products applies to the tools.

Let's take a concrete example. Freescale is one of the biggest suppliers of microcontrollers to the automotive industry, and much of the software on these micros is developed using Metrowerks - a large, expensive (for the full versions) commercial development suite. The full pack for a given target might take around 1 GB of disk space on a workstation. There is no way you could possibly convince me that the same level of care and quality control is applied to that 1 GB of development tool software as to a 20 K engine controller written using that tool. I certainly would not want a car with an engine controller that hangs or crashes as often as Metrowerks does (it's not bad software, but it is not perfect).

It very much makes sense to lower the standards for some components. Higher quality standards mean longer development time, more people involved, more skilled developers, and higher costs. You use the appropriate quality standards for the job - in a car, you might spend 50 times as much per line of code in the engine controller as you do in the car radio. In a development suite, you spend much more effort ensuring the correctness of the compiler and the library than you do on the debugger or the project manager.

Some sorts of commercial software have the source available under certain conditions, but not all. For a development tool, it's not uncommon to be able to get access to the library source - but source for the compiler itself is likely to be much harder to get hold of. And there can be all sorts of restrictions involved - an NDA can make life very difficult if you are required to show the source to a third party for verification. By the time you have included requirements of being able to show the source to whoever you want, to modify it as you want, to run it on different targets as you want, and that your rights are not time limited and cannot be restricted by the supplier or anyone else, you have come pretty close to requiring an open source license. Obviously there are relatively few times when you absolutely need such freedom, but it is far from inconceivable.

I don't really think you understand what I am saying. If an open source license is an absolute requirement, then any closed source software is useless for the job. You either start with whatever open source software is available, or you start from scratch.

That's because you live in a small part of the embedded world, where development is done with meticulous care because mistakes cost lives - purchase prices, licensing details, etc., are minor details in comparison. You can live with restrictive NDAs because that is part of that sort of development. But there is a wider world of embedded development, and a much wider world of software beyond that. This thread is about XP, so consider embedded Windows as an example. It is possible to get the source code, if you are a large enough customer, have enough money, and are willing to put up with the restrictions involved. If our company were looking for an embedded operating system with a GUI and the ability to easily connect to other computers, then we could not even consider embedded XP - the source is not available to us under terms we can work with. It might be that there are alternative commercial choices (say, QNX), or it might mean using an open source solution.

And you think it helps by replying with equally irrational and religious arguments, along the lines of "I know someone once who saw some source code - they spent so long reading it, it cost their company money" ? You'll only encourage more extreme replies. I've tried to convince you, by (hopefully) rational argument, that the best tool for the job can often be open source - if you want to keep harping on about FOSS fanatics, then you can do so without my further involvement.

It's taken a lot of blood, sweat and tears to draw out that admission.

Being commercially successful in the software world is the same as commercial success in any venture - it does not require quality. It requires being more popular than alternatives. If you can do that in other ways than spending money on quality, then you get more profit. In an ideal free market, where people can choose and mix and match, then quality products have a better chance (though they may still lose out to lower priced competition).

That's one of its many failings, not its only one. Windows is certainly usable for many purposes, and can often be the right choice (I am using it now), but its commercial success has very little to do with its technical quality as software.

With 20/20 hindsight, and possibly poor foresight, this company knows it made a bad decision. Making decisions like that is partly skill and partly luck - this is hardly a basis for a philosophy about open source.

As I have always said, being open source does not necessarily make a given piece of software the best choice. But in this case, you are comparing one piece of software with source to a different piece of software with no source, and leaping to the conclusion that it is the source that was the cause of the problem. It looks to me like the software they chose was not the best for the job - the source code license is pretty irrelevant here.

The only time I have seen wide-ranging claims that all commercial software is bug-ridden is in *your* posts, with your accusations that this is a common attitude of FOSS devotees. Most FOSS devotees I know will happily tell you the benefits of their choice of software - but they believe that *you* should have the freedom to choose what software *you* want. They'll tell you about bugs in particular closed source software, and they'll tell you why the open source development model can help avoid certain problems. So let's have a little less of this hearsay about what you think some fanatics say, thus keeping the conversation more level-headed. If someone posts claims you think are fanatical, respond to *them*, not me.
Reply to
David Brown

This implication is a reach. My experience with volume embedded systems is they would rather spend money on tools and tool support than spend money on production. Support gives them access to the larger body of experience using the particular part. Good tool vendors give them access to effective code generation.

I know you are illustrating a point, but the current engine controllers are not using Metrowerks tools for code generation. The automotive industry in general uses commercial tools from independent tool vendors. Tool companies tied to silicon companies lose the attention to detail that independent companies have, and lose access to the solution approaches that providing tools for competitive architectures brings to independent tool developers.

Walter Banks

-- Byte Craft Limited Tel. (519) 888-6911

formatting link
email snipped-for-privacy@bytecraft.com

Reply to
Walter Banks

First off, I see you aren't disagreeing with me about my clear suggestion above. So I take it, you grant me it. Thanks. It wasn't an accusation of sorts, just my take on why you consistently post the way you do, as though you really do not truly understand what it is to be in business as an applications coder and to have some measure of control over your own business.

Unfortunately, I don't even know what FOSS means. Maybe I could address myself to your suggestion if I did. So if you would, explain your meaning here, as it applies to my comment. As it is, I've no idea what you are trying to imply by it.

Sometimes. A fact about the above comment I made was that I was just a mere programmer at a company and really had no power to ask for the source code, nor truly much reason at the time, to be honest. There would have been no way at all that I could have asked my company to engage Digital Equipment about getting the source. Yet, I was skilled at operating system work (I'd developed a timeshared system in the mid-1970s and I'd worked on Unix v6 a little later on) and I was quite capable of (and interested in) finding problems in the VMS code, if they should come up. And, in fact, I had two cases come up where I was put in contact with Digital coders who were VERY interested in what I'd found and who put out immediate corrections (next day, for me) to the code. It was a very nice experience to have gone through.

If I were to make a point about this aspect you mention, it is that I believe having source in the public is VERY MUCH better than controlling it through NDA, over the long haul. As I clearly mentioned before, and you chose to ignore, this is kind of like publication and peer-review in science. Part of a generally healthy self-correcting process, over the long haul.

I use all kinds of tools in my work, though. Not just the tools I can get source for. But I can assure you that I feel better knowing the source is out there to look at, so that I have some measure of control over my access to looking for problems when I'm motivated to do that.

An example to highlight the problems comes from a case where I was working on a project using Microsoft's version 5.0 C compiler "back in the day." It had a serious problem (a bug that could not be worked around) where it would crash when faced with a certain small snippet of C code. In this case, it was a very common and inescapable snippet, and I was able to reduce it down to just a few lines. Avoiding this particular formulation, if I chose to do so, would have required an extensive rewrite of my application, which was already well in progress. I contacted Microsoft about it and finally got confirmation that the bug was a bug about three weeks later. (By that time, I'd already had to spend the effort [which cost me personal effort, which I care about, and cost the client inescapable calendar time as well] to work around the difficulty.) When I later talked to Microsoft about when a fix might be coming (I used their compiler elsewhere, of course), I was told something I decided to rephrase to them to make sure I understood... I said, "So you mean you cannot assure me that it will be fixed in the next revision, or any revision ever?" They said, "Yes, that's about it." I then said, "And you cannot even tell me when there will be another revision, not even approximately?" "Yes, that's right," they added.
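The actual MSC 5.0 snippet is long gone, but a reduction of that kind typically ends as a few self-contained lines of unremarkable C. Purely as a hypothetical illustration of the *shape* of such a reducer (this particular code is fine under any modern compiler; the names are invented):

```c
/* Hypothetical shape of a minimal compiler-bug reducer - NOT the
 * actual MSC 5.0 snippet, which is lost to history.  The point of a
 * reduction is to strip the failing construct down until nothing
 * inessential remains, so the vendor can reproduce the crash from a
 * handful of lines instead of a whole application. */
struct pair { int lo; int hi; };

int pick(struct pair p, int flag)
{
    /* A conditional expression over struct members - the sort of
     * common, "inescapable" construct the post describes. */
    return flag ? p.lo : p.hi;
}
```

Once a bug is isolated this tightly, there is no argument left about whose code is at fault - which makes the three-week confirmation delay all the more frustrating.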

It's a lifetime of experience that informs me that having the source available in public, whether proprietary rights are retained or not, helps me feel more comfortable. If they own the code, that's fine. If I find a specific problem and suggest it to them, I have a better chance of imagining that it will be included at a later time. So I simply prefer that situation, since it places a little bit of extra control over my own life in my hands.

Who cares about them? If you are so worried about unskilled eyes, I think you have really got a big problem you need to deal with. This is simply crazy sounding, to me, and undermines anything else you might add. Sadly.

So what?

Jon

Reply to
Jonathan Kirwan

And in a Barbie Doll, saving pennies per doll while raising the number of dolls that work incorrectly by 5% is considered a good deal. We shaved five cents off the electronics for one toy by ripping out the error correction code for the IR link and thus being able to use the next-smaller RAM size; 128 bytes instead of 256 bytes.
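For scale: the kind of error detection that gets cut in such trade-offs can be as little as a one-byte checksum over each IR frame. A minimal sketch along those lines (illustrative only - the actual toy firmware is not public, and these names are invented):

```c
#include <stdint.h>
#include <stddef.h>

/* Illustrative sketch of minimal IR-frame error detection - a one-byte
 * XOR checksum appended to each frame.  Even a check this small costs
 * code and buffer space; cutting it (and the retransmit logic it
 * implies) is the kind of saving that lets a design squeeze into
 * 128 bytes of RAM instead of 256. */
uint8_t ir_checksum(const uint8_t *frame, size_t len)
{
    uint8_t sum = 0;
    for (size_t i = 0; i < len; i++)
        sum ^= frame[i];          /* XOR of all bytes */
    return sum;
}

/* Receiver side: a frame with its trailing checksum byte is valid
 * when the XOR over the whole thing comes out to zero. */
int ir_frame_ok(const uint8_t *frame_with_sum, size_t len)
{
    return ir_checksum(frame_with_sum, len) == 0;
}
```

Without it, a corrupted frame is simply acted on as received - hence the extra 5% of dolls that "work incorrectly."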

--
Guy Macon
Reply to
Guy Macon

You make your Barbie dolls work? Isn't that illegal toy exploitation?

It's called "value engineering." I had to go to a seminar once.

The feelings of the poor little girl whose Barbie doll quits working don't enter into the calculation. She'll probably become frustrated and abuse her kitten, and now you've added animal cruelty to the list of crimes.

--
Al Balmer
Sun City, AZ
Reply to
Al Balmer

... snip (large) ....

And this one paragraph really covers the subject.

--
 
 
 
                        cbfalconer at maineline dot net
Reply to
CBFalconer

Ugh. Sounds like a horrible environment. I am guessing that you are glad to be out of there.

--
 
 
 
                        cbfalconer at maineline dot net
Reply to
CBFalconer

For high volume products, it is certainly worth spending money on getting the best tools for development - your development costs are spread over a large number of units. But the situation is not quite the same as for safety-critical work - it's a different sort of "best". For high volume consumer development, your priorities are tools that let you work quickly, and that generate small and fast code (equating to cheaper and slower micros in the end product). For safety-critical work, your priority is your certainty that the compiler (and libraries) generate correct working code. The same compiler suite may be "best" in both these categories, but not necessarily.

I am not arguing against spending money on good tools when appropriate, or that good tools are not available - I am only trying to explain that there is no general rule that "closed source is good quality, open source is poor quality" as Chris seems to believe, and, further, that top quality is *not* always something to strive after.

In the world of embedded development tools, especially at the higher end (be it for volume or for safety), you get much closer to "you get what you pay for" than in many other markets, and certainly much closer than in most software areas, because there is healthy competition and customers emphasise the need for technical quality. But it is *not* the case in much of the software world. I've seen people who use large, expensive, commercial embedded development tools that they hate - but they can't change tools without a large cost, because all the libraries, linker setups, etc., are incompatible with other tool suites for the same target. Here, price can be a rough indicator of quality, but is far from a guarantee.

As you say, it was only an example - the details are fairly irrelevant. But perhaps you, as a developer of high quality development tools, can answer a simple question - do you take equal care, and go through as many code reviews and automated testing procedures, with all parts of your development tools? I suspect you would be rather more worried if a bug caused your code generator to produce incorrect code than by, say, a spelling mistake in your help files. My point is, code standards and code quality cost time and money (quality that is too low also costs, of course), and must therefore suit the task in hand.

mvh.,

David

Reply to
David Brown

The difference that I have seen between generic high volume embedded systems and safety critical work is in the testing. In safety critical work, product testing is a formal discipline. Code generation tool support requires that product changes be documented: the change itself, the root cause, and the impact of the change, including side effects on products created using the tools. We work very closely with our customers to achieve a comfortable working relationship for exchanging critical information.

Fully open source (as opposed to the availability of source licenses) would do very little to help customers: their expertise is in the application, and our expertise is in the tools - both their current state and the history of changes, and their relationship to code generation for a particular piece of silicon. The job of good tools is to generate code that will run on the target silicon, including accounting for silicon product defects. Full sources rarely help the customer solve tool problems - just look at the problem of understanding the inner workings of a relatively simple compiler like GCC and relating that information to application implementation on a specific piece of silicon.

Related to the general argument of open source vs commercial tools, I am seeing many of the open source tools losing ground against their commercial counterparts. GCC, for example, still has little better than 20-year-old code generation technology, producing okay but not spectacular code. There is a level of detail that is missing; most GCC compilers have been implemented for families of ISAs and not individual members. GCC-based compilers have done very badly when used for processors with small register sets or a low number of registers, RISC instruction sets, and multiprocessor single-source applications.

The commercial companies have a critical mass of personnel supporting the silicon that our customers are using, and very often tool developers are part of the support structure. Commercial companies have been stereotyped as expensive and ineffective, something that just doesn't stand up to close examination.

The interfaces to most commercial tool libraries are usually well documented. Most if not all commercial support library sources are either provided (Byte Craft ships sources to our support libraries) or are available as separate source licences. I don't actually believe that price is an indicator of quality, though there is a relationship. High volume customers usually use benchmarks to measure quality, something we encourage.

Our fundamental product is code generation; IDEs are shipped as part of our products so our customers have a full solution. BCLide goes through regular product review, as do documentation and FAQs. Application requirements evolve, as do APIs; both get a lot of attention. This is the business we are in.

I think you will find that we do what is appropriate for each of the parts of our tool set. Our (Byte Craft) tools conform to industrial standards of interface and status reporting, so they will correctly function with other tool sets.

Walter Banks

-- Byte Craft Limited Tel. (519) 888-6911

formatting link
email snipped-for-privacy@bytecraft.com

Reply to
Walter Banks

Walter, let me stop you here. I'm actually interested in your comments on safety critical work -- more particularly, from my exposure to medical devices.

In my modest experience in this area, I need to test every code edge. This includes all code edges found in any operating system that may be included. It does NOT include those in the compiler, obviously, as that isn't in the runtime object. But it would include all the libraries that are linked.

I agree that a compiler being open source would not be a necessary help here, except perhaps where the compiler generates its own code edges as it compiles. But I have seen cases where a compiler does generate many of them, internally, and where they aren't visible unless you examine the assembly output. One that comes to mind was about some special handling for selector/segment register use with __huge pointers. Does that kind of thing come up in your discussions regarding safety critical support?
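A compiler-generated edge of that kind can be completely invisible in the C source. As an illustrative sketch (this is not the __huge-pointer segment case Jon describes, just a more familiar instance of the same phenomenon), a dense switch is commonly lowered to a jump table guarded by a compiler-inserted range check - and that range check is a branch that exists only in the assembly output:

```c
/* Illustrative only - not the segment-register case described above.
 * A dense switch like this is typically lowered to a jump table with
 * a compiler-inserted bounds check, roughly "if ((unsigned)cmd > 3)
 * goto default_case;".  That bounds check is a code edge visible only
 * in the generated assembly, never in the C source - exactly the sort
 * of edge that matters when the test discipline requires exercising
 * every branch in the runtime object. */
int dispatch(int cmd)
{
    switch (cmd) {
    case 0:  return 10;
    case 1:  return 20;
    case 2:  return 30;
    case 3:  return 40;
    default: return -1;   /* reached via the hidden range check */
    }
}
```

Source-level coverage tools will report this function fully covered without ever confirming both outcomes of the hidden check; only inspection of the assembly (or object-level coverage) sees it.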

Jon

Reply to
Jonathan Kirwan

I suspect that what you have seen is heavily influenced by the fact that you work for Byte Craft, and thus mostly see high volume embedded systems and safety critical systems using Byte Craft tools. Your products are *much* higher quality than the tools I used while creating high volume embedded systems for Mattel. The tools used to develop for EMC and SunPlus processors (or whatever is cheapest this year) are very crude indeed.

Other than that, I pretty much agree with your comments. There are some really fine commercial embedded development tools out there. The volume of users and developers really starts to favor OSS when the target system is a PC, and *especially* when the comparison is to a Microsoft product. No amount of testing will bring a system written in Microsoft C++ on (closed) embedded Windows up to the quality level of the same system written using (open) GCC on Linux, but what happens when you compare GCC on Linux with Intel C++ on (closed) QNX Neutrino?

--
Guy Macon
Reply to
Guy Macon

And I offer up this one (not Vista-specific):

formatting link

;)

Steve

formatting link

Reply to
Steve at fivetrees

Elsewhere in the last post I mentioned that Byte Craft ships sources to our support libraries so our customers have full sources for their products. This allows them to do appropriate testing necessary for their products.

Surprisingly, not very often. What generally comes up are functional questions about how a given compiler will process a given input. Most developers are very conservative when creating code where safety is an issue.

One other testing comment: we have the ability to bring out of the compiler the node information for every decision point in an application (McCabe); some of our customers use this information for code coverage and in-situ regression testing.
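For readers unfamiliar with the McCabe reference: cyclomatic complexity is driven by the decision points in a function, and a per-decision node report from the compiler lets coverage tooling check that tests have taken each decision both ways. A generic illustration (not Byte Craft's actual report format):

```c
/* Generic illustration of McCabe decision points - not Byte Craft's
 * actual node report.  This function has three decision points (the
 * loop condition and two ifs), giving a cyclomatic complexity of
 * 3 + 1 = 4.  A per-decision report would list each of the three
 * decisions, and coverage tooling then checks that the test suite has
 * driven every one of them both true and false. */
int clamp_sum(const int *v, int n, int limit)
{
    int sum = 0;
    for (int i = 0; i < n; i++) {   /* decision 1: loop condition */
        if (v[i] < 0)               /* decision 2: negative sample? */
            continue;               /* ignore negative samples */
        sum += v[i];
        if (sum > limit)            /* decision 3: saturation check */
            return limit;           /* saturate at the limit */
    }
    return sum;
}
```

Exercising all three decisions both ways requires at least a negative sample, a saturating input, and a non-saturating input - which is exactly the kind of gap such in-situ regression data exposes.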

Regards

Walter Banks

-- Byte Craft Limited Tel. (519) 888-6911

formatting link
email snipped-for-privacy@bytecraft.com

Reply to
Walter Banks

... snip ...

I fail to see how a compiler writer can possibly tell when the compiler is generating 'safety critical' code. A statement such as "if (--a) ...." could be crucial. Or not.

--
 
 
 
                        cbfalconer at maineline dot net
Reply to
CBFalconer

The compiler writer does not know, which is why subsets are used for safety critical work.

Also, you run full tests on the compiler, such as Plum-Hall and Perennial.

--
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills  Staffs  England     /\/\/\/\/
/\/\/ chris@phaedsys.org      www.phaedsys.org \/\/\
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Reply to
Chris Hills

For some situations, being open source provides great benefits to the customer, and (if your business model is appropriate) to the vendor. But in other situations, it would not help. In the case of ByteCraft compilers, you are truly expert in your field, you work closely and rapidly with customers if there are any issues, and you have top class testing and quality control. There are very few users of your compilers who could improve on your tools if they had the source, and even fewer who could do so more cost-effectively than you.

But there are many other tools in the embedded development industry which do not match that quality. Clearly, there is a difference between code running on the host (like the compiler) and code running on the target (libraries, OS's, etc.) - for code running on the target, source code access (open or not) is obviously a benefit to end users, while on the host, code access is less useful. At the higher end of the market, there is less use in code access - the vendor makes better software in the first place, and the crucial code is often beyond the understanding of non-experts. But at the lower end, which covers a wide range of users, this is not necessarily true. I have certainly used commercial closed source development tools that I am confident I could have improved upon if I had the source, and would have been able to share those changes with other like-minded users and the original supplier, resulting in a better tool. (Whether I would have the time to do this is, of course, another matter!)

The point is, while open source offers little to the vendors or customers of good quality tools, it offers benefits for other kinds of programs.

There is also nothing in this to say that open source software cannot also be of high quality (though economics usually limit things like verification and certification).

gcc is a different sort of compiler from ByteCraft's - you can see there is very little (if any) overlap in the processor targets they support. As you say, gcc is aimed at larger processors, with multiple registers, a single memory space, and preferably at least 32-bit registers (though 16-bit is fine, and the 8-bit AVR port is not bad). ByteCraft, on the other hand, deals with optimising code for 8-bit micros with specialised registers, multiple address spaces, and other C-unfriendly aspects. Your compilers can happily spend significant host time getting the best code possible out of perhaps 10,000 lines of C code - gcc has to be able to chew through programs of millions of lines in a reasonable time. For your compilers, knowledge of the individual family members can be important (for example, when you know the size of a chip's flash, you can tell whether the code should be optimised for speed or size), while for gcc, it is not relevant which, say, MCF52xx chip you have, since they all have the same core. (Having said that, for many gcc ports, you *can* specify exactly which microcontroller you have - it makes it easier to get the right header and library files, and saves the user a little thought.)

So there is no way gcc can lose ground to ByteCraft, or vice versa - the targets are completely different. It might be more relevant to compare ByteCraft to sdcc (which is not related to gcc, but is open source - and by all accounts, it works well enough but can't compare in code quality to the good commercial 8051 compilers), or to compare gcc to Green Hills. I'm afraid I don't have any clue as to the numbers here, and who might be "winning" or "losing".

One place where gcc clearly is "winning" is in support for new architectures. There was a time when a company designing a new processor would look to one of the large commercial compiler developers for compiler tools. Now, at least in the 32-bit arena, they are looking at gcc - making or supporting a gcc port gives enormous value for money for the processor vendor. Atmel, Xilinx, Altera, Microchip, and many others rely on gcc as their initial compiler support for modern cores. That is not to say they don't support commercial compiler developers as well, but gcc gets them started faster and cheaper.

These final paragraphs confirm much of what I have been trying to say to Chris Hills in this thread - you put your greatest efforts into the part of the toolsuite that you are experts in, and that is most important to the end user (code generation), and you put appropriate effort into all the individual parts of your products so that the end result has the quality you need.

mvh.,

David

Reply to
David Brown

^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

This has nothing to do with my question. Read the underlined sentence by Walter Banks.

--
 
 
 
                        cbfalconer at maineline dot net
Reply to
CBFalconer

You run full tests on the compiler, such as Plum-Hall and Perennial.

--
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills  Staffs  England     /\/\/\/\/
/\/\/ chris@phaedsys.org      www.phaedsys.org \/\/\
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Reply to
Chris Hills

Jon was asking specifically about my conversations with folks doing safety critical code. Customers use benchmarks and test suites to characterize the compiler that they intend to use. We often get involved in the benchmarking process, which is the root of my comment.

Compiler companies are very much aware of the intended application and potential customer base for a specific processor, and that impacts the approach used in implementing a compiler for that processor. In some application areas we have extensive meetings with the ultimate users as part of the design process while planning a compiler implementation.

Regards

Walter Banks

-- Byte Craft Limited Tel. (519) 888-6911

formatting link
email snipped-for-privacy@bytecraft.com

Reply to
Walter Banks

Still no connection with my question. However, Walter has answered himself, and explained what he meant.

--
 
 
 
                        cbfalconer at maineline dot net
Reply to
CBFalconer
