cool article, interesting quote

Hand a psychopath a gun and you get dead people. Hand a psychopath with solid supervision a gun and you will get minimal deaths.

Hand a bad programmer a mission critical system with no supervision or testing and you get dead people......

Reply to
The Real Andy

In article , The Real Andy writes

Hand a trained marksman a gun and you only get bullets in authorised targets....

Hand a trained SW engineer the spec for a critical system.....

--
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills  Staffs  England     /\/\/\/\/
Reply to
Chris Hills

malloc(), printf() - two great examples of abstraction

Reply to
The Real Andy

Hand a skilled, experienced SW engineer the spec for a critical system..

It obviously works, because there are plenty of mission-critical systems out there that have kept people alive...

Reply to
The Real Andy

"Genome" wrote in message news:LgQ0g.26910$ snipped-for-privacy@newsfe5-gui.ntli.net...

It's like soup. You can go into your garden and try to find all the ingredients needed/wanted and probably make some soup. This was how it was done when soup was just invented.

You can still use the old methods to make soup. Or go to the shop and get a can of soup. The good news is that doing so leaves you more time to prepare the main course.

--
Thanks, Frank.
(remove 'q' and '.invalid' when replying by email)
Reply to
Frank Bemelman

It also has way too much salt in the can. :(

--
Service to my country? Been there, Done that, and I've got my DD214 to
prove it.
Reply to
Michael A. Terrell

I agree that cost does not necessarily correlate with quality. But the focus on low-cost development tools prevents a lot of potentially important research from being undertaken to create powerful development tools.

As a small example of this issue, linking editors were developed 35 years ago to address the lack of RAM space for the symbol tables needed to create big applications. Here we are with personal computers with gigabytes of RAM, and schools are still teaching the "correct" use of link editors inherent in the low- or zero-cost tools they use to teach programming. This fundamental property of many tool sets prevents RISC-architecture processors from achieving anywhere near the potential efficiency that can be achieved with optimization algorithms that know the data and control structure of the entire application.

w..

przemek klosowski wrote:

Reply to
Walter Banks

I have witnessed this kind of thing in the scientific sector every time I have looked into it (and I have had more than one chance). The truth is, people there just don't need to produce anything that works; you wouldn't believe the kind of illiteracy to be encountered in most of the equipment that gets bought (typically things are even nastier: they just "buy" from the right "vendor", who'll pay for "consulting" etc., and ignore all the obvious working options...).

Well, I don't have much insight there, but I think your example demonstrates how motivating the necessity to produce something which really works can be.... :-)

On to the laziness/understanding-the-problem-in-its-entirety/tools/etc. thing from other messages in that thread.

I do use only tools I have written - assembler, linker (yep, it can be useful if you have made it the way you need it), debuggers etc. - but this is not the point. Those who do not want to understand the difficult part do so typically not because of laziness, as they tend to spend a lot of time working; it is more like ineptitude.

Contrary to what you state, we always put a level of abstraction somewhere; using everyday language is itself doing so. The question is where we put the abstraction, i.e. which part of the project we prefer to treat only as a "black box". For example, when doing some analog design, we typically think of an opamp without digging into its internal schematics - thus we do use an abstraction. It is important to be able to dig further, though, otherwise the black box becomes a magic box... The moment the abstraction level is far enough that the box becomes magic, this is no longer engineering; it is more like pop-science or office system utilization or whatever.

Dimiter

------------------------------------------------------
Dimiter Popoff               Transgalactic Instruments

formatting link

------------------------------------------------------

John Larkin wrote:

Reply to
Didi

In article , Keith wrote: [....]

Yes, isn't that when the compiler gets really mad and prints a lot of error messages?

--
kensmith@rahul.net   forging knowledge
Reply to
Ken Smith

In article , Michael A. Terrell wrote: [... soup ...]

Because I can't have MSG, I often buy the low-sodium type. It costs way too much, but it's OK if you add a little salt.

--
kensmith@rahul.net   forging knowledge
Reply to
Ken Smith

I have high blood pressure and diabetes so I have to avoid excessive salt and sugar, so I make my own soup. I have the big crock pot full of vegetable beef soup right now.

--
Service to my country? Been there, Done that, and I've got my DD214 to
prove it.
Reply to
Michael A. Terrell

BTDTGT

...

The analogy I use here is that the level of abstraction you need is a basic understanding of operation, not of how the thing was crafted (though that sometimes helps).

For example, a desk is a level of abstraction; however, you do not say a desk drawer is an abstract object that contains items of any sort - you realise the limits of its operation and materials. That is, you don't try to fit a car into a desk drawer, because of two things you understand about a desk drawer: basically, the car is too large and too heavy for the size of the drawer and the strength of its materials. Some people no doubt do try this, but Darwinism does not mean the most intelligent survive.

The other one is a car going uphill: most people (note the use of the word MOST) realise that if you go uphill in a car you often have to use more power, especially if the car is more heavily laden than normal. You have to have a basic understanding (even if not realised directly) of physics and of the operation of a car, as you may need to choose a lower gear to get uphill. You don't necessarily need to know how many teeth each gear has, what materials the car is made of, or how it was made. However, there will always be the petrol heads who know this and more, probably even who was working on the shift that cast the engine block for that car.

--
Paul Carpenter | snipped-for-privacy@pcserviceselectronics.co.uk
PC Services
GNU H8 & mailing list info
For those web sites you hate

Reply to
Paul Carpenter

What I refer to is not whether you use an abstraction or not, nor do I suggest it is OK to use an abstraction only after you are familiar in detail with the abstracted entity. The line I draw is between whether you could go down into the details if you wanted to or you could not. There can be numerous reasons for the latter, like lack of information, mental capacity etc. What I am saying is that using an abstraction which one _cannot_ resolve to details within some reasonable time is quite all right, only it is not engineering. Typically throughout a project we do use abstractions, and more often than not we are taken by surprise and do have to go down into some unexpected details until we discover the problem; this is a major part of the design & prototyping process.

Dimiter

------------------------------------------------------
Dimiter Popoff               Transgalactic Instruments

formatting link

------------------------------------------------------

Paul Carpenter wrote:

Reply to
Didi

This may be true. But I can also recall the times when there were no free tools and the compiler software cost a lot of money up front and a lot more money every year in "maintenance" fees. Even for all of that, I didn't find many of the basic compiler ideas, which had been known about then for a decade in research papers and expanded upon quite publicly by a number of popular compiler books in the late 1970's and early 1980's, finding their way into actual compiler tools.

Even with the money flowing, I'm not convinced it goes into practical compiler optimizations. It seems to go into support salaries or profits or something else.

I guess this goes together well with your agreement that cost does not necessarily correlate with quality. The point being that just paying more won't necessarily fix any of the problems in fielding practical compilers which will be popularly used and include the better of current optimization technology. GNU is one approach to encouraging technological development, but it has its own non-fatal downsides.

One of the things that MIPS understood, but not well enough I think, is that their chips would press hard on compiler technologies. And fielding practical tools would be required if they were to hope to compete well against the big (at the time) CISC FABs. These large companies, like Intel and Motorola, certainly would not sell access to the better FABs which were producing their flagship chips and MIPS would only be able to use lower technology, initially, so they would have to do more with less. It was the belief of the founders and their investors that the technical advantages lay so firmly on their side that they could still do it. But it would require practical compiler technology. They knew that much, clearly.

But that's a side-bar. Frankly, I'm cynical to the point of pessimism regarding getting even the already existing good ideas on optimization into the hands of application developers. All my working life I've read some of the research papers as well as not-so-popular books on the subject of compilers (being a tinkerer and having had a few occasions to write my own assemblers, linkers, BASIC interpreter, as well as C compiler -- none famous or thoroughly vetted professional products, though one or two got exposure to hundreds of users.) I've known generally about method after method which continually fail to reach the market. And in the few cases where they do, they do not become widely recognized and used.

Application developers are like most consumers, I think. In general, having very little real knowledge of what makes for a quality product. Lacking that detailed knowledge, they do what comes next best -- they look at price and external features that reduce their development time. As a result, any superior (and expensive) quality that goes into making the core compiler just doesn't make it onto their radar screen. They look at features such as slick check-off boxes that produce a default main() function which pre-initializes various registers for them or other "wizards" to help get an application quickly closer to the end point. Or else they look at "sizzle" features, such as docking toolbars and colored keywords and font support and so on. Especially in the case of fancy IDE systems, it takes a lot of compiler tool vendor programming time to get these done correctly and well. Print preview alone, if supported, can be a veritable headache in having to deal with different contexts simultaneously (printer context and screen context.) These things suck away the oxygen from the air that might have otherwise given life to a better core compiler and linker toolset.

And I'm not sure that higher prices would help. There would still be a market-driven craze for fancy GUI features and a race for creating new features that will be added to magazine articles where their product gets a check-off and the competition doesn't even have that feature; or else where they get their own product to include check-offs that other products also have.

I agree that low cost tools means less oxygen in the air. And that less oxygen means less time available. But having more oxygen injected doesn't mean the research efforts get funded or the core technology gets advanced. Since consumers (application developers) aren't generally knowledgeable about DAGs and basic blocks or trace scheduling or code edges or ...., let alone familiar with examining the output of compilers to compare and analyze the results, they are not in a very good position to know compiler quality when it is there. So they don't pay for it, if having it costs them features they can see well -- such as docking toolbars and print preview and colored keywords and the like. What will happen is that the vendor will use the additional horsepower to instead leverage themselves even further along the lines that their customers _do_ notice and therefore pay for.

So, how do we get compiler research funded? Through the universities, such as CMU, I suppose. And through the one or two large companies which actually have a genuine commitment to doing research, such as Bell Labs. How do we get these ideas turned into practical products which people will actually use? I just don't know.

But I'm still using tools today sporting only the basic features of compilers and optimizations which have been known about and researched since before I started in computer software in 1972.

I'm guessing that most of the natural channels for the flow of money just don't lead in that direction.

What I look at, as a consumer when presented with several choices given other restrictions, is what the compiler toolset provides me without the IDE, without the toolbars, without the wizards. I focus on the command line compiler, the command line assembler, and the command line linker and look at what is provided by these three core tools. I also examine the compiler output for a few routines I have to test such things. If others focused their attentions in similar fashion, the natural landscape's gravity would push tool vendors in a better direction. But that's not likely. Hence, my pessimism.

Oh, well. I probably got some things above wrong (for example, I'm completely ignoring patent issues which may be real barriers for tool vendors) and I wouldn't mind being corrected by those in the tool business (whom I _do_ generally respect a lot) and anyone else with varying perspectives.

Jon

Reply to
Jonathan Kirwan

Abstraction of tools is, in my opinion, unwise and boring to boot. But abstraction of the process itself, being unwilling to learn the details of the hardware, the science, the application, and the business aspects of the thing you're programming, could be fatal to one's career.

If all you do is code rigidly to a requirements document that someone else wrote, you can get clobbered from below and above: Below, coders in Russia and India and China can do the work for a fraction of the price. Above, it's just a matter of time until a UML compiler, or some LabView-ish sort of thing, wipes out the entire "coding" layer.

Coding will, I think, eventually go the way of blacksmithing.

John

Reply to
John Larkin

English is too limited and has historically been supplemented with state diagrams, tables, graphs, block diagrams, flow charts, and equations in technical documents. I would think the ideal "language" would be able to interpret this type of information directly (or at least somewhat directly, with the possibility that some constraints are allowed in the format so the information can be processed more easily by the language).

Well, I would say as soon as we can use everyday technical documents to explain to the machine what we want. There is nothing revolutionary with this idea, just evolutionary (machine language => assembly => FORTRAN => Simulink etc.; it's going in the right direction).

Reply to
steve

You mean like well-written C?

;)

Steve

formatting link

Reply to
Steve at fivetrees

I'm a bit stunned by the idea that abstraction is not a good thing. Anytime one uses a resistor, a capacitor, a transistor, a 74HC IC, or a subassembly with a well-defined interface, one is using abstraction. I don't know anyone who rolls his/her own capacitors. (And yes, one does have to understand a little about the different characteristics of capacitors, or the difference between 74HC and 4000-series parts...)

I suspect the real issue is how much one can *trust* the abstractions one is dealing with. I've had the experience of buying in 3rd-party code libraries, only to find that, under the hood, they were generally dreadful. But it shouldn't be this way. Most code I come across (whether C++, C, or assembler) is very poor indeed. I sometimes think there should be a driving test for coders... except I suspect there'd be a 95% failure rate.

Earlier in this thread there was discussion about well-written C++ libraries. I'm not a fan of C++ (as I've said ad nauseam in other threads), but the principle is correct, and is equally applicable to *any* language, including C (I *am* a big fan of OO in C). The point is *decomposition* - breaking complex problems down into trivially-simple modules, with simple and clear interfaces - as one does in EE. I've never fully understood why this principle is so well established in hardware, and apparently so hard in software. My own observation, over nearly 3 decades as an EE specialising in firmware, is that decomposition is something that s/w people are not generally very good at. I find this strange.

I get mad when I see the age-old myth that "complexity means more bugs". This is to me an admission of failure to decompose, and to abstract. I break such problems down, over several layers, such that what I'm dealing with is effectively a collection of abstractions, with clearly-defined interfaces and no side-effects. I can concentrate on each aspect separately, without encumbering my limited brain with all the details of all the elements throughout the entire product. If I had to design complex hardware while thinking about all the guts of every component at the same time, I'd be in deep trouble. And yet, this is precisely what happens too often with s/w.

Yeah, yeah. I've seen this argument many times, and I'm not holding my breath. To take your analogy, let's assume that blacksmithing has been replaced by car manufacture. Consider how a car is designed and produced. Consider the decomposition and indeed abstraction involved (e.g. the guy who designs the shock absorbers doesn't need to know too much about the guts of the air conditioning system). Something to be learned there.

Final thought: years back, I was very much in a minority as one of the first practitioners of top-down structured design in an R&D dept full of spaghetti-coders. Now we pay lip-service to the idea that this is standard stuff - yet we still fail to separate design from coding. If we could indeed trust s/w modules or libraries to do what they're supposed to do, with no side-effects, I suspect we'd have made better progress. Instead we re-invent the wheel with every new project.

Steve

formatting link

Reply to
Steve at fivetrees

Don't know; I personally have never seen any of this mythical well-written C code that people always refer to. It seems to exist only in the minds of the authors for a small period of time immediately after it was written :) All the C compilers I have used don't handle state diagrams, tables, graphs, block diagrams, flow charts, and equations without massive twisted reformatting. I have seen excellent C source files, but that's only because the code was preceded by comments or references which included state diagrams, tables, graphs, block diagrams, flow charts, and symbolic equations.

Reply to
steve

But I understand the physics, the thermal behavior, and the subtleties of the parts, and I don't hide from them as a matter of preference. Taking something familiar for granted is not abstraction.

I do. Resistors and inductors, too. I've sold megabucks worth of NMR gradient drivers, based partly on making my own ultrastable current shunts.

Exactly. "Most code I come across (whether C++, C, or assembler) is very poor indeed" tells us that it is only the few exceptional programmers who can use the existing paradigms and practices of programming with any real success.

It's no myth. If a program is twenty times more complex than it needs to be to do the job, it will have 400x as many bugs. And lots of programmers make things this complex because a) they don't know any better and b) they enjoy it.

Or a failure to find a clean, simple, flat structure and implement it.

I had a seminal experience when I was young: I hired a guy to write a fairly simple program to track some parts usage. He said "If I write a general-purpose database manager, your problem will become so trivial I can do it in a day." He was older and bigger and "a programmer", so I let him do it. Six months later we had nothing useful, and he wandered off. He didn't *care* about my problem; he wanted to play with database managers and cool linked pointers and stuff.

I've seen two recent cases, in two different companies I worked with, where a programming staff was hired to develop company-critical software to analyze scientific data for analytical instruments. Neither group talked much about the problem... they mostly raved about what cool things you could do with Java. One group, working for a 3F financed startup, only burned about 3 programmer-years, and the bigger group burned over 75. Neither produced anything useful.

Cat's on my keyboard, coffee water is boiling, rant suspended.

John

Reply to
John Larkin
