Re: Intel details future Larrabee graphics chip

The original thread was "crossposted" to rec.video.desktop, and I took a position similar to what you have here. (So the "reversed battery" was a bad memory stick; I never saw a post from the OP of that thread as to the actual source of the problems.)

A current opportunity that may illustrate your point:

"Caligari is releasing the latest version of trueSpace for free, and including all their training courses for free as well.

As someone who has spent hundreds on their earlier versions and tutorial/training packages, I view this with mixed feelings, but it is a great opportunity for anyone looking for a full-featured 3D authoring package.

With the training courses (which you really need) thrown in, it amounts to a free college semester in graphic design or one of those "Learn how to create Games" courses."

formatting link

It is not easy to jump right into such programs, and learning to do even the simplest things in such an environment will take some study (in this case, though, there are plenty of tutorials available); but that doesn't make it "bad software design".

Luck; Ken

Reply to
Ken Maltby

Actually, despite this, hardware is full of bugs as well. It's just that most people don't notice them: a lot of work in the BIOS and OSes goes into making such bugs invisible to the average user. Hardware typically has a long testing cycle with many revisions and bug fixes before release. This catches most of the obvious issues, but hardware is by no means bug free. A typical CPU ships with 10-100 known bugs (many of which are worked around in software) and probably a similar number of unknown ones. If things go wrong, people usually blame the software...

Also, hardware solves a well-defined problem with a limited set of inputs and outputs, one that is invariably well specified and not subject to changing requirements. Given such a well-specified problem, it is actually easier to implement it in software with 100% correctness. Consider floating-point emulation in software as an example of such a problem.
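For instance, here is a minimal sketch in C (not part of the original post) of the kind of closed, fully specified problem meant here: unpacking an IEEE-754 single-precision value. The format nails down every bit, the input space is finite, and the code can be tested exhaustively.

#include <stdint.h>
#include <stdio.h>

/* Unpack an IEEE-754 single-precision bit pattern into its fields.
   The format is fully specified: 1 sign bit, 8 exponent bits,
   23 mantissa bits -- a closed problem with a finite input space. */
static void unpack_float(uint32_t bits, int *sign, int *exponent, uint32_t *mantissa)
{
    *sign     = (bits >> 31) & 0x1;
    *exponent = (bits >> 23) & 0xFF;    /* biased by 127 */
    *mantissa = bits & 0x7FFFFF;        /* implicit leading 1 for normals */
}

int main(void)
{
    int s, e;
    uint32_t m;
    unpack_float(0x40490FDBu, &s, &e, &m);   /* bit pattern of ~3.14159274f */
    printf("sign=%d exponent=%d mantissa=0x%06X\n", s, e - 127, (unsigned)m);
    return 0;
}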

The problems software solves are usually larger and more complex, often without a detailed specification and, worse, with constantly changing requirements. Such problems are not even considered for hardware implementation, but they are the reality for a lot of software.

Wilco

Reply to
Wilco Dijkstra

> At last something that we can agree on. Too much software is not properly
> designed. Modern compilers these days have a fair amount of static testing
> built in, but nothing like enough to solve the problem of below average
> coders and poor or no specifications or design documents.
>
> It will draw fire from the C is always wonderful crowd but I expect you can
> live with that. And it isn't strictly the C language that is at fault. There
> is plenty of good well documented software written in C.

Exactly. A poor programmer is going to be a poor programmer no matter what language they use. It's always fun to see how people believe that so called "safe" languages are really a lot "safer". The bugs just move elsewhere. For example garbage collection solves various pointer problems that inexperienced programmers make. However it creates a whole new set of problems. Or the runtime system or libraries are much bigger and so contain their own set of bugs on each platform etc.

> [...] intrinsically better suited for robust [...]
> [...] heavy for my taste.

I wouldn't mention Pascal, as the standard version doesn't even support modules; newer variants like Oberon are better in that respect. I agree on Ada. None of the existing languages are perfect; they all have advantages and disadvantages. As long as there is no believable replacement for C/C++, there is little incentive to switch.

Wilco

Reply to
Wilco Dijkstra

In article , "Wilco Dijkstra" writes: |>

|> > At last something that we can agree on. Too much software is not properly designed. Modern compilers these days have a |> > fair amount of static testing built in, but nothing like enough to solve the problem of below average coders and poor |> > or no specifications or design documents. |> >

|> > It will draw fire from the C is always wonderful crowd but I expect you can live with that. And it isn't strictly the |> > C language that is at fault. There is plenty of good well documented software written in C. |> |> Exactly. A poor programmer is going to be a poor programmer no matter |> what language they use. It's always fun to see how people believe that |> so called "safe" languages are really a lot "safer". The bugs just move |> elsewhere. For example garbage collection solves various pointer problems |> that inexperienced programmers make. However it creates a whole new |> set of problems. Or the runtime system or libraries are much bigger and so |> contain their own set of bugs on each platform etc.

There are at least two major classes of bug where that is false, and where C does especially badly:

1) Bugs introduced by an inconsistent, ambiguous or unreasonable standard.
2) Bugs which are provably easy to detect (e.g. array bound violations or arithmetic overflow).
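To make the second class concrete (an illustration of my own, not from the original post), the overrun below is statically provable, yet standard C requires no diagnostic and a typical toolchain accepts it silently:

#include <stdio.h>

int main(void)
{
    int a[4] = {0, 1, 2, 3};

    /* Provably out of bounds: i reaches 4, but valid indices are 0..3.
       The language requires no diagnostic; the final write is undefined
       behaviour. */
    for (int i = 0; i <= 4; i++)
        a[i] = i * i;

    printf("%d\n", a[3]);
    return 0;
}

Tools can catch this (GCC's -Warray-bounds warning, or running under AddressSanitizer, for example), but that is opt-in tooling, not the language.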

Regards, Nick Maclaren.

Reply to
Nick Maclaren

> At last something that we can agree on. Too much software is not properly
> designed. Modern compilers these days have a fair amount of static testing
> built in, but nothing like enough to solve the problem of below average
> coders and poor or no specifications or design documents.
>
> It will draw fire from the C is always wonderful crowd but I expect you can
> live with that. And it isn't strictly the C language that is at fault. There
> is plenty of good well documented software written in C.

It's certainly true the C standard is one of the worst specified. However most compiler writers agree about the major omissions and platforms have ABIs that specify everything else needed for binary compatibility (that includes features like volatile, bitfield details etc). So things are not as bad in reality.

That is an argument about what exactly should be part of a language and what should be part of the libraries. Automatic strings, checked arrays and overflow arithmetic are all things that can be supported easily in C/C++ as libraries. There are certainly advantages to having them built into the language, but there are drawbacks too.
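As a sketch of that library approach (my own illustration; the helper names are hypothetical, and only __builtin_add_overflow is a real GCC/Clang extension), overflow-checked addition and a bounds-checked accessor can be layered on plain C with no language changes:

#include <assert.h>
#include <limits.h>
#include <stdbool.h>
#include <stddef.h>

/* Overflow-checked signed addition.  __builtin_add_overflow is a
   GCC/Clang builtin; a portable fallback is shown for other compilers. */
static bool checked_add(int a, int b, int *result)
{
#if defined(__GNUC__) || defined(__clang__)
    return !__builtin_add_overflow(a, b, result);
#else
    if ((b > 0 && a > INT_MAX - b) || (b < 0 && a < INT_MIN - b))
        return false;
    *result = a + b;
    return true;
#endif
}

/* Bounds-checked array access: the check lives in a library routine,
   not in the language. */
static int checked_get(const int *arr, size_t len, size_t index)
{
    assert(index < len);
    return arr[index];
}

The drawback, of course, is that nothing forces a programmer to call checked_get() instead of indexing the array directly.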

Wilco

Reply to
Wilco Dijkstra

Digital logic designers tend to think in terms of concurrent, synchronous state machines that operate on data. Programmers tend to think in terms of sequential processes that have zillions of states, a minute fraction of which they ever really consider. Ignoring data for a moment, think of a sequential program's state as its program counter, and the state transitions as a drunkard's walk through all the possible paths between branches. Now add the data back in. Now connect it to a bunch of other "state machines" coded by other programmers, with badly documented or undocumented interfaces, and then allow a lot of asynchronous events.

I read that for a major bunch of Windows APIs, the only documentation was the source code itself.

C is a bad language that has spawned a worse culture.

I see too many programmers who are there for the process, the mental game of programming, and don't really give a rat's ass about the end product, the user interaction. I see a lot of borderline autism among programmers, as if they can't envision being *outside* the code, looking in, as a user sees it. It's like a failure of empathy.

John

Reply to
John Larkin

> [...] MHz, [...] hardware shift register, [...] there is a floppy controller [...] acceleration. [...] processor [...]

Between that and the tricks you can do with FPGAs, most certainly.

Reply to
JosephKK

Sounds similar to APL.

Reply to
JosephKK

Only for fairly small numbers of records, not more than a few thousand. Try it with a modest database of, say, several million records. Big databases have several relational files with over a billion records, and data warehouses hit trillions of records in high tens of thousands of files, and more. PowerBasic simply will not scale that high, nor will linear searches.
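A toy sketch of the scaling point (mine, not from the original post; the helper names are made up): a linear scan does work proportional to the number of records per lookup, while an indexed lookup over sorted keys is logarithmic, which is the same idea (B-trees, hash indexes) that lets a real database reach billions of rows.

#include <stdlib.h>

/* Linear scan: O(n) comparisons per lookup -- fine for a few thousand
   records, hopeless for millions. */
static const int *find_linear(const int *keys, size_t n, int key)
{
    for (size_t i = 0; i < n; i++)
        if (keys[i] == key)
            return &keys[i];
    return NULL;
}

static int cmp_int(const void *a, const void *b)
{
    int x = *(const int *)a, y = *(const int *)b;
    return (x > y) - (x < y);
}

/* Indexed lookup: O(log n) binary search over sorted keys via the
   standard library's bsearch(). */
static const int *find_indexed(const int *sorted_keys, size_t n, int key)
{
    return bsearch(&key, sorted_keys, n, sizeof *sorted_keys, cmp_int);
}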

On the other hand, for a few hundred records I have pressed word processing programs and spreadsheets into service and made them work nicely.

Reply to
JosephKK

You, my friend, have a sick mind.

OTOH, the first macro language I came up with, for a terminal emulator/file transfer program, was only slightly more readable. :-(

Terje

--
- 
"almost all programming can be viewed as an exercise in caching"
Reply to
Terje Mathisen

The next stage is to run a dynamic testing suite as a wrapper around the program:

I used to use BoundsChecker, which knows about most of the Win32 APIs; it was amazing how many bugs, errors, and just plain bogus API usage it would catch, even when run against very expensive but quite simple applications like accounting packages. :-(
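As an illustration of the kind of thing such a wrapper flags (my example, not an actual BoundsChecker report), heap misuse like this compiles cleanly and often appears to run correctly until a checker instruments every allocation:

#include <stdlib.h>
#include <string.h>

int main(void)
{
    char *buf = malloc(16);
    if (!buf)
        return 1;

    /* One byte past the end: the terminating '\0' of a 16-character
       string lands outside the allocation.  A dynamic checker traps
       this; an unchecked run usually just corrupts the heap quietly. */
    strcpy(buf, "exactly16chars!!");

    free(buf);
    free(buf);   /* double free -- another classic that checkers report */
    return 0;
}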

Terje

--
- 
"almost all programming can be viewed as an exercise in caching"
Reply to
Terje Mathisen

> It will draw fire from the C is always wonderful crowd but I expect you can
> live with that. And it isn't strictly the C language that is at fault. There
> is plenty of good well documented software written in C.

The most common serious problem I've seen with programs written in GC languages is when they try to do any kind of external communication:

That is, using Oracle database handles as if they were an infinite resource that the GC will always be able to clean up in time is a very good way to crash all the web front ends that share a common back-end DB.

Terje

--
- 
"almost all programming can be viewed as an exercise in caching"
Reply to
Terje Mathisen

Ouch.

[...]
Reply to
Chris M. Thomasson

In article , "Wilco Dijkstra" writes: |> |> It's certainly true the C standard is one of the worst specified. However most |> compiler writers agree about the major omissions and platforms have ABIs that |> specify everything else needed for binary compatibility (that includes features |> like volatile, bitfield details etc). So things are not as bad in reality.

Er, no. I have a LOT of experience with serious code porting, and am used as an expert of last resort. Most niches have their own interpretations of C, but none of them use the same ones, and only programmers with very wide experience can write portable code.
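A few lines of my own (not from the original post) illustrate the point: every conforming compiler accepts them, yet the results are implementation-defined and genuinely differ between real platforms, so "it compiles everywhere" is not the same as "it is portable".

#include <stdio.h>

/* Each item below is legal C whose result is implementation-defined. */
struct flags {
    unsigned a : 3;   /* bit-field order and packing vary between ABIs */
    unsigned b : 5;
};

int main(void)
{
    char c = (char)0xFF;   /* char may be signed (-1) or unsigned (255) */
    int  neg = -1;

    printf("char 0xFF            = %d\n", c);
    printf("-1 >> 1              = %d\n", neg >> 1);   /* implementation-defined */
    printf("sizeof(long)         = %zu\n", sizeof(long));          /* 4 on Win64, 8 on LP64 */
    printf("sizeof(struct flags) = %zu\n", sizeof(struct flags));  /* differs across ABIs */
    return 0;
}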

Note that any application that relies on ghastly kludges like autoconfigure is not portable, not even remotely. And the ways in which that horror is used (and often the comments in its input) show just how bad the C 'standard' is. Very often, 75% of it exists to bypass deficiencies in the C standard.

A simple question: have you ever ported a significant amount of code (say, > 250,000 lines in > 10 independent programs written by people you have no contact with) to a system with a conforming C system, based on different concepts to anything the authors were familiar with? I have.

Regards, Nick Maclaren.

Reply to
Nick Maclaren

I thought GC was the silver-bullet of memory management!

lol.

Reply to
Chris M. Thomasson

I hope you get paid well! Seriously.

Reply to
Chris M. Thomasson

In article , "Chris M. Thomasson" writes:
|>
|> > Er, no. I have a LOT of experience with serious code porting, and
|> > am used as an expert of last resort.
|>
|> I hope you get paid well! Seriously.

Not in this market-driven world of ours! Very, very few technical people do - at least compared with comparable people in marketing or the adminisphere.

Regards, Nick Maclaren.

Reply to
Nick Maclaren

> At last something that we can agree on. Too much software is not properly
> designed. Modern compilers these days have a fair amount of static testing
> built in, but nothing like enough to solve the problem of below average
> coders and poor or no specifications or design documents.
>
> It will draw fire from the C is always wonderful crowd but I expect you can
> live with that. And it isn't strictly the C language that is at fault. There
> is plenty of good well documented software written in C.

Not quite. I doubt if a poor programmer could ever get a program to compile with an Ada compiler. Pascal or Modula2 would protect the world from a lot of the pointer casting disasters that C encourages.

Even experienced programmers can make bad errors with pointers. You could make a fairly strong case for only having arrays.

Pointers are the programming equivalent of a rat's nest of bare wires randomly soldered to points on your circuit board, with the other ends hanging in the air waiting to touch something vital.
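A concrete case of that "bare wire" (a minimal sketch of my own, not from the original post): the pointer below outlives the storage it refers to, and the language only obliges the compiler to warn at best.

#include <stdio.h>

/* Returns the address of a local variable: the "wire" stays live,
   but the thing it was soldered to vanishes when the function returns. */
static int *dangling(void)
{
    int local = 42;
    return &local;       /* most compilers only warn, if that */
}

int main(void)
{
    int *p = dangling();
    printf("%d\n", *p);  /* undefined behaviour: may print 42, garbage, or crash */
    return 0;
}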

Similar problems arise with global variables and with call by reference (used for efficiency) in languages that do not support a read-only contract inside the routine.

> [...] intrinsically better suited for robust [...]
> [...] heavy for my taste.

Yes. I was a great fan of Modula2, which at one time even had a full hardware implementation in the Lilith at ETH; one of the first circuit design programs ran on it. It still has its adherents in some safety-critical applications, but it never quite became mainstream. It almost made it as a clean, minimalist teaching language for a while. It has a very modular approach to the world that hardware engineers would like.

So true. Increasingly, some types of program are now being written by metaprogramming at a still higher symbolic level and then outputting source code targeted at a particular language and/or CPU platform. FFTW uses this method to generate its prime-radix FFT codelets.
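A toy version of that idea (my sketch only; FFTW's actual codelet generator is far more sophisticated): a program that emits specialised, fully unrolled C source for a fixed-size dot product, the way a codelet generator emits straight-line code for a fixed transform size.

#include <stdio.h>

/* Emit a fully unrolled, fixed-size dot-product routine as C source.
   Metaprogramming in miniature: the "program" works at a higher symbolic
   level and outputs source code targeted at one specific case. */
static void emit_dot(FILE *out, int n)
{
    fprintf(out, "static double dot%d(const double *a, const double *b)\n{\n", n);
    fprintf(out, "    return");
    for (int i = 0; i < n; i++)
        fprintf(out, "%s a[%d] * b[%d]", i ? " +" : "", i, i);
    fprintf(out, ";\n}\n");
}

int main(void)
{
    emit_dot(stdout, 4);   /* prints a specialised dot4() with no loop */
    return 0;
}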

Regards, Martin Brown

Reply to
Martin Brown

I am serious about the bug rates with VHDL vs Verilog tools, though - that would be an interesting test of whether strong typing affects defect rates in a non-software-engineering context.

Only bad programmers. And I don't think it is quite as simple as you seem to think. Electronics engineers tend to have experience of programming projects that are a few man months. And you can get away with murder on that sort of scale if you are any good at all.

CCI is a very good code metric for where to look for bugs.

Things get really hairy on big projects involving many man-years, in part because process control for software development, even in places like Microsoft, is not as good as it should be. Steve McConnell's Code Complete is an MS publication, but I don't know what proportion of their engineers put his suggestions into practice. Viewed from the outside I can say with certainty that Excel 2007 is pretty horrific and unstable. Most people will never push it hard enough to notice, though.

That is probably slightly unfair (but also partly true). It was the unruly Windows message API that eventually killed the strongly typed language interface for me. Just about every message was a pointer to a "heaven knows what" object that you had to manually prod and probe at runtime to work out its length and then what it claimed to be. Maintaining the strongly typed interface definitions, even with tools, became too much of a chore.

Imagine doing electronics where all components are uniformly sized and coloured and you have to unpeel the wrapper to see what is inside. Worse still, some of them may contain uninitialised pointers if you are unlucky, or let you write off the end of them with disastrous results.

Although I agree with you, I don't think you can blame all the problems on C. BASIC, COBOL and FORTRAN must share some of it.

But there is also the lack of a reference abstraction in programming equivalent to the circuit diagram (OK, there are flow charts, dataflow and NS diagrams), nothing that fully encapsulates the entire design adequately. And the early tools were over-hyped, over-priced and under-performing.

I see that in a lot of engineers (including hardware ones). Some of the best programmers like programming because machines do exactly what you tell them to do - nothing more and nothing less.

Regards, Martin Brown

Reply to
Martin Brown

What do you consider "paid well"?

--
Keith
Reply to
krw
