EE rant

That's true in the short term but the thing to actually do is put instrumentation in so you can measure what the thing actually does.

It's gone a bit beyond that in practice. Depends on what "push it out to customers" costs. What MSDN showed Microsoft was that not only would people clamor to be first in line for patches, they'd pay for the privilege.

If a (noncritical) customer can push a few buttons and see results, they get a nice buzz from that and love you even more.

It's a different mindset and from what I've seen, the tools are angry arbitrary gods of constraints with horrible error messages.

Well, programming is also managing constraints. The gods are just more laid back.

Time for a nice scotch and sleep on it then. You have a nice budget for letting the design ferment.

I don't hold to that. But I've worked with many FPGA guys, and have adopted their habits.

You run from "sloppy" while the idea coalesces and then you get less sloppy over time. Hopefully, that involves taking measurements to make sure you're not lying to yourself.

I studied under an old Collins Radio guy for close to a decade. It wasn't a constant-study thing, but he was there. Very valuable, even though I don't spin hardware. That was me being polite; others needed the work more.

A kid now can be programming at an early age, so the metaphor sets earlier. Parents today are not like my parents, tolerant of soldering irons and circular saws and the mess that goes with them.

People find kids pecking on computers charming. There's less mess and the circuit breakers don't trip.

People follow in the track they are set towards.

Reply to
Les Cargill

Anything was superior to IBM's JCL or TSO - that was truly the pits. Phoenix used PDP-11s as terminal concentrators for the IBM mainframe.

There wasn't too much wrong with the mainframe hardware.

Wirth's later Modula-2 was briefly a contender. The Lilith machines in Zurich ran it natively. World's first PCB CAD system too.

VAX/VMS was pretty good for its time (which was about then), and the DEC10 was OK too. It was IBM mainframes that really thought that users had to suffer.

Norsk Data's SINTRAN was the most painful OS I ever encountered. Just to be different, its copy syntax was copy <dest> <source>

(and there was no warning on overwrite)

It was an RTOS written in FORTRAN by scientists for scientists. Its behaviour could sometimes be described as "interesting" (Chinese usage).

Reply to
Martin Brown

It tends to come in very big chunks, including friends of friends, if you don't have your own small custom I/O libraries.

But they still get a lot bigger if you use classic stdio etc. Just not quite as humongous as C++ tends to get.

The main thing I use C++ for is operator overloading, so that certain compound user types can be used in equation-like constructs.
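
For instance, a minimal sketch of what I mean; the Vec3 type and operators here are purely illustrative, not from any real project:

struct Vec3 { double x, y, z; };

// Overloading lets the compound type appear in equation-like code.
Vec3 operator+(const Vec3 &a, const Vec3 &b)
{ return { a.x + b.x, a.y + b.y, a.z + b.z }; }

Vec3 operator*(double s, const Vec3 &v)
{ return { s * v.x, s * v.y, s * v.z }; }

// Usage: Vec3 r = p + 0.5 * v; reads like the maths rather than
// three element-by-element statements.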

Reply to
Martin Brown

I have to agree to some extent. I have seen C coders who apply casts at random until their dodgy code gets past the compiler.

At least now the latest generation of compilers will spot a decent fraction of uninitialised variables and constructs that probably do things other than what the author intended.

if (x=0) y=10; // sets x to 0; the assignment to y is never executed

for example. Or the infamous Fortran do loop that killed a space probe.

DO 100 I=1.3 where the intention was DO 100 I=1,3 to loop three times; a "." was typed instead of ",".

What it actually did was set the variable DO100I equal to 1.3: Fortran ignores the embedded spaces, so the line parses as an assignment to a new real variable.

It depends on the scale of the software. The bigger it is the longer the development cycle.

Software bugs that get out into the wild can also have enormous costs to fix. The sooner they are found the cheaper they are to fix. Better still to use languages that can catch the most common human errors.

I do largely agree that the internet has made it even more common to ship defective code knowing that it can be patched remotely at little cost to the supplier (but sometimes considerable pain to the end user).

That was always true. Software does have more than its fair share of the ship-it-and-be-damned culture, but it is senior management wanting to hit delivery targets and get their quadratic sales bonus that is the root cause. The engineers are more cautious about shipping duff code.

FWIW My irony meter has also pegged past the end stop.

I was at one time involved in a project to bring hardware engineering disciplines into software development. It made some difference but not long after I left the project was abandoned. The software repository of known working algorithms degraded with time and the joke was s/re/su/.

The idea was to not have every factory reinventing much the same wheels.

Software development even today is a bit like medieval cathedral building before they fully understood foundations and materials stress. If it is still standing 5 years after completion then it was a good 'un.

The odd ones like Durham and Ely were sufficiently over-engineered, or quickly adjusted, to avoid falling into the river or falling down. Or the 13th C church in Chesterfield with a crooked spire that is still standing.

Like the leaning tower of Pisa it is on borrowed time.

Reply to
Martin Brown

I dabbled a bit in Modula2 for a while. I dropped it when I realized that it was more work to keep the compiler happy about types and program structure than it was to write the code that implemented the actual functionality of the program.

C assumes you know what you're doing, which suited me better.

Jeroen Belleman

Reply to
Jeroen Belleman

That's funny, clearly code that somebody wrote and nobody read. One difference between software and hardware is that we brutally review hardware designs, and PCBs usually work first try. We also simulate or breadboard and test any bits that we don't fully understand.

In the days of assembly coding and clumsy editing and printed listings, we *read* our code before we ran it. And commented heavily. That was the old-fashioned concept of engineering code like we engineer hardware. Iteration speed now lets us code faster and find bugs faster, but it doesn't encourage being careful. Good enough, I guess.

The schematic on their home page is hilarious. Where can I buy a 2N2000? I've told them about the blunders but I suspect they had some outsider fake the page and they can't fix it. Tilting schematic! Mine can tilt, at least when they are still on paper.

Software tends to have a git-hubby collaborative team model, where everybody edits and nobody is responsible.

Architects are designing spiral buildings now. Chesterfield was ahead of its time.

Reply to
John Larkin

Yes, but if I recall we called them Assertions.

Software also has Invariants, but I don't know whether either one came from the Physics world.

The main difference in software seems to be that assertions are logical statements about the value of a single variable, while Invariants apply an assertion to the result of a specified function.

One kind of assertion was visual - coplot a 2D plot of something, plus a circle, and visually verify concentricity. The eye is _very_ good at this, so it was a very robust and sensitive test.

I for one used them heavily, with some being used in operation, not just development. This was done in runtime code, not as a property of the programming language and/or compiler.
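
A rough sketch of the distinction as I understand it, in C++ (the names and checks are invented for illustration):

#include <cassert>
#include <numeric>
#include <vector>

// Assertion: a logical statement about a single value.
void set_gain(double gain)
{
    assert(gain > 0.0);    // enforced in development builds
    // ...
}

// Invariant: an assertion applied to the result of a function over
// the data; this kind of check can stay in operational runtime code.
bool totals_match(const std::vector<double> &buf, double expected)
{
    double s = std::accumulate(buf.begin(), buf.end(), 0.0);
    return s == expected;  // on failure in operation: log and recover
}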

Joe Gwinn

Reply to
Joe Gwinn

Keed?

Very interesting. I would be tempted to use the Moore-Penrose pseudoinverse (which has an SVD within), for its error tolerance.
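
Something like this sketch, using Eigen's SVD (Eigen is my choice of library here, purely for illustration); the error tolerance comes from discarding singular values below a threshold:

#include <Eigen/Dense>

// Moore-Penrose pseudoinverse via the SVD. Small singular values
// are dropped, which is where the error tolerance comes from.
Eigen::MatrixXd pinv(const Eigen::MatrixXd &A, double rtol = 1e-12)
{
    Eigen::JacobiSVD<Eigen::MatrixXd> svd(
        A, Eigen::ComputeThinU | Eigen::ComputeThinV);
    const Eigen::VectorXd &s = svd.singularValues();
    const double cutoff = rtol * s(0);   // relative to the largest
    Eigen::VectorXd sinv = Eigen::VectorXd::Zero(s.size());
    for (Eigen::Index i = 0; i < s.size(); ++i)
        if (s(i) > cutoff)
            sinv(i) = 1.0 / s(i);
    return svd.matrixV() * sinv.asDiagonal() * svd.matrixU().transpose();
}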

I can see this working for reasonably small data sets, as the SVD scales as O(N^3) or so, while the DFT (using the FFT) scales as O(N log N).
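
To put rough numbers on that: at N = 2^20, N*log2(N) is about 2e7 operations, while N^3 is about 1e18, some eleven orders of magnitude more. At N around 1000, the SVD's ~1e9 operations are still tolerable.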

Joe Gwinn

Reply to
Joe Gwinn

Clockmaking?

Joe Gwinn

Reply to
Joe Gwinn

Thus, the analog computer is reborn! No better simulation of analog devices need ever be sought, accuracy-wise, but there are still the familiar analog computer drawbacks: such a computer is strictly Harvard architecture, no self-modifying code allowed.

Reply to
whit3rd

Good ol' Kreyszig... I know of three engineering schools that use this text in their freshman and sophomore EE/MechE/CE math courses. Causes a lot of sleepless nights...

Reply to
Three Jeeps

I like this list. And those are the big names.

I've actually used contour integration, but very rarely.

I think I did know that Bessel functions are orthogonal. As are Zernike polynomials (used in optics), but I haven't actually computed one. The orthogonality was a requirement, not an accident, and the orthogonal polynomials form the basis vectors of a function space.
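
For the record, the standard Fourier-Bessel orthogonality relation (textbook material, not something from this thread): for distinct positive zeros \alpha_m, \alpha_n of J_\nu,

\int_0^1 x \, J_\nu(\alpha_m x) \, J_\nu(\alpha_n x) \, dx = \frac{\delta_{mn}}{2} \, J_{\nu+1}(\alpha_m)^2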

Joe Gwinn

Reply to
Joe Gwinn

As mentioned elsewhere, the issue is scalability: EE applications can easily require 2^20-sample (I+Q) transforms.

In X-ray Computerized Tomography (CT), computations were very slow because of the need to impose a non-negativity constraint on pixel X-ray absorption values - the target substance has no X-Ray power gain. This was done by a brute-force iterative process.

Nowadays, they use fast Radon transforms. I assume that they alternate between domains, and impose the constraints in whichever domain makes physical sense.
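
My guess at the shape of such an iteration, reduced to a 1-D toy in C++ with a DFT standing in for the Radon transform (all names invented; a naive O(N^2) transform keeps it self-contained):

#include <algorithm>
#include <complex>
#include <vector>

using cvec = std::vector<std::complex<double>>;

// Naive DFT: sign = -1 forward, +1 inverse (inverse divides by N).
cvec dft(const cvec &in, int sign)
{
    const double pi = 3.14159265358979323846;
    const std::size_t N = in.size();
    cvec out(N);
    for (std::size_t k = 0; k < N; ++k) {
        std::complex<double> acc(0.0, 0.0);
        for (std::size_t n = 0; n < N; ++n)
            acc += in[n] * std::polar(1.0, sign * 2.0 * pi *
                                      double(k * n) / double(N));
        out[k] = (sign > 0) ? acc / double(N) : acc;
    }
    return out;
}

// Alternate between domains: enforce measured data where it is known
// (transform domain), then non-negativity (object domain), since
// absorption cannot be negative.
cvec reconstruct(const cvec &measured, const std::vector<bool> &known,
                 int iters)
{
    cvec x(measured.size(), std::complex<double>(0.0, 0.0));
    for (int it = 0; it < iters; ++it) {
        cvec X = dft(x, -1);
        for (std::size_t k = 0; k < X.size(); ++k)
            if (known[k])
                X[k] = measured[k];   // data-consistency projection
        x = dft(X, +1);
        for (auto &v : x)             // non-negativity projection
            v = std::complex<double>(std::max(0.0, v.real()), 0.0);
    }
    return x;
}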

That makes sense.

Even if there was a written recipe, it would take time to rediscover the lore that people just know.

Joe Gwinn

Reply to
Joe Gwinn

No. It was reviewed code, in a process that for the time was much better than average. The problem is that on a chain printer the difference between a comma and a full stop is not very large, and you read what you expect to see based on the layout. The compiler reads what is actually there!

Embedded spaces have no significance in Fortran 66/77 programs, except that column 6 is special (a non-blank character there marks a continuation card).

IMPLICIT NONE was the defence against such constructs, but it wasn't possible until F77. For compatibility, the default Fortran rule was that variable names beginning A-H and O-Z were reals, and I-N were integers.

I still pretty much follow that rule even today in C/C++ not least because I am often contracting vectors & matrices over indices i,j,k.

In the old days a compile cycle was sufficiently tedious that you tried to get as many faults out in each batch run as you could. I think things took a turn for the worse when online terminal access became more common and PCs with a 25 line display took over. Coherence length of software reduced to the chunk that you could see on screen at any one time! :(

GitHub is a relatively recent innovation. Major tested numerical algorithm libraries like NAGlib have been around for much longer. Trouble is that people reinvent the wheel again and again...

Reply to
Martin Brown

On 09.01.23 at 10:57, Martin Brown wrote:

GOD is REAL, unless declared INTEGER!

:-) Gerhard

Reply to
Gerhard Hoffmann

Exactly. People were more careful. Like we are still with hardware design.

Reply to
John Larkin
