EE rant

On a sunny day (Fri, 6 Jan 2023 01:48:13 -0800 (PST)) it happened Anthony William Sloman snipped-for-privacy@ieee.org wrote in snipped-for-privacy@googlegroups.com:

Today I was watching NASA TV on the Hotbird 13 East satellite. It was about carbon nanotubes; they are trying to make electrical cables out of those to save weight in planes and spacecraft. Conductivity of those spun cables is not yet as good as copper, but strength is very good; they were winding them around tanks (fuel containers etc.) to increase the strength, doing break tests. Interviews with the people doing the development, fascinating, a look in the labs. They also worked on microchip sensors that you can connect to your smartphone. The Smithsonian Channel seems to have gone black since Jan 1, maybe it changed sat or transponder... Study never ends; you being brain dead and just posting that others know nothing is something nobody will learn from, ever. Your own song is so loud you could not hear anybody else even if you had a hearing aid like a TV.

Reply to
Jan Panteltje

Quite a lot of them do stagnate though or else move into man management (and some of them are absolutely useless at that even if they were once quite good at engineering in their youth).

Many of my contemporaries with first degrees sold their souls to the City and did the mathematics/coding that makes the world's stock markets unstable, using Bayesian methods to help hedge funds line their pockets.

Some of the others also went into other software development - notably CAD and computer gaming which was just taking off at the time. We did add-ons for various home computers for a while and then a Lisp compiler.

A first degree gets you enough numeracy to be highly employable in a wide range of other fields. My physics degree included a fairly comprehensive electronics components practical course with properties of semiconductor devices, opamps, oscillators, digital logic up to Karnaugh maps and a final challenge to build a digital dice from 74xx parts.

Second degree gets you enough knowledge to be employable in a more restricted field of higher paid jobs or to continue to do research. I chose to move out of academia at that point.

I went into software for large scale scientific instruments applying much the same calibration techniques honed in radio astronomy (feeble signal into insanely high gain drifty amplifiers) to mass spectrometry. The alternatives would have been military phased array radar or sub hunting. RSRE recruited directly from our group.

The City would pay a *lot* more for highly numerate computer programmers, but very few of us were prepared to sell our souls just for the money.

My former supervision partner is a grey beard who keeps one of the largest US synchrotrons running. He was always more practical than me.

Another is big on plasma etch technology in silicon valley.

A few of my contemporaries are still in academia at very senior levels - the most successful are both leading cosmologists. It was reckoned when I was in academia that there was space for about 10% of those who were adequately qualified to remain in academic research and teaching.

I don't know what the figures are like for graduates now.

Regards, Martin Brown

Reply to
Martin Brown

It is about the best of the classical methods, but Maximum Entropy, which post-dates M&W, works considerably better on time series and image deconvolution. You get the image positivity constraint for nothing and avoid all sorts of nasty ringing artefacts in the deconvolution.

There is a slightly flaky introduction to it in Numerical Recipes, 2nd or 3rd edition (I forget which).

There is no free lunch though - the price paid is that deconvolved image resolution becomes dependent on local signal to noise. Big spikes are much sharper and their location on the grid more precisely known.

Done well you get the sharpest image consistent with your raw data and artefacts due to noise are minimised. Overfit it and you can get into trouble (people often did).

The FFT on its own is just an orthogonal coordinate transform and preserves the data almost exactly give or take rounding errors.

It is dividing two transforms in the frequency domain that is the dangerous step and is best avoided or circumvented some other way.

In the old ways of doing it you modify the divisor *before* making the division. One popular trick, for an image with frequency coordinates i,j, was to add lambda*(i*i+j*j) to the divisor and adjust lambda until the chi-squared fit was equal to the expected noise in the raw data.

It worked pretty well. That hack is equivalent to requiring the resulting image to be as smooth as possible across adjacent pixels whilst still matching the raw data.
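As a concrete illustration, here is a minimal 1-D NumPy sketch of that modify-the-divisor trick (Tikhonov-style regularisation); the PSF, the lambda value, and the chi-squared helper are illustrative assumptions of mine, not from the original post:

```python
import numpy as np

def tikhonov_deconvolve(data, psf, lam):
    """Frequency-domain deconvolution with the divisor modified
    *before* the division: divide by |P|^2 + lam*k^2 rather than
    by the raw transform P (a 1-D analogue of lambda*(i*i+j*j))."""
    n = len(data)
    D = np.fft.fft(data)
    P = np.fft.fft(psf)
    k = np.fft.fftfreq(n) * n              # integer frequency index
    W = np.conj(P) / (np.abs(P)**2 + lam * k**2)
    return np.real(np.fft.ifft(D * W))

def chi_squared(data, psf, estimate):
    """Misfit between the raw data and the estimate re-blurred by
    the PSF; in practice lam is tuned until this matches the
    expected noise in the raw data."""
    model = np.real(np.fft.ifft(np.fft.fft(estimate) * np.fft.fft(psf)))
    return float(np.sum((data - model)**2))
```

With lam -> 0 this reduces to the raw (dangerous) division; increasing lam trades resolution for noise suppression, which is exactly the smoothness-versus-fit compromise described above.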

Modern methods use some form of penalty function. Maximum Entropy's f.log(f) penalty seeks to find a unique image that is in some sense the most probable one consistent with the data. Arguably it is the one true way, or at least some practitioners would seek to convince you that it was.
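To show what the f.log(f) penalty actually rewards, a tiny sketch (the example images are mine, not from the post): among images of the same total flux, the flat one minimises sum f*log(f), so the penalty pulls toward featurelessness except where the data demand structure.

```python
import numpy as np

def entropy_penalty(f):
    """Sum of f*log(f) over the image. Requires f > 0 everywhere,
    which is how the positivity constraint comes along for free."""
    return float(np.sum(f * np.log(f)))

# Two images with the same total flux: flat versus a single spike.
flat = np.full(64, 1.0 / 64)
spiky = np.full(64, 1e-9)
spiky[10] = 1.0 - 63e-9
```

The flat image scores -log(64), the spike scores near zero, so under this penalty spikes survive only where chi-squared against the raw data insists on them.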

Reply to
Martin Brown

Sounds like fun.

You may be "studying" but there's not a sign of you learning anything. You may not like this observation about you, and Flyugu and John Larkin resent much the same observations for much the same reason.

I'm much more tolerant of people who know more, and are open to the idea that they occasionally get stuff wrong.

That's a bizarre conceit. For one thing, I am not pushing "my own song", whatever that might be, but rather the more or less agreed consensus - not potty ideas like the Le Sage theory of gravity.

formatting link

Reply to
Anthony William Sloman

Who, because he/she/it needs to?

Designers sure do have funny names these days.

RL

Reply to
legg

LAM?

Reply to
John Larkin
<snip>

Elsewhere John Larkin wrote: "Coding is an easily acquited skill. It requires no math, no science, ..."

Coding is easily acquitted because it's so extremely difficult to write easily understood, robust software. See your first comment in this followup. But your typo probably pertains to "easily acquired." Many people make the same mistake when they try to convince themselves software development is easy. Again, see your first comment in this followup. Although it's easy to ride a bicycle with training wheels, street motocross is in a league of its own. Along with exceptional software development.

Danke,

Reply to
Don

Yes.

Reply to
Martin Brown

Software is usually flame tested. Type fast and see if it will compile. Patch until it does.

Then actually run it and see what happens. In most modern cases, you can push it out to users overnight, and then wait for complaints about bugs. Maybe fix the worst ones.

FPGA design is more difficult than coding in C++ or JavaScript or whatever, so people are a little more careful.

A minor mistake on a PC board can, with luck, be hacked, but anything serious needs a board spin that can cost tens of kilobucks and a month if really rushed.

The easier it is to change something, the sloppier the work will be.

I am amused by flux.ai, an attempt to apply software disciplines (irony alert) to electronic hardware design. They are hiring now, three "full stack" programmer types and one hardware summer intern.

Reply to
John Larkin

We once worked with a company in Fremont and designed an electro-optical thing for their plasma etch system. They stole our design.

Reply to
John Larkin

Are VHDL and Verilog more arcane (e.g., difficult) than freebsd-arm C? Your C++ reminds me of this:

C++ -> The COBOL of the 90s:

formatting link
Danke,

Reply to
Don

It's a different mindset, clocked non-procedural code, but not necessarily more difficult. But the tools are generally obtuse and a compile equivalent might take an hour or so, and then you have to test-bench it or load it onto the real hardware and debug with logic analyzers or whatever.

Between runs, you can do battle with FlexLM.


I have people who love abstraction for its own sake and want to use C++ in realtime embedded apps. They won't.

Reply to
John Larkin

Unix was a whole lot better than JCL, and C was much superior to BASIC or Pascal. When I discovered those in the 1980s, I never looked back. Not even the Mac, with its revolutionary GUI, was a viable contender.

Everything they say in the Unix haters handbook is true, but believe me, the contemporary alternatives were far worse.

Jeroen Belleman

Reply to
Jeroen Belleman

Hey, people do just that here! I was flabbergasted when my 50 lines of C++ RT program turned out to compile into a 6MB executable. That's after stripping the symbol table. With it, it weighed 60MB! Crazy.

Any of my 'normal' C programs produce between 10 and 20 bytes of executable per line of source code.

Jeroen Belleman

Reply to
Jeroen Belleman

It sounds like a channel equalizer. The radar folk do that a lot as well. There are lots of designs, but FIR (finite impulse response) filters dominate, because their tail response is finite and can be waited out. They are designed using Z-transforms and FFTs.

But if the FIR is short enough, one can do it in the time domain I suppose, if one isn't worried about fore and aft time sidelobes caused by periodic ripples in the passband response function.
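A minimal sketch of one such design route, frequency sampling: invert the channel's sampled frequency response and transform back to get FIR equalizer taps. The two-tap channel below is a made-up example, and this naive inversion assumes the channel has no spectral nulls on the sampling grid.

```python
import numpy as np

def fir_equalizer(channel, n_taps):
    """Frequency-sampling FIR inverse: sample the channel's frequency
    response on an n_taps grid via the FFT, invert it, and transform
    back to get equalizer taps. Assumes no nulls on the grid."""
    H = np.fft.fft(channel, n_taps)
    return np.real(np.fft.ifft(1.0 / H))
```

For a minimum-phase channel the taps decay quickly, so the circular-to-linear wrap-around error is negligible; a channel with deep nulls would need the kind of regularised division discussed up-thread.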

What did the DSP Union say about that?

Sweet ...

Joe Gwinn

Reply to
Joe Gwinn

I'm thinking about using Pi Pico in some products. It has 2 Mbytes of flash and I'll need some of that for an FPGA config. Think bare metal state machines.

I'll need USB, Ethernet, a SCPI parser, and a bit of application code.
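For flavour, a minimal sketch of the kind of SCPI command parsing such firmware needs, in Python rather than the bare-metal C it would really be; the command strings below are hypothetical examples, not a real instrument's command tree.

```python
def parse_scpi(line):
    """Split a SCPI-style command such as 'VOLT:LEV 2.5' into
    (path, args, is_query). Handles only the basics: colon-separated
    headers, a trailing '?' for queries, comma-separated arguments."""
    head, _, arg_str = line.strip().partition(' ')
    is_query = head.endswith('?')
    path = head.rstrip('?').upper().split(':')
    args = [a.strip() for a in arg_str.split(',')] if arg_str else []
    return path, args, is_query
```

So `parse_scpi("*IDN?")` yields `(['*IDN'], [], True)`; a real parser would also handle the long/short header forms, default nodes, and unit suffixes the SCPI standard allows.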

Reply to
John Larkin

Ho hum. They looked at what you thought was a design, and changed it enough to make it work. There are always two ways of looking at this sort of falling-out, and the opinion of the guy who didn't think he got paid enough is only half the story.

Reply to
Anthony William Sloman

ElectronDepot website is not affiliated with any of the manufacturers or service providers discussed here. All logos and trade names are the property of their respective owners.