EE rant

(Talk about leading with your chin!)

Cheers

Phil Hobbs

Reply to
Phil Hobbs

On a sunny day (Wed, 4 Jan 2023 08:39:59 -0500) it happened Phil Hobbs wrote

Were you trying to commi-nukate something?

Reply to
Jan Panteltje

It's a boxing expression. Normally a boxer takes the first punch with his right or left fist. "Leading with your chin" means putting your chin in first, i.e. just asking to be knocked out.

In this case, the far-famed illegibility of your schematics is the "chin" in question. Capiche?

Cheers

Phil Hobbs (Who obviously needs to be more explicit when teasing ESL folks.) ;)

Reply to
Phil Hobbs

Advanced engineering mathematics:

formatting link

Which is pretty advanced; I don't know how many BS-type EEs know about the orthogonality of Bessel functions, or regularly use contour integration for anything.

But not as advanced as "Advanced Mathematical Methods for Scientists & Engineers", which is largely about perturbation methods, boundary layer theory, and WKB approximations. Sounds fun, I guess; I just got a used copy from Amazon for $8.

Reply to
bitrex

I would expect stuff like the WKB approximation is regularly used more in optics design than in circuit design, though.

Reply to
bitrex

I actually took an integral to compute a mosfet power dissipation. That was 10 or 12 years ago. Now I use Spice.

Reply to
John Larkin

I once used contour integration to obtain a fringe field correction on a mass spectrometer magnet. The objective was to take out the first order aberrations and make the focal plane orthogonal to the optic axis.

It was one of the first electromagnetic optics codes where the magnitude of the predicted voltages on electrodes was sometimes right. Prior to that you were lucky if it had the right sign! The original code came off a mainframe and was intended for designing atom smashers. A listing arrived at the company from academia along with my new boss.

Physics was mainly into Chebyshev polynomials for solving wavefunction equations since it housed one of the world experts in the field.

A bit like Green's functions, I'm inclined to think that WKB is seldom used at all now that we have very fast raytracers on the desktop PC. It may still be taught at undergraduate level today, but mainly to weed out those who are not going to make it as theoretical physicists (which is where it was used back in my day as an undergraduate).

Padé rational approximation methods are undergoing something of a Renaissance. Things go in cycles. I keep waiting for Clifford Algebras to take off as my supervisor promised they soon would (~2 decades ago).

Things which do have an important place in modern software that is intended to be provably correct are invariants (borrowed from physics).

Reply to
Martin Brown

On a sunny day (Wed, 4 Jan 2023 09:35:06 -0500) it happened Phil Hobbs snipped-for-privacy@electrooptical.net wrote in snipped-for-privacy@electrooptical.net:

I did look up the expression with Google before I replied to you; some very different explanations are given, one of those being 'being aggressive'. I get the origin now. I am still learning you earthlings' stuff.

Of course studying my circuit diagrams does require some knowledge of tronix..

Well, that re-encode was one reason I could read them and others could not. It surfaced when I downloaded my picture from my website and a 10 MB picture of the circuit diagram had been reduced to a 1 MB one or something. Further tests with new pictures gave the same effect. They changed server hosting company after that, I think... I have posted about that here.

No is OK, I can lead with chin, have laser sword!

Reply to
Jan Panteltje

You need to be able to do contour integration in a whole lot of signals-and-systems work. For instance, the proof that instability in a linear system is the same as acausal behavior depends on it.

The exp(i omega t) in the Fourier integral means that you have to close the contour in one half plane for positive time and the other for negative time. If there are any poles inside the negative-time contour, you get acausal response and exponential growth. (A very pretty result first proved by E. C. Titchmarsh, I think.)
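Compressed into symbols, the closure argument runs roughly like this (a sketch, not Titchmarsh's full proof):

```latex
f(t) = \frac{1}{2\pi}\int_{-\infty}^{\infty} F(\omega)\, e^{i\omega t}\, d\omega,
\qquad
\bigl|e^{i\omega t}\bigr| = e^{-t \operatorname{Im}\omega}
```

So for t > 0 the integrand decays in the upper half-plane and the contour closes there; for t < 0 it closes below. A pole at omega_0 = a - ib with b > 0 sits inside the negative-time contour, making f(t) nonzero for t < 0 (acausal response), and its residue carries e^{i omega_0 t} = e^{iat} e^{bt}, i.e. exponential growth for t > 0.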

That's Bender & Orszag, right? By far my favorite math book of all time. I just _love_ that one. The prof for my (first year grad) asymptotic methods class was a former EE (Stephanos Venakides, may his tribe increase). That helped a lot. Math classes taught by mathematicians tend to be dry, because they regard the subject like philosophy, whereas to a scientist or engineer, math is a technology of thought.

BITD Arfken's "Mathematical Methods for Physicists" was one of the standard math books for undergraduate physics, along with Levinson & Redheffer's complex variables book, Boyce & DiPrima on ODEs, Carrier & Pearson for PDEs, and something on linear algebra. My linear alg class was taught out of Schaum's Outline, believe it or not--super cheap and actually a pretty good book. Oh, and a little book on the theoretical side of calculus, so that you can prove theorems and stuff if you need to.

Fourier analysis, perturbation theory, asymptotic methods, cluster expansions, tensor calculus, and Feynman path integrals were all taught in physics classes. I took four EE classes in grad school--Tony Siegman on lasers, Steve Harris on nonlinear optics, Ron Bracewell on how to think in k-space (aka reciprocal space and Fourier space), and Bernie Widrow on DSP.

Cheers

Phil

Reply to
Phil Hobbs

WKB is common in approximate quantum theory, e.g. solid state.
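(The flavor that usually shows up there is the tunneling estimate; assuming the standard textbook form, with x_1 and x_2 the classical turning points:)

```latex
T \approx \exp\!\left(-\frac{2}{\hbar}\int_{x_1}^{x_2}\sqrt{2m\,\bigl(V(x)-E\bigr)}\;dx\right)
```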

Cheers

Phil Hobbs

Reply to
Phil Hobbs

GCC remains well off the pace, 30% to 2x slower depending on the code. I have just downloaded the latest Intel 2023 version to test.

I found the MS one was much more cunning on complicated difference expressions, where the Intel one was often generating pipeline stalls of around 80 cycles. But it is very sensitive to code generation options and linker conventions; SSE2, AVX, AVX2 and AVX512 are each worth a try, and some are much faster. The two snippets that show the most bizarre speed variations with compiler options are Gooding's S3 & S9 starters.

Worst case, it does all the work nicely in SIMD instructions and then slams the results into memory and loads them onto the x87 stack as the function return! The store and immediate load waits for memory ready.

The worst x86 compilers generate x87 code for the entire last line of every function, which is then much slower than SSE2 or AVX code.

double Gooding_S3(double e, double M)
{
    return M + e*sin(M)*(1 + e*cos(M)); // slow and fairly useless
}

double Gooding_S9(double e, double M)
{
    // originally written in their paper as sin(M)/sqrt(1 - e*cos(M) + e*e);
    // it is one form of Halley's method but also derived as the root
    // of a simple quadratic approximation for E-M. Rather good and fast!
    double s, y;
    y = 1 - e;
    if (M == 0.0) return M; // defend against divide by zero
    s = sin(M/2);
    return M + 2*e*s*sqrt((1 - s)*(1 + s)/(y*y + 4*e*s*s));
    // or return M + sin(M)/sqrt(y*y + 4*e*s*s)
}

The really odd thing is that sometimes the much more complicated and accurate S9 code runs faster than the simpler S3 code. Even having examined the generated code I can't see why this should be, but empirically that is what I observe when benchmarking them. I suspect the S9 computation parallelises in hardware more cleanly.

sqrt on 12-series Intel CPUs benchmarks as faster than divide -- I have no idea how they do that! It must be an artefact of how I benchmark.

Reply to
Martin Brown

Well, tell that to my friend who almost died from it. And still could because medically they can only give him something to prevent further clots. The existing clots have to dissolve on their own. Hopefully ...

Reply to
Joerg

I am not at liberty to go into great detail, but in a nutshell the DSP was there to calibrate a multi-channel RF system via FFT with respect to amplitude and phase. High precision was required. Theoretically it could, of course, be done with the FFT, but it took way too long and it didn't always converge to the precision they needed. The software also was, let's say, a bit temperamental.

My time-domain routine didn't need any golden numbers and converged every single time within less than half a second. We let the uC handle that because the computational load dropped to peanuts. The big DSP became unemployed.

The project start was the usual, everyone saying that FFT was the name of the game and there wasn't any other decent way. If it didn't work in time domain I'd have to buy everyone a beer at night. If it did, everyone had to buy me a beer. I needed a designated driver that night ...

Reply to
Joerg

That problem I don't have. It already fell out shortly after turning 20.

Too much global warming :-)

IIRC they downsized it and saved a bundle in the wake.

Reply to
Joerg

The power pad thing is easily doable if you use paste and a hot plate. For hand soldering, if you put a bit of flux on the pad beforehand, you can wick solder up the thermal vias fairly well. A bigger PTH makes it easier.

Cheers

Phil Hobbs

Reply to
Phil Hobbs

I did it for just about everything. Except when the bus was being redlined in terms of what the driver chips could do and the timing was being "tweaked", but IMO that's bad design to begin with.

Forgot the name of it, but National Semiconductor made a nice clamp IC to avoid black level droop. There were also Japanese ones, but I never used them in product designs because availability was iffy and many clients were married to certain local distributors for reasons I'll never understand. Those usually didn't stock the Asian black level clamp chips.

The trick is to never use much more than the minimum capacitance for the given bus length. Maybe 50-100% margin but not huge margins. That also cut down on the power dissipation for fast-changing signals. Can be calculated or simulated but sometimes I had a small variable capacitor in my briefcase. With ceramic carrier, an L-shaped stand and a big imposing knob. It was quite the head turner in the labs.

Reply to
Joerg
