scientists as superstars

Am 15.08.20 um 05:13 schrieb Les Cargill:

"Never underestimate the bandwidth of a station wagon full of mag tapes"

I think that was Andy Tanenbaum in "Structured Computer Organization", about 35 years ago.

Gerhard

Reply to
Gerhard Hoffmann

Yes, early x86 was microcoded.

Calling the workings of current x86 "microcoded" is such an oversimplification that it's effectively wrong. Analyzing entire sequences of x86 instructions for data dependencies in hardware, and feeding them to multiple ALUs and hundreds of internal renaming registers, has nothing in common with good old 360-style microprogramming.

cheers, Gerhard

Reply to
Gerhard Hoffmann

Reminds me of the NASA website in the late 1990s: there was an advisory that some library photos were not available for immediate download and were subject to a 20-minute retrieval time, presumably while the tape was loaded.

piglet

Reply to
piglet

Or a shirt pocket full of SD cards...

Reply to
Michael Terrell

It's not even really microcoded. It's transpiled to a different architecture in the pipeline. No wonder they need 40+ stage pipelines for that.

CH

Reply to
Clifford Heath

On a sunny day (Sat, 15 Aug 2020 21:36:27 +1000) it happened Clifford Heath wrote:

The way I see those chips, they have their own OS, complete with a login and password for the NSA (I forgot the password), but who needs it..

formatting link

formatting link

One would almost think they are drawing attention to themselves after that

formatting link

Best and safest is to design your nuke on a piece of paper; we had to do it in high-school chem class. Easy.

Reply to
Jan Panteltje

I've always liked C++. The OOP paradigm maps very naturally onto the sorts of coding I do: embedded, instrument control, and simulations.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
ElectroOptical Innovations LLC / Hobbs ElectroOptics 
Optics, Electro-optics, Photonics, Analog Electronics 
Briarcliff Manor NY 10510 

http://electrooptical.net 
http://hobbs-eo.com
Reply to
Phil Hobbs

Ditto in spades, except for C++.

I was doing a primitive version of OOP in C around '82, for embedded machine control.

When I came across OOP in '85, I instantly recognised that two customer statements mapped directly onto it:
- "I'd like three of those" => object creation
- "just like that example, except..." => class hierarchy
And seeing what was possible in Smalltalk (container classes, reflection) made me a convert!

Unfortunately ParcPlace[1] Smalltalk was totally unsuited to embedded systems, so I looked out for alternatives.

In '88 I evaluated C++ and Objective-C. The latter is really Smalltalk without a GC, so I rapidly became productive, using the available classes and adding my own. But C++ was dreadful; there was no class hierarchy, and reversing any mistaken design choice was unnecessarily painful.

Then, in the early 90s, I watched the C and C++ committees wrangling endlessly over pretty fundamental points /without/ there being /any possibility/ of adequately reconciling the two viewpoints. (Simple example: should it be allowed or forbidden to cast away a const declaration? There are good arguments for both, but the necessary choice has far-reaching implications.)
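The const-cast dilemma fits in a few lines. A minimal sketch (the function names are invented): casting away const lets const-correct new code reuse const-incorrect old code, but if the callee ever wrote through the pointer and the object were genuinely const, behaviour would be undefined.

```cpp
// A legacy, const-incorrect signature: it never writes through the
// pointer, but was declared without const.
int legacyLength(char* s) {
    int n = 0;
    while (s[n] != '\0') ++n;
    return n;
}

int callWithConst(const char* msg) {
    // Casting away const lets the old code be reused...
    return legacyLength(const_cast<char*>(msg));
    // ...but if legacyLength ever *wrote* through the pointer and msg pointed
    // at genuinely const storage (e.g. a string literal), behaviour would be
    // undefined. Hence the committee arguments: both allowing and forbidding
    // the cast have far-reaching consequences.
}
```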

At that point I realised the language was building a bigger castle on sand. Not a good position to be in.

Then, when I first used Java in '96, people were amazed at how quickly I could create complex applications (3D graphs of cellular system performance). That was because after only a couple of years, Java came with a large highly functional class library - something that C and C++ had conspicuously failed to manage in a decade. And you could simply plug in random libraries from random companies, intertwine them with your data, and it simply /worked/ as expected.

At that point I completely gave up on C++, on the basis that if C++ is the best answer, you've asked the wrong question!

And in the decades since it has been confirmed that typical C++ programmers are content with tools that usually work as they expect, e.g. Boehm's garbage collector and compiler optimisation.

So, the sooner we can "de-emphasise" C++ the better. Just like COBOL.

[1] Digitalk Smalltalk in ~90 was completely different, and was embedded into some HP equipment (and I believe Tek equipment).
Reply to
Tom Gardner

IOW you've never used anything newer than circa-1989 C++. I agree with you about that language, but not about C++98 or newer. Most of my stuff is C++03-ish, but I'm warming up to the standard library and the C++11-17 features.
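For anyone who hasn't looked since the early days, a small hypothetical sketch of the post-C++11 flavour being referred to (range-for, auto, make_unique; function names invented):

```cpp
#include <memory>
#include <vector>

// No raw new/delete, no index-fiddling loops.
double meanOfSquares(const std::vector<double>& xs) {
    if (xs.empty()) return 0.0;
    double sum = 0.0;
    for (double x : xs)      // range-for (C++11)
        sum += x * x;
    return sum / xs.size();
}

std::unique_ptr<std::vector<double>> makeSamples() {
    auto v = std::make_unique<std::vector<double>>();  // make_unique (C++14)
    *v = {1.0, 2.0, 3.0};
    return v;                // ownership moves out; nothing to free by hand
}
```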

Cheers

Phil Hobbs

Reply to
Phil Hobbs

Used, no. But as I mentioned I kept an open eye (and mind) on what the committees were up to.

After 8 years of heated nattering, there was insufficient progress, especially compared with what was being achieved in other languages in a quarter of that time.

They couldn't get their act together, so I moved to languages where they /had/ got their act together. For simple embedded stuff I still used C.

Yebbut, even that late they hadn't got their act together w.r.t. threading.

Yebbut even in 2005 they hadn't, and many people had forgotten that they /couldn't/ - hence Boehm's paper

formatting link
formatting link

Finally, almost a quarter of a century(!) later, in 2011, a memory model appeared. IMHO it will take a long time to be proven sufficient and correctly implemented.

Quarter of a century is rather a long time for a language to be insufficient to implement a major library (Pthreads)!
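A minimal sketch of what that 2011 memory model made expressible: the release/acquire handoff that, per Boehm's argument, could not be written portably as a pure library on top of pre-C++11 C++.

```cpp
#include <atomic>

// A flag/data handoff between threads. Before C++11 this had no defined
// meaning in the language at all; with the C++11 memory model, the
// release store "publishes" the ordinary write, and the acquire load
// synchronises with it.
int payload = 0;
std::atomic<bool> ready{false};

void producer() {
    payload = 42;                                  // ordinary write...
    ready.store(true, std::memory_order_release);  // ...published here
}

int consumer() {
    while (!ready.load(std::memory_order_acquire)) { }  // spin until published
    return payload;  // guaranteed to observe 42: acquire pairs with release
}
```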

If I'm being catty I'll ask if you use the same subset of the language as those around you and the libraries you use :)

How much of the C++ Frequently Questioned Answers has become obsolete now?

formatting link
That is a laugh-and-weep diatribe with too much truth.

My embedded stuff is now multicore and hard realtime. My preference is xC, since that has multicore parallelism baked in from the beginning, not bolted on as an afterthought. Unfortunately that is only on the (delightful) xCORE processors, so I also keep an eye on the progress of others such as Rust. Time will tell.

Reply to
Tom Gardner

I was using pthreads very successfully 2003ish. AFAIK the first microcomputer OS that supported multithreaded programming was OS/2 2.0 in 1992, so that's not 20 years in my book. Between 1992 and 2003 I was writing multithreaded C programs on the OS/2 and Windows OS thread APIs, which worked fine, partly because the compiler vendors also owned the OS. ;)

The most beautiful debugger I have ever used is the one that came with VisualAge C++ v. 3.08 for OS/2, circa 2001. Streets ahead of anything I've seen on Linux to this day--hit 'pause' and all the threads paused _right_now_, rather than the UI thread pausing now and the others not till the end of their timeslice. That worked even on my SMP box, not just on uniprocessors.

Incompatibility of third-party libraries was a serious problem in the beginning, for sure. Everybody and his dog had his own complex number type, for instance. Back then, my solution (which worked fine for my purposes) was to stick with the standard headers and stuff I wrote myself.
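These days std::complex is the one complex type everybody shares. A small sketch (the impedance function is an invented example):

```cpp
#include <cassert>
#include <cmath>
#include <complex>

// Series RLC impedance Z = R + jwL + 1/(jwC), using the shared standard type
// instead of yet another home-grown complex class.
std::complex<double> seriesRLC(double r, double l, double c, double omega) {
    const std::complex<double> j(0.0, 1.0);
    return r + j * omega * l + 1.0 / (j * omega * c);
}
```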

Nowadays with namespaces and a much more capable standard library, there's much less reason for that. There are still warts, for sure, of which the one I love most to hate is iostreams. It's okay for light-duty use, but the moment you try to do formatted output you disappear into the long dim corridors of <iomanip>, perhaps never to emerge. ;)
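The "corridors" complaint in miniature: the same fixed-width formatting, once as a printf specifier and once spelled out with <iomanip> manipulators (an illustrative sketch, not anyone's production code):

```cpp
#include <cstdio>
#include <iomanip>
#include <sstream>
#include <string>

// printf style: the whole format in one specifier, "%8.3f".
std::string withPrintf(double x) {
    char buf[32];
    std::snprintf(buf, sizeof buf, "%8.3f", x);
    return buf;
}

// iostreams style: the same format as three separate <iomanip> manipulators.
std::string withIostreams(double x) {
    std::ostringstream os;
    os << std::setw(8) << std::fixed << std::setprecision(3) << x;
    return os.str();
}
```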

Dunno. I skimmed through it once iirc but wasn't that impressed.

Well, horses for courses. But tarring C++ in 2020 with a brush from 1989 is unpersuasive.

Cheers

Phil Hobbs

Reply to
Phil Hobbs

You had a beautiful tool set for the day though!

Cheers

Phil Hobbs

Reply to
Phil Hobbs

Sure, but that will have depended on things that are (were?) explicitly not defined in C. Pthreads operation would have been based on implementation-dependent behaviour.

That is inelegant at best, and fragile at worst. (Just like castles built on sand.)

Yes, multicore/thread debuggers aren't usually very good.

That's why I aim to get core functionality debugged in a single thread, and then rely on simple and predictable and reliable "high level" multithread/core design patterns.

That excludes rolling my own mutex-based code, and implies re-using well-conceived and well-tested libraries built on a solid base. Often those are RTOS libraries or libraries inspired by RTOS libraries.
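The kind of RTOS-style primitive meant here can be sketched as a minimal blocking queue: one small, well-tested component that application code reuses, instead of raw mutexes scattered everywhere (assumed class name BlockingQueue, for illustration):

```cpp
#include <cassert>
#include <condition_variable>
#include <mutex>
#include <queue>

// Minimal blocking queue: producers push, consumers pop, and all the
// locking lives in one audited place.
template <typename T>
class BlockingQueue {
public:
    void push(T value) {
        {
            std::lock_guard<std::mutex> lock(m_);
            q_.push(std::move(value));
        }                       // release the lock before waking a waiter
        cv_.notify_one();
    }

    T pop() {                   // blocks until an item is available
        std::unique_lock<std::mutex> lock(m_);
        cv_.wait(lock, [this] { return !q_.empty(); });  // handles spurious wakeups
        T v = std::move(q_.front());
        q_.pop();
        return v;
    }

private:
    std::mutex m_;
    std::condition_variable cv_;
    std::queue<T> q_;
};
```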

Or boolean, or String, or..., and especially generic container classes.

Continually reinventing the wheel seemed a waste of my life, especially when the wheels came out slightly off-centre. Doubly so when it demonstrably wouldn't have been necessary if I'd had a better starting point.

As the old joke goes... "How do I get to the Blarney Stone?" "Well sir, if I wanted to go there, I wouldn't start from here".

I can believe that, and might be persuaded such complexity was inevitable and tolerable if buried in a library.

OTOH, I've heard many stories about the interaction of core language features such as exceptions and templates. That just made me think "dragons; avoid".

It is a "many a truth is hidden in jest" type of diatribe.

There's validity to that, but...
- the 1989 stuff is still there and visible
- frequently you simply cannot use the latest stuff, for various corporate and technical reasons
- it is still sand, albeit with a few piles driven into the ground :)

Life moves on, hopefully to better things.

Reply to
Tom Gardner

In the same way that a bacterium is sort of like a mammal. Both have DNA. The similarity ends there.

Reply to
Clifford Heath

If you would care to compare and contrast Java and UCSD Pascal, I'd read with interest.

Cheers

Phil Hobbs

Reply to
pcdhobbs

I could, but don't care to say much. Any bytecode is like any other for the most part, but JVM is designed to JIT, and the level of sophistication in HotSpot is analogous to the mammalian superstructure. It really is an immense tower of technology. Personally I prefer the AOT idea to JIT, but I respect the achievement.

CH

Reply to
Clifford Heath

My point. Pretty far from your earlier response, which I reprise:

">> Sort of like UCSD Pascal, circa 1975. ;)

So any bacterium is like any mammal "for the most part." good to know! ;)

Cheers

Phil Hobbs

Reply to
pcdhobbs

Never confuse HotSpot with JIT.

JIT is a runtime peephole optimiser, and hence uses only local information about the code emitted by the compiler. That code is based on what the compiler can guess/presume about how the code /might/ behave.

HotSpot looks at what the code is /actually/ doing, and optimises the shit out of that.

AOT is HotSpot without knowing what the code will actually do. It optimises for the instruction set of a particular processor (and there are many variants between AMD/Intel!). I don't know how it deals with processors being changed after installation, e.g. in response to all the recent cache-timing malware attacks.

HotSpot and JIT can take account of "removed" processor functionality, by replacing the runtime.

AOT is the only game for embedded.

HotSpot has major advantages elsewhere.

Reply to
Tom Gardner

Why did it surprise anybody?

I don't think that at all, and it's not necessarily a reasonable standard, to boot. You can *reliably* produce perfectly functional work product with them, without knowing a whole lot about what's under the hood and without a whole mass of pain.

Once you find the few rocks under the water...

You'll get no argument here. But all things which have too much light on them end up in mandarinism.

And the best tools to inspect memory are built into the running application itself.

Yep; yer right.

Agreed. I really expected better progress, but you know how we are...

--
Les Cargill
Reply to
Les Cargill

Close enough:

formatting link

"You get Intermediate Language code which the runtime then compiles and executes."

I'd call that an implementation detail; it does not load the image into memory then jump to _main.

The point of my comment is that both Java and C# are considered "managed languages", especially for security purposes. I suppose somebody, somewhere is writing virii in C# but ...

Exactly.

--
Les Cargill
Reply to
Les Cargill
