scientists as superstars

On a sunny day (Tue, 11 Aug 2020 14:00:18 +0100) it happened Martin Brown wrote in :

I agree, and funny you should mention the Z80; I got an email from somebody a few months ago saying they are using my dz80 disassembler:

formatting link
I have had more mail about that in the past. And this newsreader is basically from 1998 (with some new stuff added later, when they attempted to make Usenet HTML, but that HTML did not catch on):
formatting link

Some webcam software I wrote is used for what you just described. As to 0): I have programmed a lot of PICs, in asm, close to the hardware. In a very small code space you get near-zero boot time, near-zero power consumption and very high speed. Try real-time video processing on one of those:

formatting link

So the hardware it runs on is extremely important! Contrast the Java way, where supposedly it is not: today I had to work my way through two bank sites, and it always gets your adrenaline going, so slow and weird are the logins. Is my browser still good enough? And to run the latest browser you need to update the OS, and the OS wants better hardware...

As to compiler warnings: in gcc I always use -Wall, but when porting code from x86 to ARM I found gcc gives mysterious warnings, one of which I have never been able to resolve. I googled for that warning, others had it too, and finally I left it that way. For the rest, most of what I have written does a clean compile, unlike the endless warning listings I have seen from others. Compilers are not perfect.

I like asm because there is no misunderstanding about what I want to do. And really, asm is NOT harder but in fact much simpler than C++. C++ is, in my view, a crime against humanity. To look at computing as objects is basically wrong. It gets worse with operator overloading, and it does not stop there. And indeed the compiler writers hardly agree on what is what.

C is simple: you can define structures and call those objects if you are so inclined, you can specify the bit width of fields, and basically everything (see the sketch below). Anyway, I am getting carried away. Would it not be nice if those newbie programmers started with some embedded asm, just to know what is happening under the hood, so to speak?
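A minimal sketch of that bit-width point, using C bit-fields; the register layout here is made up for illustration, and bit-field ordering is implementation-defined, so real hardware code pins down the compiler and ABI first:

#include <stdio.h>
#include <stdint.h>

struct status_reg {          /* hypothetical peripheral status register */
    uint8_t ready   : 1;     /* bit 0 */
    uint8_t error   : 1;     /* bit 1 */
    uint8_t channel : 3;     /* bits 2-4 */
    uint8_t         : 3;     /* unused padding, bits 5-7 */
};

int main(void)
{
    struct status_reg r = { .ready = 1, .channel = 5 };
    printf("ready=%u channel=%u size=%zu byte\n",
           (unsigned)r.ready, (unsigned)r.channel, sizeof r);
    return 0;
}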

Reply to
Jan Panteltje

On Tuesday, 11 August 2020 at 16:50:28 UTC+2, snipped-for-privacy@highlandsniptechnology.com wrote:

what's the definition of "bad code"?

what are the rules?

Reply to
Lasse Langwadt Christensen

That's your best deadpan line to date. Keep 'em coming!

Cheers

Phil Hobbs

Reply to
pcdhobbs

Code that can contain or allow viruses, trojans, spyware, or ransomware, or can modify the OS, or use excess resources. That should be obvious.

A less severe class of "bad" is code that doesn't perform its intended function properly, or crashes. If that annoys people, they can stop using it.

Don't access outside your assigned memory map. Don't execute anything but what's in read-only code space. Don't overflow stacks or buffers. Don't access any system resources that you are not specifically assigned access to (which includes devices and IP addresses.) Don't modify drivers or the OS. The penalty for violation is instant death.

Let's get rid of virtual memory too.

Some of those rules just make programmers pay more attention, which is nice but not critical. What really matters is that the hardware and OS detect violations and kill the offending process.
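A tiny C program shows what that looks like on any machine with an MMU; no cooperation from the programmer is required, the hardware traps and the OS kills the process:

#include <stdio.h>

int main(void)
{
    volatile int *p = (int *)0x1;   /* an address we were never assigned */
    printf("about to violate the memory map...\n");
    fflush(stdout);
    *p = 42;                        /* hardware traps; OS delivers SIGSEGV */
    printf("never reached\n");
    return 0;
}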

Hardware designers usually get things right, which is why FPGAs seldom have bugs but procedural code is littered with errors. Programmers can't control states, if they understand the concept at all.

Most of the protections we need here were common in 1975. Microsoft and Intel weren't paying attention, and a culture of sloppiness and tolerance of hazard resulted.

--

John Larkin         Highland Technology, Inc 

Science teaches us to doubt. 

  Claude Bernard
Reply to
jlarkin

Absolutely.

Wasting some execution speed on a pseudocode approach is worthwhile. An x86 runtime can be made more reliable than random compiled machine-code applications could ever be... until we get rid of x86, anyhow. That would be a Python-like language (which, in appearance and implementation, is awfully similar to DEC's BASIC-PLUS, which was bulletproof).

--

John Larkin         Highland Technology, Inc 

Science teaches us to doubt. 

  Claude Bernard
Reply to
jlarkin

Anybody who can throw coal into a furnace can learn to program as well...

All too funny.

--

John Larkin         Highland Technology, Inc 

Science teaches us to doubt. 

  Claude Bernard
Reply to
jlarkin

In that case it would be both unnecessary and impossible for Intel to change the operation of their processors once they are up and running on boards in customer premises.

Intel does just that; it is the key to their being able to (partially) contain and mitigate the recent security flaws. Fundamentally the x86 ISA is unchanged, but the implementation of the ISA is changed.

Er. No, those systems most definitely weren't bulletproof.

Some of my friends comprehensively owned the university PDP11 in the late 70s. The university sysadmins were never able to nail the perps, nor were they able to regain control.

The perps used many techniques, including replacing the system monitoring and control programs with their own doctored versions. Naturally, part of the doctoring was to prevent the programs from detecting that they had been doctored, or from detecting the extra programs that were always running.

Another friend also managed to subvert a DEC VAX a few years later.

Fundamentally, if you have physical access to a machine, it is game over!

Reply to
Tom Gardner

There are several examples of such. Z and VDM are amongst the foremost formal specification languages that can do it if used correctly. The snag is that it takes a very skilled, mathematically trained practitioner to use them, and they are in short supply. We need to deskill programming to the point where the machine takes on some of the grunt work that catches people out.

Even wizards make the occasional fencepost error; it is just that we *expect* to make them sometimes, and test for any rough edges.

It just means that when it dies horribly it doesn't take anything else with it. OS/2 had pretty effective user-mode segmentation defences; Windows effectively dismantled them and pulled off a VHS-vs-Betamax takeover.

Sometimes you need to break the rules to get things done. Mickeysoft in its infinite wisdom withdrew the ability to use 80-bit reals in C after v6.0. I used to keep a copy on hand for awkward numerical work. I have now made a small library of routines that will subvert the CPU into 80-bit mode when I really need that extra precision. Unfortunately it means I have to look at the optimiser output and sometimes hand-tweak it so that all crucial intermediate results stay on the x87 stack.
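For what it is worth, gcc on x86 still exposes the x87 80-bit format as plain long double, so a sketch like this gets the extra precision without hand-tweaked asm (MSVC maps long double onto the 64-bit double, which is the limitation being worked around above):

#include <stdio.h>
#include <float.h>

int main(void)
{
    printf("double     : %d mantissa bits\n", DBL_MANT_DIG);   /* 53 */
    printf("long double: %d mantissa bits\n", LDBL_MANT_DIG);  /* 64 with x87 */

    /* a difference representable at 80 bits but lost at 64 */
    long double one = 1.0L, eps = LDBL_EPSILON;
    printf("(1 + eps) - 1 = %Lg\n", (one + eps) - one);
    return 0;
}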

One interesting observation on IEEE FP, based on my recent work: 2^x-1 and y*log2(1+x) are implemented very nicely, and cos(x) is OK because the identity cos(x) = 1 - 2*sin(x/2)^2 allows cos(x)-1 to be computed reliably to full numerical precision.

No such easily computed closed expression is available for x-sin(x), a functional form that appears in several important physics problems. Most practitioners are forced to roll their own, and some inevitably get it wrong. You have to add the terms together from smallest to largest to stop cumulative rounding errors from distorting the result.
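A sketch of the roll-your-own version, with an illustrative (not tuned) cutoff and term count: for small x, sum the Taylor series x^3/3! - x^5/5! + x^7/7! - ... adding the terms smallest-first, exactly as prescribed above:

#include <stdio.h>
#include <math.h>

static double x_minus_sin(double x)
{
    if (fabs(x) > 0.5)                /* large x: direct form cancels little */
        return x - sin(x);

    double term[12];                  /* 12 terms is ample for |x| <= 0.5 */
    double t = x * x * x / 6.0;       /* first term, x^3/3! */
    int n;
    for (n = 0; n < 12 && fabs(t) > 0.0; n++) {
        term[n] = t;
        t *= -x * x / ((2.0 * n + 4.0) * (2.0 * n + 5.0)); /* next term */
    }
    double sum = 0.0;
    while (n-- > 0)                   /* smallest magnitude first */
        sum += term[n];
    return sum;
}

int main(void)
{
    double x = 1e-4;
    printf("naive : %.17g\n", x - sin(x));      /* only a few digits right */
    printf("series: %.17g\n", x_minus_sin(x));  /* ~ x^3/6 = 1.666...e-13 */
    return 0;
}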

It was ever thus. All that has changed is the frequency and gulp size of updates - some really hurt now when you are on a wet string connection.

Probably because there is so much of it about and many US lobbyists.

--
Regards, 
Martin Brown
Reply to
Martin Brown

That's just another variable type in PowerBasic.

--

John Larkin         Highland Technology, Inc 

Science teaches us to doubt. 

  Claude Bernard
Reply to
jlarkin


An interesting dichotomy. It is OK to disrupt the world's computing resources that the world economy depends on in order to prevent computer viruses, but it's not OK to disrupt the economy in countries where a virus is killing 10 thousand people per week.

I use a programming language called Forth. Some of the Forth programming environments are recognized by AVS as infected when they are not. It is virtually impossible to get the AVS companies to provide any info on how to write code to prevent false-positive detection. It becomes a guessing game.

I think this silly idea would result in nearly every program being flagged as "bad" in one way or another.

The company, the developer or the user?

You can still run CP/M on a Z80 if you'd like. They are pretty fast these days... oops, that's Z80 emulations on real computers.

Obviously this is needed, because at present there is zero incentive to make software not crash.

I really don't know about this guy. A lot of the time he sees the world through crap-colored glasses. My laptop virtually never crashes, other than the Microsoft-mandated crashes it does periodically to update the OS.

LTspice is the biggest crasher on my system. Should LTspice be blocked from running?

FPGAs have fewer bugs because they can be tested better and are typically a lot simpler than the software that runs on such hardware. It is very hard to test all the millions or billions of permutations in the software.

If you want software to be more reliable, don't ask it to do such complex tasks.

So you must still be running CP/M then?

--

  Rick C. 

  -++ Get 1,000 miles of free Supercharging 
  -++ Tesla referral code - https://ts.la/richard11209
Reply to
Ricketty C

Seconded. My boxes generally have minimal swap space.

It's got a lot harder to do since 1975. See e.g. this very readable and illuminating paper, entitled "C is not a low-level language. Your computer is not a fast PDP-11."

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
ElectroOptical Innovations LLC / Hobbs ElectroOptics 
Optics, Electro-optics, Photonics, Analog Electronics 
Briarcliff Manor NY 10510 

http://electrooptical.net 
http://hobbs-eo.com
Reply to
Phil Hobbs

Yup.

I can't remember the last time my programs exceeded physical memory.

disk is the new tape
dram is the new core
cache is the new ram

and I'm not sure where NUMA fits into that.

Regrettably not.

Human stupidity, laziness and misunderstanding are constants.

Yes, C hit an abstraction (all the world is a PDP-11) that was good for a decade, but has since caused untold pain.

New computation models are like scientific theories: they change one (programmer) death at a time.

The new computational models presume multicore and distributed processing. Good.

Now all we have to do is get programmers to understand the concepts of partial system failure, that "a single universal time" is heretical, and the eight fallacies of distributed computing.

Reply to
Tom Gardner

There is nothing much wrong with the Intel hardware: it has been able to support fully segmented, protected address spaces for processes ever since the 386. IBM OS/2 (and to a lesser extent NT) was quite capable of terminating a rogue process with extreme prejudice and no side effects.

Other CPUs have more elegant instruction sets but that is not relevant.

The trouble is that Windows made some dangerous compromises to make games go 5% faster or whatever the actual figure may be. There is far too much privileged kernel code and not enough parameter checking.

In addition too many Windows users sit with super user privileges all the time and that leaves them a lot more open to malware.

More like medieval cathedral builders making very large buildings - if it is still standing in five years' time then it was a good one.

Ely and Durham cathedrals came uncomfortably close to falling down due to different design defects. Several big UK churches famously have crooked spires to say nothing of the leaning tower of Pisa.

--
Regards, 
Martin Brown
Reply to
Martin Brown

It is very obvious that you have no understanding of the basics of computing. The halting problem shows that what you want is impossible.

You cannot tell reliably what code will do until it gets executed.

Most decent software does what it is supposed to most of the time. Bugs typically reside for a long time in seldom-trodden paths that should never normally be taken, like error recovery in weird situations.

C invites certain dangerous practices that attackers ruthlessly exploit, like loops that copy until they hit a null byte.
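The canonical instance, for anyone who has not been bitten yet: a NUL-terminated copy with no length in sight, next to a bounded version (the 16-byte buffer is arbitrary):

#include <stdio.h>
#include <string.h>

static void unsafe(const char *input)
{
    char buf[16];
    strcpy(buf, input);             /* overruns buf if input >= 16 bytes */
    printf("%s\n", buf);
}

static void safer(const char *input)
{
    char buf[16];
    /* bounded copy; snprintf always NUL-terminates within the limit */
    snprintf(buf, sizeof buf, "%s", input);
    printf("%s\n", buf);
}

int main(void)
{
    safer("a string much longer than sixteen bytes");
    /* calling unsafe() with the same input corrupts the stack */
    return 0;
}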

That is motherhood and apple pie. It allows other programs and tasks to keep running, and was one of the strengths of IBM's OS/2; but apart from bank machines and air traffic control, hardly anyone adopted it :(

IBM soured the pitch by delivering it late and not quite working, and by conflating it with the horrible PS/2 hardware lock-in that forced their competitors to collaborate and design the EISA bus; the rest is history.

You are going to waste a lot of time checking against all these rules, which will themselves contain inconsistencies after a while.

Why? Disk is so much cheaper than RAM, and plentiful. SSDs are fast too.

One check that you can do either in hardware or in software is to catch any attempt to fetch an undefined value from memory. These days there are a few sophisticated compilers that can do this at *compile* time.

One I know of (Russian, as it happens) by default compiles a hard runtime trap at the location of the latent fault. I have mine set to warn instead.
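That compiler is not named here, but mainstream gcc and clang give a taste of the compile-time half of this: build the fragment below with -Wall (gcc may additionally want -O1 for its flow analysis) and the undefined fetch is flagged before the program ever runs:

#include <stdio.h>

int main(void)
{
    int x;          /* never assigned */
    if (x > 0)      /* warning: 'x' is used uninitialized */
        printf("positive\n");
    return 0;
}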

Oh rubbish. You should stop using simulators and see how far you get - since all software is so buggy that you can't trust it, can you?

Intel hardware has the capability to do full segmented protected modes where you only get allocated the memory you ask for and get zapped by the OS if you try anything funny. But the world went with Windows :(

I blame IBM for their shambolic marketing of OS/2.

--
Regards, 
Martin Brown
Reply to
Martin Brown

Assuming it is a Pentium-class machine and you don't do much video editing, 3D rendering or gaming, you may find that for 2D graphics the built-in Intel HD Graphics 4000 is actually faster at 2D than most high-performance graphics cards. No use to you at all if you are using programs that subvert the graphics card to do computation, though.

There may be a way in your BIOS to disable the Nvidia card temporarily and check it out. It's a waste to be running a texture-rendering engine when all you are doing is graphs and web browsing.

My office machine uses Intel 4000 graphics only and consumes under 60 W when not working hard and 100 W flat out. A graphics card would double or triple that consumption. The only snag I see is that I cannot run the latest AI chess engines on it, since they require a GPU cluster.

--
Regards, 
Martin Brown
Reply to
Martin Brown

I would have tried that, but my AMD 3700X doesn't have inbuilt graphics - so it has to be an external card.

Reply to
Tom Gardner

I've written maybe a million lines of code, mostly realtime stuff, and three RTOSs and two or three compilers, and actually designed one CPU from MSI TTL chips, that went into production. I contributed code to FOCAL (I'm named in the source) and met with some of the guys that invented the PDP-11 architecture, before they did it. Got slightly involved in the dreadful HP 2114 thing too.

Have you done anything like that?

Bulletproof memory management is certainly not impossible. It's just that not enough people care.

"Computer Science" theory has almost nothing to do with computers. I've told that story before.

You can stop it from ransoming all the data on all of your servers because some nurse opened an email attachment.

The real dollar cost of bad software is gigantic. There should be no reason for a small or mid-size company to continuously pay IT security consultants, or to run AV software.

Let bad programs malfunction or crash. But don't allow a stack or buffer overflow to poke exploits into code space. The idea of separating data, code, and stack isn't hard to understand, or even hard to implement.
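On POSIX systems the mechanism already exists; a Linux-flavoured sketch (error handling trimmed) of keeping a page writable or executable but never both:

#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

int main(void)
{
    /* one page of writable, NON-executable memory */
    unsigned char *page = mmap(NULL, 4096, PROT_READ | PROT_WRITE,
                               MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (page == MAP_FAILED) return 1;

    memset(page, 0xC3, 4096);        /* fill with x86 RET opcodes */

    /* jumping into the page now would trap (SIGSEGV); only an explicit,
       auditable request flips it to executable, and a strict W^X policy
       is free to refuse: */
    if (mprotect(page, 4096, PROT_READ | PROT_EXEC) != 0) return 1;

    puts("page is now read/execute; any write to it would trap instead");
    return 0;
}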

We probably need to go to pseudocode-only programs. The machine needs to be protected from programmers and from bad architectures. Most programmers never learn about machine-level processes.

Or push everything into the cloud and not actually run application programs on a flakey box or phone.

My point. Why do you call me ignorant for wanting hardware-based security?

The problem circles back: the compilers are written, and run, the same way as the application programs. The software bad guys will always be more creative than the software defenders.

I've done nontrivial OTP (antifuse) CPLDs and FPGAs that worked first pass, without simulation. First pass. You just need to use state machines and think before you compile. People who build dams understand the concept. Usually.

Have you ever written any code past Hello, World! that compiled error-free and ran correctly the very first time? That's unheard of.

--

John Larkin         Highland Technology, Inc 

Science teaches us to doubt. 

  Claude Bernard
Reply to
jlarkin

I'm sure you have done that, but Martin is correct.

The core problem is fundamental: data is numbers and programs are numbers. The only difference is in how the numbers are interpreted by the hardware.

So, to ensure bulletproof memory management, you have to ensure data cannot be executed. That rules out things like JITters and general purpose compilers.

I've never used them but I /believe/ the only computers that achieve that are the Unisys/Burroughs machines, by ensuring only their compilers can generate code that can be executed - and keeping the compilers under lock and key.

It does, however, put solid limits on what computers can and cannot achieve.

One hardware analogy is Shannon's law, but there are others :)

People that blunder into electronics and make statements equivalent to breaking Shannon's law are correctly regarded as ignorant cranks.

That's what anti-virus packages *attempt* to do. And my, don't they work well.

Yup.

Then you need to protect the pseudocode interpreter, and you are back where you began.

Good luck with that AI project.

"Oooh goodie" say the malefactors. A single attack surface :)

Your desire is understandable.

Your proposed implementation cannot work as you wish.

Whether it would be better than current standards is a different question.

Yup.

Plus the malefactors are highly incentivised, whereas the capitalist business imperative doesn't incentivise the good guys.

Good luck fixing that :(

Martin is correct.

I've created a semi-custom IC design with a three-month fabrication turnaround, which worked first time.

Yes, I have.

Reply to
Tom Gardner

On a sunny day (Wed, 12 Aug 2020 07:30:18 -0700) it happened snipped-for-privacy@highlandsniptechnology.com wrote in :

That is silly. I normally write incremental code, one part at a time; thousands and thousands of lines of code that simply work, in C or asm. Any compiler complaints are usually typing errors.

For the code I release as open source you can see it for yourself.

There is more in the world than state machines. Maybe the problem is that many people cannot think logically; those people should not be programming, but they are likely good at other things.

The other thing is that over the years you build up a collection of routines you can then just cut and paste into new projects.

From the other side (just to stay with verifiable stuff, so you can check for yourself): hacking the Hubsan drone and building an autopilot took maybe 2 weeks at a couple of hours a day, in PIC asm plus C for the PC part;

formatting link
There was a short discussion with somebody from Germany in the drone group about the secret format on the board test point I used to grab the data and hack it; by the time he got back I had already cracked it. And those two weeks include the electronic design to actually fly the thing. As the thing is still in one piece, I think bugs, if present at all, are not an issue. I know the limitations, but that is another matter.

Not done much flying lately; the government created a no-fly zone, as I am close to the military airport and they know I was targeting the new F35.. ;-) Those fly over once a day to check I still behave, I guess. Pity you cannot hear anything when that happens, not even with Sennheiser HD201 headphones on.

0) know the hardware
1) know how to write code
2) know in depth about what you code for

With any of these 3 missing, the result will not be optimal: probably bloated, power-sucking, slow-booting... what have you.

It is not so difficult to write down some instructions for a machine: do this, then do that. But if YOU have no clue, then the machine following those instructions will not work right either.

It, programming, is like learning a language, any language. But to be able to say only Hello World? Or to write a novel? Or even an instruction book, say a bomb-defusing manual:

1) turn big bolt 90 degrees left
2) before you do that, pull pin

Got it?
Reply to
Jan Panteltje

and it all comes from the same hard drive

Reply to
Lasse Langwadt Christensen
