Depends how you define wrong. Plenty of people go to McDs when they could have a better burger a few doors away.

?

quality, price, ethics, the usual restaurant things

People have found other ways to do the same thing en masse, thus the existence of Linux, iOS, Android etc. Sadly Android also stinks. iOS works but is more limited.


I'm not so sure. If Windows were debugged as it went, there'd be far fewer other-OS users around; they'd sell more product.

Windows really isn't good enough; it causes a mass of problems that cost end users more than debugging it in the first place would have. That's precisely why so many aren't using Windows today. Had MS taken bugs seriously all along, Windows would not have fallen in market share the way it has and continues to.

NT

Reply to
tabbypurr

I don't see why. It takes a whole class of potentially-reliable exploits and renders them essentially impossible. Your immune system has a similar type of defense-in-depth.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
ElectroOptical Innovations LLC / Hobbs ElectroOptics 
Optics, Electro-optics, Photonics, Analog Electronics 
Briarcliff Manor NY 10510 

http://electrooptical.net 
http://hobbs-eo.com
Reply to
Phil Hobbs

Martin Brown wrote in news:qkp6q2$13rd$ snipped-for-privacy@gioia.aioe.org:

I made hardware IP encryption modems for the military. Devices required on both ends.

NOBODY without authorization gets in, and NOBODY gets to read the data packets to any useful end.

OS/2 died because Microsoft screwed them when they reneged on their agreement to provide the full Win32 API to them. The same demise befell Quarterdeck's DESQview/X.

OS/2 was far superior, and by now it would have easily supplanted all the Linux- and UNIX-based OSes, and it definitely would have killed off Windows' poor operating model and the subsequent OS. IBM is the best, after all.

However the PS/2 thing was their biggest mistake at the hardware level as well as for OS/2.

They were probably trying to cut down on via count or some such with MCA. Hardware did not get the same growth path that chips enjoyed, miniaturization-wise. Now it has caught up quality-wise, as folks no longer worry about things like via count.

Look how well long, thin gold-plated wires play with the Ethernet spec. Billions of connections around the world, no longer worrying about silly oxidation and such. I can remember when 'fixing' a 'broken down' Pac-Man upright video game many times merely involved taking the edge-card connector off and on a few times to put a fresh burnish on the fingers. Back up and runnin'. Now we have chips with over 2000 "faces" that all get reliably connected. Amazing!

Reply to
DecadentLinuxUserNumeroUno

Proper hardware protection would allow *any* loaded image to run safely.

With proper memory management, buffer overflow exploits should be impossible.

The problem isn't syntax, it's the way C compilers manage code, data, stacks, buffers, pointers, and libraries. They generally dump all of them into one privileged space, where anything can be executed.
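
As a minimal illustration (assuming a typical flat-memory C toolchain such as gcc on Linux), a few lines of C are enough to see code, static data, stack and heap all sitting in the one address space being complained about here:

/* Prints where the toolchain placed code, data, stack and heap.
   Casting &main to void* is technically non-portable, but it works
   on the usual desktop targets. */
#include <stdio.h>
#include <stdlib.h>

int global_datum = 42;                       /* static data */

int main(void)
{
    int on_stack = 0;                        /* stack       */
    int *on_heap = malloc(sizeof *on_heap);  /* heap        */

    printf("code:  %p\n", (void *)&main);
    printf("data:  %p\n", (void *)&global_datum);
    printf("stack: %p\n", (void *)&on_stack);
    printf("heap:  %p\n", (void *)on_heap);

    free(on_heap);
    return 0;
}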

The PDP-11 and the 68K had proper hardware protections. Intel did not.

RISC-V seems to have sensible memory management. A multiprocessor version could be absolutely safe, if anyone cared to implement it.

Reply to
jlarkin

Dorian is now headed about 140 degrees away from the path that most models predicted a week ago.

Reply to
jlarkin

My understanding is it has to do with an unexpected blast of hot air from the west coast.

--

  Rick C. 

  ++ Get 1,000 miles of free Supercharging 
  ++ Tesla referral code - https://ts.la/richard11209
Reply to
Rick C

Von Neumann architecture was a compromise to make computation cheaper with the vacuum tubes and relays available at the time, and it has persisted to the present day.

Reply to
bitrex

Like, if you don't want to write self-modifying code (one of the biggest security threats there is), why in the world should executable code and runtime data share the same memory space?
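
A small sketch of the point (assuming a conventional Linux/x86 build where the .text segment is mapped readable): on a von Neumann machine the instruction stream is just bytes in the same address space, so a program can read, and absent MMU protection even patch, its own code:

/* Reads the first bytes of main() as ordinary data. Writing through
   such a pointer is exactly what self-modifying code does; a Harvard
   machine, or an MMU marking .text read-only, forbids it. */
#include <stdio.h>

int main(void)
{
    const unsigned char *code = (const unsigned char *)&main;
    printf("first bytes of main(): %02x %02x %02x %02x\n",
           code[0], code[1], code[2], code[3]);
    return 0;
}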

Reply to
bitrex

That is a tautology. Even the caching hardware on some modern CPUs can be subverted to allow an aggressor to infer the values of data.

They were virtually impossible in the segmented model of OS/2.

Your data segment came with a length and you could trash it but step over the permitted bounds and your process would be unceremoniously terminated.

No, they are all in user space, but because an integer can be coerced into a pointer to any damn thing you like, and flat memory models have become the modern default, there is scope for tremendous abuse.
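
Two lines of hypothetical C show the coercion: any integer value can be laundered into a pointer in a flat-memory implementation:

/* Sketch: turning an arbitrary integer into a pointer. Merely forming
   the pointer compiles fine; dereferencing it is undefined behaviour
   and, without hardware bounds checks, lands wherever the integer
   says. */
#include <stdint.h>

int main(void)
{
    uintptr_t raw = 0xDEADBEEF;   /* attacker-chosen integer        */
    char *p = (char *)raw;        /* now points at "any damn thing" */
    (void)p;                      /* *p would read or clobber it    */
    return 0;
}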

I used an OS/2 C compiler in the 1990s that did put user data in separate data segments, with lengths that you got to decide at link time. The one advantage of putting the stack at one end of a big chunk of memory and the heap at the other is that you can make best use of both. The compiler itself had other problems and never went mainstream.

The problem with C is mostly down to copying nul-terminated strings that have been maliciously constructed to trample over something important.
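
A minimal sketch of that failure mode (function names and buffer size hypothetical): strcpy() trusts the terminator, so an over-long attacker-supplied string tramples whatever follows the destination, while a bounded copy merely truncates:

#include <stdio.h>
#include <string.h>

void vulnerable(const char *attacker_input)
{
    char buf[16];
    strcpy(buf, attacker_input);  /* no length check: anything past
                                     15 chars + nul overflows buf    */
}

void safer(const char *attacker_input)
{
    char buf[16];
    /* snprintf never writes more than sizeof buf bytes and always
       nul-terminates, so the worst case is silent truncation. */
    snprintf(buf, sizeof buf, "%s", attacker_input);
    printf("%s\n", buf);
}

int main(void)
{
    safer("this string is much longer than sixteen bytes");
    /* Calling vulnerable() with the same input is undefined behaviour;
       on a flat-memory machine it can overwrite a saved return
       address, which is the trampling described above. */
    return 0;
}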

Having distinct code segments that are permitted to execute and never permitting data segment execution goes a long way to stopping exploits, but you cannot prevent someone from tweaking a return address on the stack if they have sufficient knowledge of where and how to do it.

We are where we are and fixes to the Windoze OS now have a bad tendency to break anti-virus products and other bank security add-ons.

From the Intel 386 onwards the CPU had quite respectable hardware segmentation, but Microsoft's operating system developers chose not to use it.

formatting link

IBM's people got it about right with OS/2 but that proved to be a dead end like Betamax - technically superior but lousy marketing.

You can certainly make it much harder for people and code to misbehave with dedicated hardware assistance. But there is almost always a chink somewhere that allows people to do things that they ought not to.

The aggressor always has the option of code inspection of the OS and a great deal of trial and error to perfect their attack.

--
Regards, 
Martin Brown
Reply to
Martin Brown

You don't need a Harvard architecture to prevent executing data, just proper memory management hardware. Of course, you have to use it.
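
On POSIX systems, mprotect() is the hook for actually using it. Here is a minimal sketch (assuming Linux on NX-capable hardware) of the W^X discipline: a page may be writable or executable, never both:

#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

int main(void)
{
    size_t len = 4096;

    /* Data page: readable and writable, but NOT executable. */
    unsigned char *buf = mmap(NULL, len, PROT_READ | PROT_WRITE,
                              MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (buf == MAP_FAILED) { perror("mmap"); return 1; }

    memset(buf, 0xC3, len);           /* fill with x86 'ret' opcodes */

    void (*fn)(void) = (void (*)(void))buf;
    (void)fn;
    /* fn() here would fault (SIGSEGV) on NX-enforcing hardware,
       because the page lacks PROT_EXEC: data cannot be executed. */

    /* To execute it legitimately, write access must be dropped first. */
    if (mprotect(buf, len, PROT_READ | PROT_EXEC) != 0)
        perror("mprotect");

    munmap(buf, len);
    return 0;
}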

Reply to
John Larkin

Why is that meaningful? The storm is over a hundred miles wide, takes over a day to move its diameter, and the 'heading' is just an imaginary center point doing a moment-by-moment movement.

If you wish to imagine that center doing a 100 yard circular orbit inside the storm, you can easily contrive the 'heading' to change 180 degrees in a few hours, but that doesn't say which folk are getting wet. It's an irrelevance.

The storm's position today is not notably different from the estimate two days ago.

Reply to
whit3rd


A mid-'80s Vital Industries 'Squeeze Zoom' video effects system did real-time video effects with a Z80B and 768KB of RAM in broadcast-quality System M color. The edit controllers for the Sony U-matic (3/4") and the 2" reel-to-reel VTRs used 6502 processors. Those Mac computers were always crap for real work. The Commodore Amiga with the external Video Toaster hardware was much better, and cheaper.

Reply to
Michael Terrell

If you are faced with life or death, it's relevant.

The time lapse for weather predictions to degrade to zero accuracy seems to be 5 to maybe 7 days. Here on the West coast, it's usually less.

Reply to
John Larkin

Budget, budget. Small colleges weren't and still aren't drowning in Federal funds. I think there might have been one Real Professional video editing machine in the lab, I don't recall. It might have cost 10 or 15 grand, new.

It's a tough swing to even have one because if you blow all the dough on the film department's gear the comp sci and student library dept/e-mail dept start bitchin' that they're still using 486DXes from '91 and what the hell is going on?

Reply to
bitrex

The early Quadras particularly had some problems.

Reply to
bitrex

Mac Color Classics and 486DXes were still being used as workhorse machines for email and word processing at my school well into the late 1990s, early 2000s probably.
Reply to
bitrex


Some people have quite a remarkable grasp of the obvious, although his data is not actually correct. More times than not a week-out forecast is relatively accurate, although you need error bars not only on the amount of rain, sun, and temperature but also on time. If the week-out forecast is for the hurricane to make landfall in your backyard on Tuesday and it doesn't come until Wednesday, that's not a failure of prediction in my book, although obviously the pedantic might consider it such.

--

  Rick C. 

  -+- Get 1,000 miles of free Supercharging 
  -+- Tesla referral code - https://ts.la/richard11209
Reply to
Rick C

And yet... there is such a thing as a totally bug-free program. Tom Pittman wrote a BASIC interpreter for the RCA COSMAC and it was bug-free. Then RCA bought it and compromised it (Tom's version was still OK).

Moral? Your customer may become your worst enemy.

Reply to
Robert Baer

Does anyone still use those machines for anything?

Reply to
bitrex

Even the Space Shuttle software probably (almost surely) had bugs; it just never had enough uptime to encounter them. They didn't think it would be economical to try to make it totally bug-free, which is why they made a four-way computer voting system using four (five actually) GPCs instead.

Reply to
bitrex
