OT: Americans are yellow-bellied pansies...

SONY should just hire an assassin and take that little bastard out. Problem solved.

In my view, having someone on your ass who has nearly unlimited resources at their disposal, and the willingness to see their objectives through, makes your life a living hell.

Reply to
mpm

They could put it out on Wikileaks for instance.

Which comes back to Jim's original point.

That decision will probably come back to haunt us. A South Korean nuclear plant is the latest thing to be targeted with cyber ransom demands.

The problem is that IT security isn't taken seriously in many large companies. Too many corporates run AV software that is way below state of the art and their employees tend to just click on stuff that gets through the firewall and malware scans without thinking.

One thing that the Sony debacle has shown is how quick the USA is to blame the North Koreans with only scant preliminary evidence. It is quite likely that they were behind it, but it is by no means certain. Sony BMG unleashed a dangerous rootkit into the world on a whole bunch of their "copy protected" CDs in the past.


There are plenty of grey hat hackers and security experts that think that Sony well deserve their comeuppance for that incredibly crass blunder. They didn't take cyber security seriously and got burned.

I suspect that the people behind this attack have chosen to do it in such a way as to implicate the North Koreans, but it may well be other actors entirely, using an obvious misdirection to evade detection.

If the security analysts waste their time trying to prove that it was North Korea, then by the time they start looking in the right direction with an open mind it will be too late and the logs will be overwritten.

--
Regards, 
Martin Brown
Reply to
Martin Brown

Martin, you make some really good points.

I would just add that, in my experience, "I.T. Professionals" are often, at best, pseudo-engineers who do not have a full appreciation for many of the core security functions they are entrusted to administer.

In places I have worked in the past, I've seen "I.T. Professionals" roll-out platforms and programs without a full understanding of their unintended downstream effects, including at times, crippling department or company productivity.

I also realize that the job is nearly impossible at a bit level - trying to watch every new piece of code that comes in or out of the network.

I've pretty much always had low regard for all but the most experienced I.T. person. To me, it's the janitorial work of real engineering. (My apologies to those I.T. folks in the top 5-10% of their field.) All these recent network hacks simply reinforce that position, since I would expect about the same level of network "security" if it were indeed run by the same guy who mops the floors.

By the way: Same opinion whether we're talking large company or small.

Reply to
mpm

Sony may have had an insider or chosen crappy passwords, but there's a decent chance that either Microsoft's or our government's Windows, BIOS, & encryption back doors caused it. Even if not, it's certainly available to anyone willing.

It's simply pathetic that a computer would EVER download and run an executable without your permission--viruses shouldn't be possible.

Our own government's spying on us compromises our safety.

Cheers, James Arthur

Reply to
dagmargoodboat

Does any Microsoft product need a back door?

Heck, Microsoft allowed viruses to be in JPEG files. And in emails that weren't even opened.

The "modern" OS is a horror. We need something new.

--

John Larkin         Highland Technology, Inc 
picosecond timing   laser drivers and controllers 

jlarkin att highlandtechnology dott com 
http://www.highlandtechnology.com
Reply to
John Larkin

Bunch of DUMB clucks...

Reply to
Robert Baer

Take three (no more than three) good programmers, give them a spec to shoot for, and have them write the code 100% from scratch. In assembly (macros allowed). NO communication from the outside world allowed (i.e. no micro-management, no added "suggestions" = product drift). Result will kick ass in speed and size.

It takes little effort to prevent "buffer overflow", and some effort (but not a lot) to prevent code from writing into data areas.
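To illustrate how little effort it really takes: here's a minimal sketch (the function name `bounded_copy` is invented for this example) of a copy routine that checks its destination size instead of trusting the caller, which rules out the classic strcpy()-style overflow entirely:

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Copy src into dst (capacity dstsize), always NUL-terminating.
   Returns 0 on success, -1 if src would not fit. */
int bounded_copy(char *dst, size_t dstsize, const char *src)
{
    size_t n = strlen(src);
    if (dstsize == 0 || n >= dstsize)
        return -1;               /* would overflow: refuse, don't write */
    memcpy(dst, src, n + 1);     /* n + 1 copies the terminator too */
    return 0;
}
```

The whole trick is that the destination's capacity travels with the pointer, so the check is always possible.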

Better yet, use a DIFFERENT CPU: 1) 100 percent LINEAR addressing, 2) separate code and data memories. TransMeta had a good idea.

Reply to
Robert Baer

It is the sheer complexity of modern software that makes it so easy for systems to be subverted by those with malevolent intent. It is a lot harder on a Harvard architecture to confuse code and data (but you still need a gateway to load programs into execution memory).

The noexecute flag helps a bit but people have found ways around it.

Too much of the corporate web relies on complex, bloated and insecure technologies like Flash, JavaScript and their ilk to be realistically secure. If you want a truly secure system it has to be very tightly organised, with only a tiny core executing with privileges.

Much of the problems stem from buffer overrun attacks which the present generation of development tools and developers are lax about checking.

Not really relevant. The problem is that to make 3D games run faster there are too many components running at too high a privilege level. This, combined with the fact that major players do not properly digitally sign their own driver distributions, means that the average punter has no idea, when a popup appears saying "this site needs Flash version x.yz, upgrade now?", that they are downloading some evil Trojan. They are already conditioned to download incorrectly signed drivers by the manufacturers' sloppy distribution of unsigned updates.

Most systems get compromised these days by user error of some sort.

Spearphishing is fantastically more effective than generic attacks.

Reply to
Martin Brown

That is a rather naive way of describing it. JPEG files contain elements that are supposed to declare their true length. Someone found that, by fibbing about the length of a particular object, they could overwrite the return address on the stack and, relying on the position of that buffer relative to the stack, transfer execution to arbitrary code.
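A hedged sketch of that class of bug (the chunk format and names here are invented for illustration, not actual JPEG structures): a header declares its own payload length, and the fix is simply to distrust that declaration twice, once against the output buffer and once against the bytes actually present:

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>
#include <string.h>

#define CHUNK_MAX 64

/* Parse a chunk whose first two bytes declare the payload length.
   Returns 0 and copies the payload if the declared length is sane;
   returns -1 if the header lies about its size. */
int read_chunk(const uint8_t *in, size_t in_len, uint8_t out[CHUNK_MAX])
{
    if (in_len < 2)
        return -1;
    size_t declared = ((size_t)in[0] << 8) | in[1];  /* length field */
    /* The classic exploit works by omitting these two checks: */
    if (declared > CHUNK_MAX)          /* would overrun out[] */
        return -1;
    if (declared > in_len - 2)         /* claims bytes that aren't there */
        return -1;
    memcpy(out, in + 2, declared);
    return 0;
}
```

The vulnerable version simply trusts `declared` and memcpy's past the end of a stack buffer, which is where the return-address overwrite comes from.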

Mickeysoft in their infinite wisdom decided that the rendering engine should parse an email when you put the cursor on it. The result was that you didn't need to open the email for a malign JPEG to execute.

There is an arms race going on between the developers and the hackers, and the latter always have the advantage, since the former are working to "ship it and be damned" deadlines. Some mistakes are inevitable.

Trouble is people like fast 3D gaming and the compromise made has been insecure OSs. If you want a truly secure OS then it has to be paranoid and assume that everything is out to get it. There is a price to pay, and that is a slower response (having said that, modern machines are now so fast that this would be realistic and might become necessary).

Maintaining backwards compatibility also makes it very hard for people to introduce new robust operating systems to the mass market.

Reply to
Martin Brown

I've long advocated a multi-CPU architecture with the OS running on a master CPU with full hardware-enforced task privileges, preferably one CPU per task. Sure, use linear addressing with hard I/D/Stack space separation, no virtual memory.

A multicore ARM would do, with the proper protections.

Windows and Linux are un-repairable hairballs. C, as used today, is too hazardous. Sony-like events will become more common and more expensive. Whole power grids and whole governments and huge refineries will come down next.

Reply to
John Larkin

Unchecked buffer hazards are indeed naive. And ludicrously common. That's what happens when you mix I and D and stack spaces carelessly, without hardware protection. C is practically designed to create buffer hazards.

Sure, programmers need to be careful. And hardware architectures should make it impossible to execute data. Intel and Microsoft were the naive ones, out of the mainstream of OS design from the very beginning.

Reply to
John Larkin

PDP11 and VAX had proper memory management hardware. They would trap if anyone confused I or D or stack space, or addressed outside declared regions.

Reply to
John Larkin

Microsoft recently put out an emergency patch for a Windows crack that has been there since the '80s. That's not due to the complexity of modern software. Nor was the SSL nonsense put in there courtesy of Peeping Sam.

It's not that complicated--just don't download and run other people's code!

Those steps are only needed if you're going to let someone else run code on your machine, a dangerous proposition.

None of that really matters if a police state can intercept your encrypted traffic and decipher it, thanks to hooks inserted by your own police state.

Viewing a Flash animation, video, or playing audio shouldn't allow someone to run code on my machine. Ever.

Cheers, James Arthur

Reply to
dagmargoodboat

Like Microsoft's!

Windows and IE and Word have had hundreds, or thousands, of buffer overflow bugs. A Word doc or a web page can exploit them.

Why don't they get it? Don't mix code and data.

Reply to
John Larkin

+1

That would be nice, but requires new hardware. The problems vanish if they simply bounds-check at runtime, so why the heck don't they? That wouldn't break anything (it's fully backwards-compatible) and doesn't need any new hardware.

Some years ago I nearly bought a software engineering book by one of the Microserfs. It was full of sound debugging stuff, including #defs that added runtime bounds-checking on C programs.

What jerks.
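In the spirit of those #defs (this macro is my own sketch, not the book's): an indexing wrapper that asserts the bound at runtime in checked builds and compiles away to a plain access otherwise, i.e. fully backwards-compatible, no new hardware needed:

```c
#include <assert.h>
#include <stddef.h>

/* AT(arr, n, i): element i of an n-element array, with an optional
   runtime bounds check compiled in via -DBOUNDS_CHECK. */
#ifdef BOUNDS_CHECK
#define AT(arr, n, i) (assert((size_t)(i) < (size_t)(n)), (arr)[i])
#else
#define AT(arr, n, i) ((arr)[i])
#endif

/* Sum the first 'count' elements of v[n]; with checks enabled, a bad
   'count' trips the assert instead of silently reading past the end. */
int sum_first(const int *v, size_t n, size_t count)
{
    int s = 0;
    for (size_t i = 0; i < count; i++)
        s += AT(v, n, i);
    return s;
}
```

The cost is one compare per access, which on modern machines is usually lost in the noise.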

Cheers, James Arthur

Reply to
dagmargoodboat

That's not just unlikely, it's extremely unlikely. First of all, you just cast this O/S in the role of a public good, so just getting all the people who matter to agree on it will take decades. The very vectors for Heartbleed were written very much in the spirit of "software as a public good", and the lack of price as a signal to sail by, IMO, led to this problem.

You can make either Windows or Linux relatively "safe". You just have to turn off all communication to the outside world. Perhaps not *all*, but constrain it severely.

I say that; something like BeOS may be (or may not be) more secure. But meh; keep up your malware tools and keep good backups.

Events like the Sony event are the modern equivalent of seafaring piracy. The answer is ropes over yardarms, or heads on pikes outside the city gates.

--
Les Cargill
Reply to
Les Cargill

It's also ridiculously easy to *not do that*. I've been *not doing that* for decades. The number of entry points in the 'C' standard library that put you at risk is fairly small. Even those can be managed with a short list of "don't" rules.

Do you run outside wires direct & unbuffered to a PIO pin? Probably not. Do you leave things unterminated? Probably not.

You could, but you don't. The same is true of software. It ain't rocket surgery. It's just discipline.
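The handful of risky entry points (gets, strcpy, sprintf, strcat, scanf with a bare %s) all have bounded counterparts, so each "don't" rule is mostly a mechanical substitution. A small sketch, with an invented helper name:

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* Build "name-channel" into out without ever writing past outsz.
   snprintf truncates on overflow instead of overrunning the buffer. */
void format_label(char *out, size_t outsz, const char *name, int channel)
{
    /* sprintf(out, "%s-%d", name, channel);   <-- the risky spelling */
    snprintf(out, outsz, "%s-%d", name, channel);
}
```

Same call shape, one extra argument, and the overflow becomes a visible truncation you can test for instead of a stack smash.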

Not so much. One man's data is another man's program.

While that's true, it's also true that they were the right solution for mass-market computing.

--
Les Cargill
Reply to
Les Cargill

Unless I miss my guess, a Harvard architecture will never meet modern computing requirements for this very reason.

And that's simple malpractice.

And that's because they have no idea. This being said, unsigned drivers are merely a *potential* weakness and the vast majority cause no problems.

--
Les Cargill
Reply to
Les Cargill

Yeh, it will always be a problem. At some point, when you compile or download a program, you have to change bytes from being data to being code.

And signing just moves the trust to someone else. The latest FTDI drivers that broke "fake" hardware were signed, and there is always the risk that someone steals the keys so they can sign whatever they like.

A lot of viruses aren't really viruses, but people answering yes to stuff they haven't read or didn't understand.

-Lasse

Reply to
Lasse Langwadt Christensen

Sony is a Japanese company.

--
Anyone wanting to run for any political office in the US should have to
have a DD214, and an honorable discharge.
Reply to
Michael A. Terrell
