OT: UK okays warrantless remote hacking of PCs

It only makes them *harder to find*, not generally harder to use: You just need one clever guy to package up the exploit nicely, and the next thing you know, millions of script kiddies start using it.

The fact that Internet browser exploits are relatively common -- and HTML isn't a "programming" language at all! (although I suppose Javascript is, just barely) -- suggests to me that good quality code is still much more a function of the individual writing it than of the particular language they use: I wouldn't inherently trust a guy writing nuclear reactor control code any more if he's using Java than if he's using C!

C++ allows one to have all those "niceties" (safe/smart pointers, arrays, typechecking, etc.) while still retaining a nice, clean interface -- it's just that, unlike "safe" languages, you're not being *forced* to use them if you don't want to. (In C you can have it as well, although the code starts to look awfully ugly.)
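
A minimal sketch of what I mean, using nothing beyond the standard library -- the checked and unchecked forms sit side by side, and nothing forces you to pick one:

    #include <array>
    #include <iostream>
    #include <memory>

    int main() {
        // Opting in to the "niceties": a bounds-checked array access
        // and an owning smart pointer that frees itself.
        std::array<int, 4> a{1, 2, 3, 4};
        try {
            std::cout << a.at(7) << '\n';   // throws std::out_of_range
        } catch (const std::out_of_range&) {
            std::cout << "caught out-of-range access\n";
        }

        auto p = std::make_unique<int>(42); // no delete needed, no leak
        std::cout << *p << '\n';

        // Opting out: the raw equivalents are still right there.
        int raw[4] = {1, 2, 3, 4};
        int* q = raw;
        (void)q;  // q[7] would compile fine and silently read past the array
        return 0;
    }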

Globally it might not, but in your own company it can... let your competitors have the chaff. :-)

I don't think that's true at all, but perhaps I'm just not experienced/jaded enough? Getting really good people is always a problem, but they are out there -- I think that most companies are just too willing to accept "average" as "good enough" -- one can point to companies where almost every single person is noticeably above average in their field, after all. (For a large-scale example of this, just look at HP and Tektronix back in the '60s/'70s -- you had to be a *particularly* sharp cookie to get into those places at that time.

...and HP's first calculator, the HP-35, had a couple of relatively minor bugs, whereas the current HP-35s has a whole boatload, despite the engineers working on it having *much* better programming/debugging tools than were available in 1972! Oh, and because I like to harp on this... every registered owner of the original HP-35 was sent a letter informing them of the bugs and offering to exchange the calculator free of charge if the owner desired. With the HP-35s, HP *doesn't even have a list of bugs posted on their web site*, much less offer replacements.)

It's certainly one way, just not the only way IMO.

OK, you make a lot of good points, and in general there's something to be said for using "safer" languages so long as using them doesn't hamper productivity (...when comparing outcomes wherein the same level of security was achieved in all cases). To me this is sort of like "pair programming" where you take two guys and they code along, one looking out for the bugs the other one might be creating: Sure, it certainly will reduce the number of bugs and may occasionally result in more efficient code too... but rather than taking two "average" guys making, e.g., $50k/year, I'd much rather have one "outstanding" guy and pay him $75k a year and expect the same results, and everyone wins.

---Joel

Reply to
Joel Koltner

I'd lay odds that most HTML exploits boil down to some form of buffer overflow or use-after-free error in the browser's code.
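
A use-after-free in miniature -- this C-style sketch is deliberately broken, just to show the shape of the fault; it compiles silently and invokes undefined behaviour at runtime:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void) {
        // The pointer keeps its old value after free(), and the memory
        // may since have been handed out to someone else entirely.
        char *name = (char *)malloc(16);
        strcpy(name, "alice");
        free(name);
        // ... later, far away from the free() ...
        printf("%s\n", name);   // undefined behaviour: reads freed memory
        return 0;
    }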

Well, it's a nice idea. But try telling the company owner that this or that project can't start because we've yet to find the above average staff required to man it. The owner is likely to think that we could be less selective, and get the project underway.

Now, of course, if I owned my own company, things would be different. But then, a false proposition implies any proposition.

Actually, it probably improves productivity. An awful lot of time can be spent tracking down faults caused by pointer misuse, array overflows and use of freed memory. I remember one instance (not perpetrated by me) where data in memory was being corrupted by an instruction that had itself been corrupted by a misused pointer. You can imagine how long that took to resolve.

The main down side of safe languages is one of performance. But that really isn't an issue these days, provided the software is designed to scale properly across multiple systems (it usually isn't but that's a story for another day).

Yes, but employers rarely appreciate the difference in value between the $50K employee and the $75K one. If you're the employer, fine. But most people trying to develop software have to manage with the resources they're given.

Sylvia.

Reply to
Sylvia Else

A good computer language should discourage tricky constructs, cleverness, coding speed, and coding efficiency. It should be plodding, tedious, constrained, and wordy. It should positively chase away people who want to play programming games but don't care about applications.

Cobol comes to mind.

Cobol was, of course, invented by Grace Hopper. Someone here recently observed that English majors may make better programmers than CS majors. There may be something going on here; the classic male geek empathy-free visual-spatial game-playing mentality may not be the best way to get good code.

John

Reply to
John Larkin

It would be anywhere from hard to impossible to write many modern applications in COBOL. The fact that COBOL compilers are not written in COBOL says something.

Sylvia.

Reply to
Sylvia Else

What happened to i/d space separation? Even in the early 1970's you could buy a minicomputer that had hardware-enforced page attributes; it was *impossible* for code to modify code, or to overflow a stack without a trap, or to execute data.

Even compilers for Intel crud should be able to position stuff so that stacks don't overflow into code.
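
On a system with page attributes, flipping execute permission is an explicit, auditable act rather than the default. A rough sketch using the POSIX calls (Linux-flavoured; MAP_ANONYMOUS is not universal):

    #include <stdio.h>
    #include <string.h>
    #include <sys/mman.h>
    #include <unistd.h>

    int main(void) {
        // Ask the MMU for one page that is writable but NOT executable;
        // jumping into it then faults instead of running injected bytes.
        size_t pagesz = (size_t)sysconf(_SC_PAGESIZE);
        void *page = mmap(NULL, pagesz, PROT_READ | PROT_WRITE,
                          MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        if (page == MAP_FAILED) { perror("mmap"); return 1; }

        memset(page, 0xC3, pagesz);   // fill with x86 RET opcodes
        // Calling into this page now would SIGSEGV: it lacks PROT_EXEC.

        // Making it executable requires an explicit request:
        if (mprotect(page, pagesz, PROT_READ | PROT_EXEC) != 0)
            perror("mprotect");
        munmap(page, pagesz);
        return 0;
    }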

John

Reply to
John Larkin
[snip]

My oldest son is multi-lingual... and writes good code. I've always associated good language skills with good software.

...Jim Thompson

--
| James E.Thompson, P.E.                           |    mens     |
| Analog Innovations, Inc.                         |     et      |
| Analog/Mixed-Signal ASIC's and Discrete Systems   |    manus    |
| Phoenix, Arizona  85048    Skype: Contacts Only  |             |
| Voice:(480)460-2350  Fax: Available upon request |  Brass Rat  |
| E-mail Icon at http://www.analog-innovations.com |    1962     |
             
 I love to cook with wine     Sometimes I even put it in the food
Reply to
Jim Thompson

And that's almost the reverse of the kind of people who program.

Programming is, after all, writing.

John

Reply to
John Larkin

It appears to have been an idea that Intel felt no need to adopt until the issue of viruses arose. Their processors were after all designed in an era when malicious code was essentially non-existent.

It's not so much stacks overflowing into code, as programs writing to data in a way that corrupts the return address on the stack. If the processor is willing to execute data, then the return address on the stack can be made to point to some data. If that data has been provided by an attacker - e.g. it might be part of what is meant to be a URL, or an image, or something - then the attack succeeds. A fairly common way in which this occurred was for an engineer to allocate a buffer on the stack that was "bound to be big enough", and then not check that it actually was big enough for the particular data being processed.
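
In miniature, the vulnerable pattern and its one-line repair might look like this (the function and buffer names are made up for illustration):

    #include <stdio.h>
    #include <string.h>

    // The classic pattern: a stack buffer that is "bound to be big
    // enough", filled without any length check.
    void handle_request(const char *input) {
        char url[64];           // fixed-size buffer on the stack
        strcpy(url, input);     // no bounds check: a longer input runs
                                // past url[] toward the return address
        printf("fetching %s\n", url);
    }

    // The repair is to refuse to copy more than the buffer holds:
    void handle_request_safely(const char *input) {
        char url[64];
        snprintf(url, sizeof url, "%s", input);  // truncates safely
        printf("fetching %s\n", url);
    }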

Of course, this is only one class of attack. SQL injection attacks (caused by naive processing of data used in SQL statements), directory traversal, and cross-site scripting attacks are unrelated both to it and to each other, and there are certainly yet more classes.
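
SQL injection in miniature, using SQLite's C API as a concrete stand-in (the users table and column names are hypothetical, and error handling is omitted for brevity):

    #include <sqlite3.h>
    #include <stdio.h>

    // Naive: attacker-supplied 'name' becomes part of the SQL text, so
    // a value like  x' OR '1'='1  rewrites the query itself.
    void find_user_unsafe(sqlite3 *db, const char *name) {
        char sql[256];
        snprintf(sql, sizeof sql,
                 "SELECT id FROM users WHERE name = '%s';", name);
        sqlite3_exec(db, sql, NULL, NULL, NULL);
    }

    // Parameterised: the value is bound separately from the statement,
    // so it can never be parsed as SQL, whatever it contains.
    void find_user_safe(sqlite3 *db, const char *name) {
        sqlite3_stmt *stmt = NULL;
        sqlite3_prepare_v2(db, "SELECT id FROM users WHERE name = ?;",
                           -1, &stmt, NULL);
        sqlite3_bind_text(stmt, 1, name, -1, SQLITE_TRANSIENT);
        while (sqlite3_step(stmt) == SQLITE_ROW) { /* use the row */ }
        sqlite3_finalize(stmt);
    }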

Sylvia.

Reply to
Sylvia Else

Over time, becoming a programmer has become far more accessible, and in the process the 'average' skill has plummeted. (Ph.D.s once represented the average, in days I personally remember, but that would distract from my point.) So the "average ability" target isn't static; it continues downward, lowering the floor you are shooting for.

When I started teaching at the largest University in my state as an adjunct prof, 15 years ago or so, I was shocked by one of the students coming to me during her 3rd undergrad year and telling me, "I'm not sure I should have decided on a CS degree. I had been thinking about accounting before, but decided programming would be 'less stress' and still good pay." She wanted my thoughts. I was shocked the first time it happened. But I stopped being shocked as it became a common occurrence. Back when I was studying, one didn't wonder if accounting was an option. You KNEW that wasn't even a remote possibility and it didn't even enter your mind.

Today, that has entirely been turned on its head and most anyone can consider programming as an 8-5 occupation they trivially leave at the business door when they head on home.

Jon

Reply to
Jon Kirwan

When I was doing my CS degree course, I wondered at the people doing joint honours accounting and computer science. The mind-sets required for the two disciplines seem so far removed from each other that I couldn't help feeling that anyone who chose that course must have made a mistake, one way or the other.

Sylvia.

Reply to
Sylvia Else

It is very hard to execute data on a Harvard architecture machine. Languages that avoid pointers but support rich structured data types can avoid most of the C pitfalls of overrunning array bounds and trampling return addresses with a pointer to malicious code. C/C++ on a flat unprotected memory space is about as bad as it gets in terms of making exploits really easy.

Although people don't like them these days segmented architecture memory with permissions is an excellent defence against illicit hostile code being put into a memory segment and then executed. The belated NOEXECUTE flag is a fine example of shutting the stable door on Wintel machines when there are still big holes in the walls.

This definition would appear to include most of the MS development team.

The situation is more subtle than that. Even top programmers can make fence-post errors and minor typos, and these have disastrous consequences in weakly typed languages where almost anything will compile without errors. With the mentality that anything is a pointer, or can be pointed at, tiny errors can be devastating. C's use of pointers for everything makes it extremely vulnerable even in the hands of excellent engineers.
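
The canonical fence-post error, which C compiles without a murmur -- a bounds-checked language would trap at runtime instead:

    #include <stdio.h>

    int main(void) {
        int totals[10] = {0};

        // Fence-post error: <= runs the loop 11 times, and totals[10]
        // writes one element past the end of the array, silently.
        for (int i = 0; i <= 10; i++)
            totals[i] = i * i;

        printf("%d\n", totals[9]);
        return 0;
    }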

There are better languages that could be used for safety critical applications. Modula-2 was one of the simplest robust ones and had a brief acceptance in some fields; Ada is a heavyweight alternative (although a bit OTT for my tastes).

Using languages and development tools that can cover for the deficiencies of human developers makes sense. CPU cycles are cheap and getting cheaper - having the compiler or static analysis tools work that bit harder to weed out bugs at compile time is well worthwhile. Who really wants to see BSODs or have Vista lock up on them yet again?
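
Two illustrative bugs of the sort I mean -- default compilation accepts both lines silently, but working the compiler harder with something like "g++ -Wall -Wextra -Werror" turns both into hard errors:

    #include <stdio.h>

    int main(void) {
        int count = 0;
        if (count = 10)                 // assignment, not comparison:
                                        // caught by -Wparentheses
            printf("count is %d\n");    // format string promises an int
                                        // that is never passed: -Wformat
        return 0;
    }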

Regards, Martin Brown

Reply to
Martin Brown

It is possible on the Intel architecture - OS/2 used that model. There is a small hit on changing privilege ring. Unfortunately the abysmal IBM marketing droids so completely cocked up selling OS/2 that MS Windows won the day. The 386, 486 and Pentium are quite capable of running with hardware-enforced page attributes (although there are gotchas). That said, legacy h/w issues mean that even some Unix systems on Intel CPUs are not immune to sophisticated exploits, e.g.

formatting link

That isn't how it is done. The basic trick is that a fixed-length buffer on the stack is overflowed in such a way as to deliberately overwrite the return address with a pointer to the code you want executed, along with the code itself. It isn't the compiler's fault that C allows and even encourages sloppy programming practices in the name of efficiency.

Certain MS data structures are prone to being exploited in this way because they contain a "how long am I" field. You only need to find a structure where, no matter what value is in the data, the system allocates the fixed length that these objects usually have, and you are in. The deadly ones are where the memory is allocated at a fixed length but the data is copied using the value in the data header (which is telling fibs), or until the next null byte. A variant of this method has been used to break the MS JPEG codec.
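
A simplified, hypothetical rendering of that pattern -- the structure and field names are invented, but the shape of the bug is the same:

    #include <stddef.h>
    #include <string.h>

    struct record {
        unsigned int len;    // "how long am I" -- supplied by the sender
        char data[1];        // payload follows
    };

    void parse_record(const struct record *r) {
        char buf[64];            // allocated at the "usual" fixed size
        // BUG: copies r->len bytes even when the header is telling fibs
        // and r->len is far larger than 64.
        memcpy(buf, r->data, r->len);
        (void)buf;
    }

    void parse_record_safely(const struct record *r) {
        char buf[64];
        size_t n = r->len;
        if (n > sizeof buf)      // distrust the header
            n = sizeof buf;
        memcpy(buf, r->data, n);
        (void)buf;
    }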

Belatedly a no-execute flag has been added for data segments, since it seems impossible to train the monkeys at Microsoft not to write insecure code.

Regards, Martin Brown

Reply to
Martin Brown

Not just novice/inexperienced programmers - and that is the *big* problem. Our software industry prefers shortest time to market over reliability - after all you can just issue updates and security fixes on a daily basis.

Not quite. It isn't verbosity that is a good thing. It is opaque cryptic terseness that is bad.

The language should be precise enough that you have to write exactly what you mean; if what you write is at all ambiguous, you should get a warning or an error message, not have the compiler take a wild guess at your intentions. What you don't want are obscure, unintuitive operator precedences or implicit coercions between data types.

The distinction between interpreting a bit pattern as an address, an integer or a floating point number needs considerable attention for low level OS work. Do you mean the value of this floating point number rounded to an integer (which might overflow if it is too big), or the binary representation of this floating point number put into a 32-bit integer?
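
Spelled out in C-style code, those two readings are entirely different operations (the cast truncates toward zero, so 1.5f as a value gives 1; its bit pattern is 0x3FC00000):

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        float f = 1.5f;

        // "The value, truncated to an integer":
        int32_t as_value = (int32_t)f;          // 1

        // "The binary representation put into a 32-bit integer":
        int32_t as_bits;
        memcpy(&as_bits, &f, sizeof as_bits);   // 0x3FC00000

        printf("value: %d   bits: 0x%08X\n",
               (int)as_value, (unsigned)as_bits);
        return 0;
    }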

I sometimes think it would have been better if addresses were some other length than 32 bits, so that the interchangeable use of "integers" and "pointers" would not have gained such wide acceptance in C.

You are barking up the wrong tree. Any of the strongly typed languages where you have to say exactly what you intend weed out a lot of the classic C faults at compile time - and the optional bounds checking on arrays can catch quite a lot of common fence post errors at runtime too.

You can even program defensively in C or C++ but so few people do. Good linguists can make very good programmers although they tend to graduate to requirements analysis fairly quickly.
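
Defensive C-style code in miniature -- validate every external assumption at the boundary instead of hoping callers behave (the function and names are hypothetical):

    #include <assert.h>
    #include <stdio.h>
    #include <string.h>

    int copy_field(char *dst, size_t dstsz, const char *src) {
        assert(dst != NULL && src != NULL);  // programmer errors fail fast
        if (dstsz == 0)
            return -1;
        size_t n = strlen(src);
        if (n >= dstsz)                      // data errors are reported,
            return -1;                       // never silently truncated
        memcpy(dst, src, n + 1);
        return 0;
    }

    int main(void) {
        char name[8];
        if (copy_field(name, sizeof name, "a-name-that-is-too-long") != 0)
            fprintf(stderr, "input rejected\n");
        return 0;
    }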

Regards, Martin Brown

Reply to
Martin Brown

Anyone can google "software failure" and see what a disaster current programming languages and culture are. This mess not only costs society maybe a trillion dollars a year, it is a major threat to national security.

It's obscene that a practice with such abysmal results - 50% failure rates for large projects, programs with thousands of bugs - should be graced with academic titles like "computer science" and "software engineering." No real science would be allowed such slop, climatology excepted.

John

Reply to
John Larkin

Makes sense to me. Who else could correctly program a complex financial system? A programmer with no skills or interest in accounting?

The best programmers are people who are experts in their application and also know how to program. The best two programmers I know (modestly excluding myself) were chemistry and physics majors who taught themselves to program.

Reminds me of an actual case: during WWII, it was noted that aerial gunners weren't very good. So at one training camp, just before the final live-shooting exam, they switched the classes and had the cooks and bakers class take the gunnery test. They did better than the graduates of the gunnery school.

John

Reply to
John Larkin

Not to want to answer for Sylvia, but in my comments prior to hers (not just those quoted in her reply), I was talking about students who were finding CS to be impossible going. There is no problem with someone being competent in multiple disciplines and enjoying it, to boot. On that score, I completely agree with you about people who care about knowing an application space cold and treating programming, in part, as merely an enabling or expanding skill.

Often, that is the case. The overlap also helps ensure that things don't "fall through the cracks" between various project people's specialties. Sometimes, people don't want to extend themselves much outside their specialties, and when it comes down to it, the programmer is often the last person putting all the sensor physics, mathematics, numerical methods, understanding of electronic systems behaviors, etc., together. And if they aren't the type of person who enjoys getting into multiple disciplines, they may actually retreat to blaming others for not conveying sufficient details with sufficient hand-holding, rather than taking some responsibility for making sure they have what they need.

I'm a physics major. So who am I to disagree? ;)

I don't know exactly how that applies to your earlier point, but it sounds like a good story for something.

Jon

Reply to
Jon Kirwan

Hi Sylvia,

Yep, I'd agree.

Well, you do also have *some* choice in where you work... particularly if you're one of those above-average sorts... although I realize that the quality of employer you end up with always involves a certain amount of chance, and that changing employers does become more difficult as one gets older and has a spouse, kids, etc.

(I have a suspicion that the average employee quality at any given company is relatively uncorrelated with the average employee quality of the entire workforce... companies with well-above-average employees tend to be quite discriminating in who they hire, whereas those with below-average employees often exert effort to keep the "bright guys" down as well...)

Yes, also agree... most software today is waiting on hard drives or the Internet and not the CPU so much.

Ding ding ding! Yep, very true.

---Joel

Reply to
Joel Koltner

I think of it more as just a historical artifact... when Intel was designing the 8088, transistor cost was such that the additional logic needed for I/D separation was non-negligible, and as the 8088 morphed into the 286/386/486/586, no one bothered to take a look at some of the core design criteria... until viruses came back to bite them, as you point out.

---Joel

Reply to
Joel Koltner

Yes.

I don't personally see much correlation between what degrees someone holds and how good a programmer they are. Using the "linguistics" tie that some people here are mentioning, I think of it as the same low correlation between having an English degree and successfully writing books... or having a Music degree and becoming a superstar.

I'd guesstimate that at least 2/3 if not 3/4 of those entering college today are looking for a degree that will provide a good paying job with little stress. This is what you get in a society that now says everyone is "supposed" to have a college degree, even if their idea of a fun job is something more in line with being a millwright or a longshoreman.

---Joel

Reply to
Joel Koltner

I agree, Harvard architecture machines do provide a legitimate benefit there. I'm perhaps a bit short-sighted on this, as I grew up on assembly/C/C++ on von Neumann architectures.

Haha... well, as a percentage of total developers at MS, I'd wager you're correct -- even though I have absolutely zero concrete data in the matter. :-)

I haven't used Ada, although I have used the somewhat similar VHDL, and I'd have to say... it's really rather annoying! When your compiler can tell you quite specifically what syntactical errors you're making ("...this should have been 'begin function' and not 'begin architecture'..." or somesuch), it's a sign that your language is overly wordy.

---Joel

Reply to
Joel Koltner
