OT: UK okays warrantless remote hacking of PCs

I would have thought they have, simply because otherwise they'd have created major obstacles to upgrades. Many MS-DOS programs will run happily under Windows XP - even those that think they're directly controlling a sound card.

Sylvia.

Reply to
Sylvia Else

Yep; I believe that was covered in the remainder of the sentence:

I can't really think of anyone with pockets deep enough to really worry Microsoft. The DoJ doesn't seem to have troubled them much, and they only seem to worry about the EU insofar as they'll catch flak from the rest of the Fortune 500 if they're seen as triggering further regulation.

More significantly, I'm not so sure that you can argue that any specific fault in C/C++ software amounts to "negligence" simply for using C/C++ either.

Sure, a higher-level language might have caught a particular bug, but it might also have introduced a few of its own. Higher-level languages often have quite complex semantics, while C's are rather simple (although sometimes counter-intuitive if you don't have a background in digital logic).

PHP doesn't have buffer overruns, but it's infamous for injection attacks, XSS, CSRF etc. In the same way that leaving bounds checking to the programmer is inviting insufficient bounds checking, requiring programmers to perform code (SQL, HTML, XML, /bin/sh, etc) generation using generic string manipulation operations (rather than operating upon a tree via e.g. DOM) is inviting injection attacks.
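
To make the injection trap concrete, here's a minimal C sketch against SQLite's C API (the users table and its column are invented purely for illustration):

#include <stdio.h>
#include <sqlite3.h>

/* UNSAFE: builds SQL by pasting strings together. If name is
   "x' OR '1'='1", the attacker rewrites the WHERE clause. */
int find_user_unsafe(sqlite3 *db, const char *name)
{
    char sql[256];
    snprintf(sql, sizeof sql,
             "SELECT id FROM users WHERE name = '%s'", name);
    return sqlite3_exec(db, sql, NULL, NULL, NULL);
}

/* SAFER: a prepared statement keeps the data outside the SQL
   grammar, much as a DOM keeps data outside the HTML markup. */
int find_user_safe(sqlite3 *db, const char *name)
{
    sqlite3_stmt *stmt;
    if (sqlite3_prepare_v2(db, "SELECT id FROM users WHERE name = ?",
                           -1, &stmt, NULL) != SQLITE_OK)
        return SQLITE_ERROR;
    sqlite3_bind_text(stmt, 1, name, -1, SQLITE_TRANSIENT);
    while (sqlite3_step(stmt) == SQLITE_ROW)
        printf("id = %d\n", sqlite3_column_int(stmt, 0));
    return sqlite3_finalize(stmt);
}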

And most dynamic/interpreted languages will only perform the most rudimentary static analysis (i.e. parsing), so you may miss quite blatant bugs if you fail to test a specific branch. Code which handles errors (particularly obscure or awkward ones) is where testing is quite often lacking.

OTOH, compiled languages will generally catch code which is never going to work, particularly if you have a rich type system like Haskell. This is a long way from formal verification, but it's still better than you get from most dynamic languages.
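
A trivial C illustration of the difference (the misspelling is deliberate):

#include <stdio.h>

static double average(double sum, int count)
{
    return sum / count;
}

int main(void)
{
    printf("%f\n", average(10.0, 4));             /* fine */

    /* Uncomment either line and the whole program fails to BUILD,
       even though the code need never execute:

       printf("%f\n", average(10.0));      // too few arguments
       printf("%f\n", avarage(10.0, 4));   // undeclared identifier

       A typical dynamic language accepts both and only fails if
       and when the offending line is actually reached. */
    return 0;
}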

Then there's the Law Of Unintended Consequences. Reducing the minimum skill level has a tendency to reduce the average skill level as well. Although I quite like Python as a language, the quality of many of the popular libraries (even some of those included in the core package) is pretty dire compared to similar libraries written in C, C++, Fortran, and the like.

Actually, straying too far from the herd can increase the risk, not just the risk of liability, but the risk of creating the defect in the first place. If you're using a popular language, both its pitfalls and their solutions become known more widely and more quickly, it's easier to obtain skilled workers, easier to obtain training, compiler bugs get discovered faster, etc.

And I probably still write most of my code in C. C++ if the OO abstraction is likely to be fundamental to the design, but otherwise it's not worth the trouble. Lisp or Python if I need a dynamic language (I prefer Lisp as a language, but Python is more mainstream right now). Other languages if the language has a feature which will significantly simplify the task at hand, doesn't have major obstacles, and a minority language isn't a problem (Haskell often fills this role).

I suspect that Java is probably the most likely candidate to take over from C/C++ as the new application language (the role COBOL filled before the industry was rebooted by PCs and the rump of the old industry, mainframes, became a niche).

It doesn't let you do the really dangerous stuff (e.g. unchecked pointer manipulations), it isn't that far from C++ (but with some of the biggest pitfalls filled over), doesn't force too much BS onto you, isn't ridiculously slow, and seems to still be gaining substantial market share.

The main obstacle to Java is that it doesn't facilitate (and could even harm) Microsoft's dominance, which is about the only factor which ever matters to Microsoft. Although the same is probably true for most large companies, most companies don't get to operate like a modern-day East India Company.

So in the meantime, Microsoft is trying to make the world use C#, while those who haven't been assimilated are either sticking with C/C++, migrating to Java, or are Web 2.0 developers who will continue to produce bug-ridden code until the economy recovers and their career takes an upward turn into burger-flipping.

Reply to
Nobody

One program working doesn't make for backward compatibility. Those that directly control hardware will not work.

Reply to
krw

Windows doesn't allow programs to directly control hardware; only device drivers can do that.

If the program tries to execute IN/OUT instructions, the CPU will trap them and hand control to the OS. The software doesn't even know that it isn't getting direct hardware access.

Likewise for running DOS games in a console window. The code writes to segment 0xA000 on the assumption that it's writing to VGA memory, when it's actually just writing to a block of RAM Windows has set up.
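
A 16-bit DOS-era sketch makes the point (Borland-style C with far pointers; under real DOS it touches the hardware, while under an NT-family console the very same instructions are trapped and fed to a virtual VGA):

#include <dos.h>

int main(void)
{
    unsigned char far *vga = (unsigned char far *) MK_FP(0xA000, 0);
    union REGS r;
    int i;

    r.x.ax = 0x0013;          /* INT 10h: mode 13h, 320x200x256 */
    int86(0x10, &r, &r);

    outportb(0x3C8, 0);       /* select VGA DAC entry 0... */
    outportb(0x3C9, 63);      /* ...and set it to bright red (R,G,B) */
    outportb(0x3C9, 0);
    outportb(0x3C9, 0);

    for (i = 0; i < 320; i++)
        vga[100 * 320 + i] = 0;   /* plot a row in colour 0 */

    return 0;
}

Under NT, the I/O permission mechanism traps the OUTs and the far-pointer writes land in ordinary RAM that the OS has mapped at that address; the program can't tell the difference.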

I suggest reading up on the privilege mechanisms provided by modern CPUs (286 and onward for x86) before making assertions which are both readily falsifiable (write code which accesses the "hardware" directly, then run it) and self-contradictory (if the program really is accessing the hardware directly, the OS simply doesn't come into the picture; compatibility is a hardware issue).

As for the original contention: if you want DOS compatibility beyond the level which Windows provides, then you're going to need a copy of DOS and an old PC. Because if you try to install that copy of DOS on a current system, it ain't going to happen. The Mobo/BIOS vendors don't give a damn whether DOS 6.x runs on their hardware; if the BIOS provides enough support to let you install Windows, it's done its job.

I'm not sure that Microsoft is that much worse than any other developer when it comes to compatibility. Sometimes they keep stuff which should have been drowned at birth because they can't afford to break compatibility, other times they break compatibility for no benefit to the end user simply to drive upgrade revenues.

Reply to
Nobody

Duh, ya think?

You're clearly a kid.

Reply to
krw

Programs that used sound cards were directly controlling hardware. Some of them still work under XP because they are now running in a limited virtual hardware environment.

Sylvia.

Reply to
Sylvia Else

PHP is a backwards step anyway, having abandoned strong typing, but even there it is easier to avoid vulnerabilities through coding standards (requiring the use of an established library of methods for such things) than it is to avoid buffer overflows and the like in C/C++. Of course, coding standards have to be enforced with code reviews, but at least the latter have a chance of finding breaches. Spotting a buffer overflow in someone else's code requires considerable work - particularly if the problem is that they've included bounds checking but got it wrong.
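
A sketch of that "bounds checking included but wrong" case, in C (NAME_MAX and the function names are invented for illustration):

#include <string.h>

#define NAME_MAX 16   /* hypothetical field width */

/* Looks bounds-checked, but is wrong twice: "<=" allows one
   iteration too many, and no byte is reserved for the NUL, so
   writes can land one or two cells past the end of buf. A
   reviewer has to spot this by counting, not by pattern. */
void copy_name_buggy(char *dst, const char *src)
{
    char buf[NAME_MAX];
    size_t i;
    for (i = 0; i <= NAME_MAX && src[i] != '\0'; i++)   /* BUG: <= */
        buf[i] = src[i];
    buf[i] = '\0';                            /* BUG: may be past the end */
    strcpy(dst, buf);
}

/* The fix: strictly less than, minus one for the terminator. */
void copy_name_fixed(char *dst, const char *src)
{
    char buf[NAME_MAX];
    size_t i;
    for (i = 0; i < NAME_MAX - 1 && src[i] != '\0'; i++)
        buf[i] = src[i];
    buf[i] = '\0';
    strcpy(dst, buf);
}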

I would like to think that you're right, but I see little evidence of it.

I don't see an economic recovery having that effect. If anything it's more likely that existing burger flippers will migrate into software development, because managers don't realise that these people have negative value.

Sylvia.

Reply to
Sylvia Else

Where to begin? An exploitable stack overflow is a typical architectural flaw common to all microprocessors designed in that era. Look up the Morris worm if you do not believe me. Thanks to market dominance and the need to retain reasonable code compatibility, the flaw cannot be cleanly removed. Thankfully, with billions of transistors to spend, enforcing stack bounds in hardware becomes feasible again.
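
The flaw in miniature, as a deliberately unsafe C sketch (the hardware fix alluded to above would be something like a shadow stack, which keeps a protected second copy of each return address):

#include <stdio.h>
#include <string.h>

/* The return address shares the stack with buf, sitting just above
   it. An unchecked copy of untrusted input can run past buf and
   replace that address, so the function "returns" wherever the
   attacker chose. The Morris worm exploited exactly this shape of
   bug (gets() into a fixed buffer in fingerd). */
void vulnerable(const char *input)
{
    char buf[64];
    strcpy(buf, input);       /* no bounds check: the classic smash */
    printf("%s\n", buf);
}

int main(int argc, char **argv)
{
    if (argc > 1)
        vulnerable(argv[1]);  /* an argument over 63 bytes corrupts the frame */
    return 0;
}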

Reply to
JosephKK

The C language was originally written for PDP-8s and PDP-11s, as was the original core of Unix. Both were 16-bit word-oriented machines, were they not?

Reply to
JosephKK

The length of a byte has never been fixed at 8 bits. There is a reason the term "octet" is used.

Reply to
krw

Mathematicians generally make very good programmers. Provided that they are given a good specification they will deliver something that matches it. The tightest specification languages like VDM are closer to pure mathematics than they are to conventional computer languages. Formal specifications done right can be amenable to proof of correctness. Mathematicians are also the ones most likely to look for counter examples to show that a requirements spec is internally inconsistent and so impossible to deliver.
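
For a flavour of it, here is a VDM-style pre/postcondition pair rendered as run-time assertions in C (isqrt is just an illustrative example, not taken from the VDM literature):

#include <assert.h>

/* The spec, written VDM-fashion (informally):
     isqrt(n : int) r : int
       pre   n >= 0
       post  r*r <= n and (r+1)*(r+1) > n
   The asserts merely *test* the contract on each call; a formal
   method would *prove* it holds for every n. (Assumes n is small
   enough that (r+1)*(r+1) does not overflow an int.) */
int isqrt(int n)
{
    int r = 0;
    assert(n >= 0);                                   /* precondition  */
    while ((r + 1) * (r + 1) <= n)
        r++;
    assert(r * r <= n && (r + 1) * (r + 1) > n);      /* postcondition */
    return r;
}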

Computers have a very literal interpretation of their instructions: apart from hardware errors, they do exactly what their instructions tell them to, whether it makes sense or not. I reckon the main purpose of computer science teaching should be to educate students about important algorithms and the range of languages available.

The problem stems from woolly thinking and ambiguous natural language specifications that are riddled with hidden defects being put into production without a proper development process. People do not inspect software adequately. The cost of finding and fixing defects rises exponentially with time into a project.

Regards, Martin Brown

Reply to
Martin Brown

One of the biggest problems with management schools is that they presume all employees are equivalent and interchangeable. Even within a single special area of expertise, however broad or narrow, however mundane or arcane, people are just not that interchangeable. It is a foolish bean-counter approach, and no more.

Reply to
JosephKK

Sorry, what is VDM? I have no context for that.

Perhaps, though I think it should be more about the incredible literalness of computer languages, and the differences between totally formal languages and human spoken languages. The range of languages is almost a red herring. And yes, the numerous fundamental data structures and associated algorithms should be taught as tools, not as ends in themselves.

All well known. I propose that better training about the differences between typical spoken languages (there are over 30 that still do not even have scripts) and formal languages with exact grammars should be part of the mix. Plus training in comparative linguistics for the programming-oriented students.

Reply to
JosephKK

If that read "hadn't been standardized yet" I might still buy off on that. Just the same, I have worked with machines with word widths of 12, 15, 16, 18, 30, and 32 bits that did not really have "bytes".

Reply to
JosephKK

I suspect that is actually a mythical conundrum; please consider CS/programming as a linguistic study. See my other posts.

Reply to
JosephKK

It has never been "standardized", though many think it has. The length of an "octet", which is the term used where it matters, is always 8 bits.

There is no (and never has been a) requirement that a machine have bytes, just as there is no law that says they must be 8 bits. Word-oriented machines do make text handling rather difficult though.
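
C itself agrees: it defines a "byte" as the storage of a char and guarantees only that it is at least 8 bits wide. A quick standard-C check of the local byte width:

#include <limits.h>
#include <stdio.h>

/* The standard promises only CHAR_BIT >= 8; 8 is common, not law.
   (C compilers for 36-bit word machines have used 9-bit bytes,
   for example.) Portable code asks rather than assumes. */
int main(void)
{
    printf("bits per byte here: %d\n", CHAR_BIT);
    printf("bits per int here:  %d\n", (int)(sizeof(int) * CHAR_BIT));
    return 0;
}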

Reply to
krw

Certainly that it was useless done the way it was being done, plus the message that they needed to change the training to something better.

Reply to
JosephKK

In the area where I currently work there are excellent engineering types who have been chefs, bakers, groundskeepers, and more. Some have even picked fruit, picked cotton, done auto repair, and done other things on their way through life. I see a correlation between having a wide, even wild, variety of experience and the nurturing of the best in any endeavor.

Reply to
JosephKK

You only pretend so; both current products and history prove you wrong.

Reply to
JosephKK

Try Google.

The Vienna Development Method came out of IBM's research lab in Vienna. There is an alternative called Z.

formatting link
There is a snippet of the modern dialect online in the following review article
formatting link

Z or VDM specifications are capable of formal proof of correctness. About a dozen bugs were found in the Intel 8087 numeric coprocessor when Cyrix sponsored a full formal specification of it in order to make their own pin-compatible chip.

I don't think that is the problem. You always have the situation where people say one thing and mean another: "We are installing this new computer/internet phone system to improve our customer service" == "we will fire all the people who can do the job here and replace them with cheap drudges halfway round the world".

OK, talking about tools, I can give you a *really* good example of what is fundamentally wrong with some institutions at the moment. They are churning out journeyman coders able to program small to medium-sized projects with Microsoft tools. If you look at the student edition or even the *professional* edition of Visual Studio, you do not get any of the code metrics, QC, profiling or testing tools that are important on large projects. Using them correctly needs to be taught at university; bad habits are hard to break. Only the vastly overpriced Developer and Team editions come with the advanced toolset.

formatting link

Look under Advanced Tools to see what I mean. Now I don't expect something for nothing. Educational versions of the MS compilers should have the optimiser suitably hobbled so that it is OK for teaching but not fast enough for production code. But it should include the entire toolset that a professional developer should know how to use.

It is too easy for modest-sized university projects to be hacked out of the solid without the student ever learning anything useful about the dynamics of larger, industrial-sized projects.

You still have the problem that very often the consumers or users of the new software (or at least some of the participants in the specification and design phase) are not entirely stupid, and know that if it works as intended they could be out of a job. That conflict of interest doesn't arise quite so often in the electronics arena.

Regards, Martin Brown

Sorry if this gets posted more than once; Google is playing up.

Reply to
Martin Brown
