2066-pin processors

The term "memory management" almost invariably means allocating and freeing dynamic memory. So for C, that means calls to "malloc" and "free", and for C++ it means "new", "delete", or (better) using smart pointers and containers that handle the memory management automatically. For other languages, memory management is usually handled by higher level types (lists, dictionaries, and other containers) with garbage collection.
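To make the C/C++ contrast concrete, here is a minimal sketch (function names are invented for illustration): the first version must pair every allocation with a matching free on every path, while the second lets a standard container do the bookkeeping automatically.

```cpp
#include <vector>

// Manual management: every new[] needs a matching delete[] on every path.
int sum_manual(int n) {
    int *buf = new int[n];
    for (int i = 0; i < n; ++i) buf[i] = i;
    int total = 0;
    for (int i = 0; i < n; ++i) total += buf[i];
    delete[] buf;   // easy to forget, or to skip on an early return or exception
    return total;
}

// Container-based: std::vector releases its storage automatically.
int sum_automatic(int n) {
    std::vector<int> buf(n);
    for (int i = 0; i < n; ++i) buf[i] = i;
    int total = 0;
    for (int v : buf) total += v;
    return total;   // buf's destructor frees the memory here, on every path
}
```

The second form is why the post recommends containers and smart pointers: the cleanup is tied to scope, so whole classes of leak and double-free bugs simply cannot be written.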

As a proportion of security flaws in modern software, buffer overflows are a minor concern. (That does not mean they are of /no/ concern, of course - I see no excuse for buffer overflows of any sort.)

And as I have explained, C does /not/ mix up memory sections, nor would greater separation of memory sections give the slightest benefit against buffer overflow flaws, and modern systems usually already have the kind of memory access protection that you seem to think will stop buffer overflow problems.

And I would like to see pigs flying.

No, it does not "anger" me. Your inability to comprehend reality frustrates me. Your continual misinterpretation and misrepresentation of what I write irritates me.

So in yet another attempt to set you straight, let me be clear here - I want a reduction in the flaws in software and other systems, and an improvement in their quality. That applies across the board - security flaws are just bugs that get exploited. Work on reducing the rate of errors (in design and implementation), and security improves.

But I do not want to waste time and resources on chasing the impossible. Aim to avoid or fix all the bugs you reasonably can within your time and resource budget. If you are not satisfied until you are guaranteed to have zero bugs in a system of reasonable complexity, you will never be satisfied - and that helps no one.

Yes, I have. You just haven't been interested in reading what I wrote. Maybe you'd like a chance to re-read my posts.

Testing does not prove that code is correct - it can only prove that code is incorrect. Good testing improves confidence in a system, and is an absolutely essential part of the development process. Many types of flaws will almost never be revealed during testing, however. That includes specification issues, race conditions, security against attacks, long-term problems, etc.
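As a small illustration of why race conditions are among the flaws testing rarely reveals, here is a sketch (function names invented for the example): the first function contains a deliberate data race - formally undefined behaviour in C++ - yet a quick test on an idle machine may pass every time; the second fixes it with std::atomic.

```cpp
#include <atomic>
#include <thread>

// Unsafe: two threads increment a plain int. Increments can be lost,
// but a short test run may never show it - a deliberate data race
// (undefined behaviour), shown only to illustrate the point.
int count_racy(int iterations) {
    int counter = 0;
    auto work = [&] { for (int i = 0; i < iterations; ++i) ++counter; };
    std::thread a(work), b(work);
    a.join(); b.join();
    return counter;   // may be anything up to 2 * iterations
}

// Safe: std::atomic guarantees every increment is counted exactly once.
int count_atomic(int iterations) {
    std::atomic<int> counter{0};
    auto work = [&] { for (int i = 0; i < iterations; ++i) ++counter; };
    std::thread a(work), b(work);
    a.join(); b.join();
    return counter.load();
}
```

The racy version can pass a test suite for years before the timing window is ever hit in the field, which is exactly why testing alone cannot establish correctness.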

You can be confident that you have a solid development process that minimises the risk of flaws, and you can be confident that you have no known bugs when you release your system, and you can be confident that you have processes in place to find and fix flaws that are discovered later. You can't get better than that.

You have /almost/ understood. But as we have seen so often in your posts, you have missed a key point. You see the world in black-and-white - you want "perfect". You can't have that - it does not even make sense as a concept for real systems.

Within a very limited context, it is possible to make a design that is bug-free. And that is fine - that should be your target when you are at that stage. A VHDL/Verilog module, or a C file, can be written without bugs. At that level, you have complete control of the system and it lives in an ideal world - there are no cosmic rays causing hiccups in the system, no glitches in the power, no unexpected inputs, no stray pointers in other code that trash your data.

As the system complexity increases, however, it becomes harder and harder to keep such guarantees of perfection. It must then be balanced against the costs of the system - no one wants a perfect product if the development costs are sky-high and the release time is next to eternity. (At the highest levels of software quality, for the most safety-critical systems, productivity for a programmer might be half a dozen lines of code per week. Are you willing to pay that cost?)

What /I/ would like to see is less ambitious, and more realistic - I would like to see more effort put into quality, and I would like to see people being willing to accept higher prices and longer development times to achieve that. I want progress - I want things to be better. I don't want to waste time and money pointlessly chasing a mythical "perfection". Can you understand that?

Some universities do that. My university degree emphasised provably correct software development. But I understand that while such code may be bug-free, it is only part of a more complete system.

Reply to
David Brown

Eliminate pointers. They are like giving babies loaded guns.

If the OS and compilers were properly designed, programmers would of course still write stupid code that was full of bugs, but they couldn't create security risks if they wanted to.

How much time and money are wasted on AV software and recovering from hostile exploits? On IT security consultants?

Provable correctness was a fad for a while, but seems to have died out. Programming itself is not of interest (and is in fact insulting) to many CS departments.

My concern isn't correctness, but security.

--

John Larkin         Highland Technology, Inc 

lunatic fringe electronics
Reply to
John Larkin

Crash-safe cars are doable. They'd be huge though. Fire-proof buildings are doable: all masonry construction with buried wiring, fibreglass curtains, metal furniture etc. Unsinkable ships are a taller order.

NT

Reply to
tabbypurr

But copying software is free. All you have to do is get it right once.

--

John Larkin         Highland Technology, Inc 
picosecond timing   precision measurement  

jlarkin att highlandtechnology dott com 
http://www.highlandtechnology.com
Reply to
John Larkin

If you're talking about The C Programming Language, then I have the hard copy. It's truly remarkable, the precision and clarity with which it has been written. Absolutely outstanding. Shame all reference books aren't in that class.

-- This message may be freely reproduced without limit or charge only via the Usenet protocol. Reproduction in whole or part through other protocols, whether for profit or not, is conditional upon a charge of GBP10.00 per reproduction. Publication in this manner via non-Usenet protocols constitutes acceptance of this condition.

Reply to
Cursitor Doom
[...]

Thanks, Jan. I use a similar dodge myself when bugs arise. Great minds think alike! :-)


Reply to
Cursitor Doom

Eliminate babies (or at least don't let them check in code without passing a review from more senior programmers).

You can't stop security bugs by changing the OS or compilers - not unless you want to cripple the functionality of the code. Suppose you are writing a web application, where different users should have access to different data or different parts of the app. Should the OS somehow know who can access the data? Should the compiler keep track of the user names and privileges? If not, then how on earth are the OS and compiler supposed to be "properly designed" so that the app can't have security flaws?

OS's and programming languages can have different balances between flexibility, efficiency, and safety from certain types of errors. There is /never/ a single "best" tradeoff here - it is going to depend on the type of program, its needs, and the type of development process. But one thing you can be absolutely, 100% sure about - no OS and no compiler is going to eliminate all bugs in user code.

How much time and money are wasted on non sequiturs?

I fully agree that AV software should not be necessary (I don't use it myself, and never have). I fully agree that security in OS's and programs should be better. I fully agree that a lot of people using low-level, high responsibility languages like C would be better off with languages that are higher level, more limited, or sandboxed in some way.

I don't agree that an OS can have "perfect" security, or that somehow a "properly designed" compiler can protect programmers from all errors.

I don't believe it has died out at all. But provably correct programming is impractical for any large-scale coding.

It's the same thing. Security holes in products are either bugs in the specification for the code, or bugs in the implementation of the code.

Reply to
David Brown

No, the problem is in communication between OS infrastructure (firmware) and drivers (vendor-supplied) and OS (with frequent 'updates') and libraries (all versions in play) and development environments that support software developed by teams that include computer scientists, applied mathematicians, code jockeys, system managers, and artists, followed by inputs from naive users.

Misunderstandings abound. No two of the above list are likely ever to talk over an issue, and you can see so-called security flaws in almost any innocent action. Sometimes those 'flaws' are correctable, but the whole process is just too fluid to call the final results 'reliable'.

No, it's not hard, it's an UNDEFINED TASK. Those words just don't amount to a specification that one can test to. You can find folk who will charge to do software testing, if you want to spend money, but what exactly should they test for?

Oh, there's a fresh start with each OS, firmware, driver, software update.

Oh, that's not the problem at all. You call it a mess because the love has ... gone. Ditto the investment.

Burn a bootable DVD of any software and OS that you can find to work well, and dedicate a machine to it. Or, keep old OS and software on a backup volume, and make virtual machines run it. That's why Linux is winning, you know: those actions aren't feasible with Windows's update/genuine/registry scheme.

Reply to
whit3rd

On Jan 2, 2019, snipped-for-privacy@nospam.org wrote (in article ):

Then you've already said way too much.

All true. But it is a solved problem. There is a huge literature, all paid for by the US DoD.

But expensive, I'll grant.

Where can one buy a nuclear weapon for a few K$? I want one.

Any major power can destroy a port. The problem is that the victim power will return the favor, with interest.

True enough, ever since WW2, to be able to manhandle the Warsaw Pact.

Hmm. Why didn't Stalin take the rest of Europe after WW2? He certainly wanted it.

Rome was taken by the Barbarians only after Rome had exhausted itself with civil wars, and had lost the ability to confine for instance the Gallic Wars to Gaul, far from Rome.

On the proper handling of Gauls, see: "Commentaries on the Gallic War", Julius Caesar et al, 58 to 51 BC.

The sack of Rome by the Visigoths under Alaric was in 410 AD, rather later.

Yeah, he is squeezing the Taiwanese. Or trying. But after watching how Hong Kong's democracy is faring under Chinese rule, the Taiwanese population does not trust Beijing's claims of one country, two systems.

China certainly wants to overtake the US, and perhaps they will do it someday, but I doubt that it will be all that soon.

The Chinese are famous for taking the long view, yet here they seem strangely impatient. They may manage to blow it.

The Kremlin was always worried about Mao's China. The fear was that Mao would simply order the Chinese population to start marching West, and simply swamp Mother Russia. No weapons needed. The Russians planned to respond with nerve gas and nuclear weapons. But now that the Chinese economy has eclipsed the Russian economy, it's not clear what Russia can do.

Joe Gwinn

Reply to
Joseph Gwinn

Why not both?

Use a language that doesn't use pointers.

Well -- they all use pointers internally, obviously, but you aren't allowed to do pointer arithmetic or unchecked, unbounded accesses. A Java array access (or certain string operations) might be straightforward pointer arithmetic underneath, but you aren't allowed to do it that way. You can't cast an Object to an arbitrary type and read its memory as an illegal object (an arbitrary pattern of elements in memory).
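The distinction can be sketched in C++ terms (function names are made up for illustration): the checked accessor behaves roughly like a Java array access, trapping a bad index instead of silently reading out of bounds.

```cpp
#include <stdexcept>
#include <vector>

// Raw pointer arithmetic: nothing stops an out-of-bounds read.
int read_unchecked(const int *data, int index) {
    return data[index];        // index 9 on a 3-element array is undefined behaviour
}

// Checked access: std::vector::at() throws instead of reading garbage,
// similar in spirit to what a Java array access does implicitly.
int read_checked(const std::vector<int> &data, std::size_t index) {
    return data.at(index);     // throws std::out_of_range if index is invalid
}
```

The checked form costs a comparison per access, which is the usual trade the post is describing: giving up raw pointer freedom in exchange for accesses that fail loudly rather than corrupt memory.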

Tim

--
Seven Transistor Labs, LLC 
Electrical Engineering Consultation and Design 
Website: https://www.seventransistorlabs.com/
Reply to
Tim Williams

It can often be a good idea to use raw pointers as rarely as possible - and to tie them down when you can. (For example, use pointer-to-const, put your arrays inside a struct, make access functions - anything that helps the compiler spot errors in the code.)

If you are using a language that supports more controlled access, such as C++ references, containers, smart pointers, etc., then these are usually a better choice.

But deep down, at the lowest level, you need to use pointers. If your language won't give you pointers, you are not going to get the most efficient code. That might be a fair tradeoff for most code, but not for everything.
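A minimal sketch of the "tie them down" suggestions above, with invented names (Samples, sample_get): the array lives inside a struct, access goes through one function, and pointer-to-const lets the compiler reject accidental writes.

```cpp
#include <cassert>
#include <cstddef>

// The raw array is wrapped in a struct, so it can be passed around and
// bounds information travels with it.
struct Samples {
    int data[16];
    std::size_t count;
};

// Pointer-to-const tells the compiler (and the reader) that this function
// only inspects the samples; any write through 's' is a compile error.
int sample_get(const struct Samples *s, std::size_t i) {
    assert(i < s->count);      // catch out-of-range use in debug builds
    return s->data[i];
}
```

None of this makes the code bullet-proof, but each restriction gives the compiler one more chance to spot a mistake at build time instead of letting it become a runtime corruption.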

Reply to
David Brown

Most old commercial and scientific code was written in languages that did not support pointers.

However, in Fortran for instance, you could pass multidimensional arrays of different sizes to a library subroutine, so there was not so much need for pointers. Of course, the pass-by-reference parameter passing could be abused to do some pointer-like tricks :-).

Reply to
upsidedown

On Jan 3, 2019, snipped-for-privacy@downunder.com wrote (in article):

Aside from the implementation of algorithms that require pointers, the big problem with Fortran was that it had no way to address something that it did not create and name, which made control of hardware difficult, because I/O registers appear in the memory space, looking like memory. There was no way to make Fortran recognize this, or to pass a pointer in. Nor could Fortran handle bitwise access all that well, or understand volatile. There were many workarounds, coded mostly in assembly, and later in C.
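For contrast, here is roughly what that workaround looks like in C/C++ terms (the function name and register type are illustrative): volatile tells the compiler that every access to the register must really happen, which classic Fortran had no way to express.

```cpp
#include <cstdint>

// 'volatile' forces a real memory access on every call; without it the
// compiler could cache the value in a register, which is wrong for
// hardware whose status register changes behind the program's back.
std::uint32_t read_status(volatile const std::uint32_t *reg) {
    return *reg;
}

// In real firmware the argument would be a fixed hardware address, e.g.
//   auto *status = reinterpret_cast<volatile std::uint32_t *>(0x40000000);
// (address invented for illustration - it depends entirely on the chip).
```

This is exactly the "address something you did not create and name" capability the post describes: the language lets you conjure a pointer to a location the hardware defined, not the compiler.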

Ada was also hopeless for hardware control, and so inherited the same kind of workarounds, which for political reasons had to be in assembly - C was anathema.

Joe Gwinn

Reply to
Joseph Gwinn

If 'reliability' were the only virtue of a computer, we'd still be using slide rules (one moving part, no electricity).

Two moving parts, unless your cursor fell off. Or maybe your slipstick just needs lubrication. ;)

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
ElectroOptical Innovations LLC / Hobbs ElectroOptics 
Optics, Electro-optics, Photonics, Analog Electronics 
Briarcliff Manor NY 10510 

http://electrooptical.net 
http://hobbs-eo.com
Reply to
Phil Hobbs

OTOH, executing user code should never bring down the operating system. APL was even more of a cryptic, terse, write-only language than Forth.

There were languages that were designed from the outset to be very high reliability but they have never gained traction in the commercial sphere where time to market is everything.

Exploit-hardened code is more expensive because you have to guard against very creative exploits that may be far in the future. No one foresaw the long-term problems that would come from some of the clever go-faster stripes for caches put into modern CPUs until it was too late.

Don't hold your breath. New hardware paradigms intended for reliability like the Transputer and Viper came and went - the latter amid much acrimony. I don't doubt that eventually CPU cycles will be so cheap and human cycles so expensive that using some of the tools already available for static testing of data flow in code will become more common.

--
Regards, 
Martin Brown
Reply to
Martin Brown

He didn't have nukes till '49, which was way too late.

BTW you're falling foul of the mud-wrestling rule. ;)

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
ElectroOptical Innovations LLC / Hobbs ElectroOptics 
Optics, Electro-optics, Photonics, Analog Electronics 
Briarcliff Manor NY 10510 

http://electrooptical.net 
http://hobbs-eo.com
Reply to
Phil Hobbs

The Transputer and Occam are alive and well in the *hard*[1] realtime embedded arena with at least one of the original team there (Prof David May).

They are now called xCORE and xC, from XMOS, have been expanding since 2008, and can be bought at DigiKey.

As for Viper - they almost got it right, but a miss is as good as a mile!

[1] The IDE *guarantees* timing to the instruction level (i.e. 10ns), by inspecting the (optimised) object code. None of this "measure and hope we encountered the worst case" nonsense :)
Reply to
Tom Gardner

Plus you need a special keyboard, which is why I never got into APL despite there being a strong APL subculture at IBM Watson.

It's pretty amazing though--it sticks bloody-mindedly to lazy evaluation. That's magic.(*) An elevator controller running APL can (apparently) invert a 1000x1000 matrix instantaneously. It just computes the elements you actually use.
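The compute-only-what-you-use behaviour described above can be sketched in C++ (LazyMatrix and identity are invented names; this illustrates the semantics, not APL itself): constructing the matrix stores only a rule, and each element is evaluated on demand when it is asked for.

```cpp
#include <cstddef>
#include <functional>

// A "matrix" that stores no elements at all - just its size and a rule.
// Creating a 1000x1000 instance costs essentially nothing; only the
// elements actually requested are ever computed.
struct LazyMatrix {
    std::size_t n;
    std::function<double(std::size_t, std::size_t)> rule;
    double at(std::size_t i, std::size_t j) const { return rule(i, j); }
};

LazyMatrix identity(std::size_t n) {
    return {n, [](std::size_t i, std::size_t j) {
        return i == j ? 1.0 : 0.0;   // computed on demand, never stored
    }};
}
```

An elevator controller reading a handful of entries from identity(1000) would trigger only those few evaluations, which is the essence of the anecdote.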

Cheers

Phil Hobbs

(*) R. Kipling, 'How the Rhinoceros Got His Skin', from "Just So Stories". Describing the Parsee's cake: "It was indeed a Superior Comestible (that's magic)."

--
Dr Philip C D Hobbs 
Principal Consultant 
ElectroOptical Innovations LLC / Hobbs ElectroOptics 
Optics, Electro-optics, Photonics, Analog Electronics 
Briarcliff Manor NY 10510 

http://electrooptical.net 
http://hobbs-eo.com
Reply to
Phil Hobbs

Memory mapped I/O was not a problem, at least on PDP-11.

Just create a named COMMON area with 4096 INTEGER*2 variables, each representing an internal CPU register or UNIBUS peripheral register. The named COMMON creates a named program section, use the linker to base it at 160000 (octal) to the I/O page.

For a processor without memory management, that is all you have to do. Accessing a UNIBUS card register is then just reading or writing an integer variable. When memory management was used, the task had to be made privileged, to map the top 4 KiW of 18/22-bit physical memory into the top 4 KiW of the 16-bit user address space.

As for volatility, use a separately compiled function to copy a value from the common area (I/O page), since separate compilation would break any optimization.

The problem with high level languages of the time was that they were not fully re-entrant, e.g. for handling ISRs.

Long before C, BLISS
formatting link
was used as a system programming language.

There were no political reasons; the problem was simply bad re-entrancy of old compilers. I remember one Pascal compiler for the 6809, which had 'display' registers and a run-time library that was not re-entrant, so it was usable only in the null task (user interface) in an RTOS. Using that Pascal for ISRs was completely unthinkable.

Reply to
upsidedown

They are far too useful to eliminate completely. Do you really want to pass everything by value - even huge multidimensional arrays?

If you allow array descriptors which include a hard maximum bound then code can be made a lot safer. But there are always ways and means to subvert the system - usually where input output is concerned.

Languages that enforce absolutely strict variable typing have to provide an input and an output method for each and every type. It can get clumsy and very hard to read, even if the intention is to make it clearer!

There are quite a few static analysis tools that could go a long way to catching many of the sorts of mistakes that human programmers make - but sadly they are not often used :( People always look hurt when I run such tools against a supposedly working codebase and find things wrong.

And if pigs had wings they could fly. Compilers these days are quite remarkable in terms of how well they can optimise code for a given architecture. They do it so well that only a small number of experts can beat them now - especially if you have profile directed optimisation.

There will always be hostile exploits across the boundaries of software and external data. Even essential components of a normal operating system can be used to wreck it. A program that starts two copies of itself is a classic example; another, which would arise with monotonous regularity every year, was some naive user trying to transpose a large matrix, bringing the OS to a standstill with page faults.

Provable correctness is still used in places where it really matters. I was a fan of Z and VDM for a while. But the level of mathematics needed makes it hard to train people to use it. ETH Zurich computer science department had some very good stuff from the Wirth stable of languages including a very early circuit design computer well ahead of its time.

They tend to go together. Badly engineered software will tend to fail in unpredictable ways and some of those can be exploited. Bits of the project still live on in another guise.

formatting link

Amusing fact - Logitech of mouse fame started out as a reseller of the IBM PC port of ETH Zurich's Modula 2 compiler.

--
Regards, 
Martin Brown
Reply to
Martin Brown
