Please don't insult pigs!
C basically gets that from assembler.
Yes - much better good old fashioned messages such as "Error 1267 - refer to manual"
--
Dirk

http://www.transcendence.me.uk/ - Transcendence UK
http://www.theconsensus.org/ - A UK political party
http://www.onetribe.me.uk/wordpress/?cat=5 - Our podcasts on weird stuff
Non-overly-wordy languages don't have such errors in the first place...
The real problem is that the market tolerates such things. If a building fell down the day after it was finished, someone would get seriously sued. That very rarely happens in computing (it's not unprecedented, but certainly rare).
Another problem is the extent to which companies get away with selling vapourware, and overstating the capabilities of the stuff that actually exists. In any other industry, the word "fraud" would come readily to mind.
Some years ago, I wrote this page
Sylvia.
Best programmer I know was originally a baker. Seems there's a hidden resource out there.
The point being that the gunnery education was worse than useless. Cf. computer science?
John
Is that really the way they spell "triple" in Aussie-land?
I'll ask my client in Adelaide :-)
...Jim Thompson
--
| James E.Thompson, P.E.                            | mens      |
| Analog Innovations, Inc.                          | et        |
| Analog/Mixed-Signal ASIC's and Discrete Systems   | manus     |
| Phoenix, Arizona 85048    Skype: Contacts Only    |           |
| Voice:(480)460-2350  Fax: Available upon request  | Brass Rat |
| E-mail Icon at http://www.analog-innovations.com  | 1962      |

I love to cook with wine. Sometimes I even put it in the food.
Should be trippple.
John
Could be true. Don't ask me about it. I have taught it at University level because they needed someone to fill in for a while, until they could "find a steady." But physics is my world, not computer science, and at the time I was learning I think people were better served learning computer science on the side, not as a field of study absent others. It's the way I did it. Never took a single computer science course in my life.
So no argument from me about that period. I'm not sure about today. The general argument you make may still hold substantial water, but I can imagine some possible cases where one might debate things a little. Still, I'd probably want to take your side on it.
Jon
They are making a comeback in some microcontrollers.
We could go to segmented non-flat memory architectures if people really believed security mattered. The trouble is that a truly secure Windows OS would break all the legacy peeky pokey games code. And to be fair computer games are actually a lot more reliable than the average consumer shrink wrap software.
Better to have the extra word than to find that the compiler has silently made the wrong assumption about what you had intended. Allowing the compiler to suggest corrections during compilation is one way out of this - several modern compilers and IDEs can highlight things they don't like and/or offer continuations. Syntax directed editors are a step in the right direction.
The error messages that really wind me up are of the form "missing END" or "missing }" detected at end of file (rather than pointing at or giving the line number of the unbound opening bracket). I have lost count of the number of compilers where this happens when a single mismatched opening bracket sneaks in.
Regards, Martin Brown
That has only been true in the era of 32-bit CPUs.
Prior to that the physical address space of a machine was typically twice the width of its working registers (although some had a more limited subset). And in the very early days the length of a byte was not always 8 bits.
Regards, Martin Brown
No, it was (often) true in DOS on the 8088.
The Z-80 also had registers that could hold a 16-bit pointer or do 16-bit arithmetic.
I don't know a whole lot about the architecture of the VAX that the first C compiler was developed on, but I'd be most surprised if it didn't use the same registers for both arithmetic and indirection.
I think this integer versus pointer thing is a bit of a red herring anyway. Early versions of 'C' were a bit strange by modern standards, but ANSI C requires that pointers have a defined type before they can be used.
The problems lie in the fact that pointers can be manipulated in ways that mean there is no guarantee that the pointer actually points at something of the type specified for the pointer. And in any case, the memory pointed at might have been freed back to the memory pool and reused, or might be in a stack frame that no longer exists. Such errors are easily made, and create potentially exploitable holes (if you're lucky the program simply breaks, but there are no guarantees).
It doesn't help that management can be so indifferent. Way back when, I'd implemented memory routines for C that performed various checks for improper memory use - for example, guard values at the beginning and end of allocated memory areas that were checked when the memory was released. When these checks were flagging errors in a piece of software that was not otherwise failing in any obvious way, my manager asked me whether I could just suppress the error messages that resulted. I point-blank refused.
Sylvia.
The Intel engineers weren't smart enough to figure out how to use/implement it?
Thanks, Rich
Nah, they figured out the Harvard architecture for the 8051. ;-)
Most of Intel's architectures were horrible kluges, out of the mainstream of computing. One can say the same about Microsoft's software.
John
It isn't C's fault that compiler (and OS) developers provide the simplest implementation rather than the most robust. C doesn't actually require you to use a single stack for both data and return addresses, nor a common address space for code and data.
Nor is it C's fault that programmers omit bounds checking (or, more generally, that they are writing applications using a language which was created for writing operating systems and device drivers). Sure, there are languages which will do bounds checking for you. But you wouldn't want to (or even be able to) use many of them for writing device drivers or code which needs to fit into a few KiB of RAM.
It's not just the omission of bounds checking by programmers that has caused problems. Sometimes programmers have tried to do the right thing, but simply got their code wrong. Humans are fallible, and it's an easy mistake to make. Some errors are quite subtle - even code that is superficially correct can be subverted if it's vulnerable to specified sizes that are treated as signed quantities in one place, and unsigned in others.
What's required is someone with deep pockets who suffers damage at the hands of faulty MS software written in C or C++ to sue for negligence - the negligence being that the software was written in those languages. Then the message might start to get out that using these languages for application software is just too risky.
I'm not holding my breath though.
Sylvia.
Much of the x86 cruft was to maintain backward compatibility. IBM has done the same, though more successfully. ;-)
There you have me. M$ hasn't even tried to maintain backward compatibility. Different interests.