Modern debuggers cause bad code quality

I have always viewed C as the universal assembly language, and I disagree that it is more error-prone than other languages. A bad programmer writes bad code in any language.

Reply to
Ed Prochak

It may have been that in the past, before standardisation and before compilers became ambitious about optimisation and code speed. Nowadays, standard C has more "gotchas" and hard-to-remember rules than a typical real assembly language (reference: recent discussions on comp.arch about gcc "miscompiling" typical C programs, because gcc assumes that C code with undefined behaviour can, per the standard, do anything).

Have you read the Zeigler study "Comparing development costs of C and Ada"? It strongly suggests that C is more error-prone, and C programs harder to repair, than Ada. It documents a gradual switch from C to Ada within one company, with the same development procedures and the same developers, and found a reduced bug rate and repair time with Ada.

formatting link

A good language can guide "bad" programmers towards better habits.

--
Niklas Holsti 
Tidorum Ltd 
niklas holsti tidorum fi 
       .      @       .
Reply to
Niklas Holsti

+42

It's all about process.

Reply to
Don Y

Well it is undefined behavior. :)

Yes I will read it.

I like Ada. Wish I could use it more often. In fact, I may propose it when we look at changing the host OS on our next product. It does help.

I guess there we will have to agree to disagree.

And to keep this about the debugging topic: I have a general rule which you could say applies to testing.

It is not how well it works when it works. It is how well it works when it doesn't work that matters.

IOW, how you handle the error conditions can make or break a product. I'm working in the medical devices field now and I always keep Therac-25 in the back of my mind when designing solutions.

A minor corollary: race conditions are never solved by sleep();
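
A minimal sketch of why (names are illustrative; POSIX threads assumed): sleeping inside a check-then-act sequence only moves the timing window around, while a mutex actually removes it.

#include <pthread.h>
#include <unistd.h>

static int resource_busy = 0;
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

void broken_acquire(void)
{
    while (resource_busy)       /* check ... */
        sleep(1);               /* ... the sleep only shifts the window */
    resource_busy = 1;          /* ... act: two threads can both get here */
}

void safe_acquire(void)
{
    pthread_mutex_lock(&lock);
    while (resource_busy) {
        pthread_mutex_unlock(&lock);
        sleep(1);               /* still polling, but no longer a race */
        pthread_mutex_lock(&lock);
    }
    resource_busy = 1;          /* check and act are now one atomic step */
    pthread_mutex_unlock(&lock);
}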

Reply to
Ed Prochak

But the standard hasn't changed: most behaviours that are undefined now have always been undefined. In the era of less aggressive optimizing compilers, people wrote code all the time intending a particular behaviour that was actually undefined; the compiler happened to do what the user expected, so the user thought their code wasn't buggy. These days, with today's compilers, that same code results in nasal demons, just as the standard specified all along. If C is an assembly language, it's an extremely treacherous one.

Reply to
Paul Rubin

I disagree -- I don't think the inherent qualities of C lead to poor quality. I think it is the *availability* of the language/tool that has led to a wider variety (read: "range of capabilities/skill levels") of folks *using* that language.

[The same argument as the "modern debuggers cause bad code quality" claim]

C is relatively easy to port to different machines/architectures. It's "cheap" to implement (at runtime and compile time). It's reasonably transparent (important for folks who have to deal with the underlying hardware -- like writing OS's, drivers, etc.). It's reasonably expressive (I don't have to write a floating point library in ASM for every machine on which I want to develop apps).

Unfortunately, it allows too many "not of the priesthood" to practice its faith! (groan) And, unlike more benign languages (e.g., BASIC), they can actually do serious harm to their code that may or may not be noticeable.

[I don't know if you recall the sorts of cruft folks would write in BASIC... clearly, no understanding of good program/application design... "but, it works"! Counting with a "float"? Gack!]

One of the unspoken goals of most HLLs is to make programming more "accessible"... NOT to require "ordained ministers" but, instead, to let (practically) "lay folk" write code (hey, if *they* can do it, then professionals should be able to design *golden* apps!)

Everyone uses floats. Most are aware of round-off error. But how many think about order of evaluation when hacking together a series of operations? Cancellation? Etc.
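
A small illustration of both points (constructed values): float addition is not associative, and subtracting nearly equal values cancels most of the significant digits.

#include <stdio.h>

int main(void)
{
    float a = 1.0e8f, b = -1.0e8f, c = 1.0f;

    /* Order of evaluation changes the result. */
    printf("%g\n", (a + b) + c);    /* prints 1 */
    printf("%g\n", a + (b + c));    /* prints 0: c vanishes in rounding,
                                       since the ulp at 1e8 is 8 */

    /* Catastrophic cancellation. */
    float x = 1.0000001f, y = 1.0000000f;
    printf("%g\n", x - y);          /* ~1.19209e-07: barely any correct
                                       digits of the "true" 1e-7 remain */
    return 0;
}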

On the one hand, more people can do more things with these "improved tools". OTOH, that doesn't mean they know how to do those things *right*. Worse yet, they may not know what they *don't* know!

It's relatively easy to get a piece of code to LOOK like it is working -- and then move on (oblivious to the fact that it really may not be working PROPERLY).

Try to imagine what it would be like if there was pressure to make "practicing medicine" as accessible as programming has (tried to) become.

Reply to
Don Y

It is obvious enough for me. The fact is that C tries to be a "universal assembler", as some people see it, and does it poorly (too abstracted from any machine model). There are a lot more details about my vpa which allow me to do things people just can't do in C, but they are way too lengthy for me to explain to myself, let alone to other people from the trade, so I won't go into them; nor would any sane person want me to :-).

Telling people "don't eat too much" and thereby eradicating obesity would be much, much easier than what you suggest.

Exactly. This is the basic flaw of high-level languages. Instead of dealing with text they deal with hieroglyphs, which is much less efficient than using an alphabet and coining your words as you go, evolving the language to fit the whims of life. The basic flaw of any (too) high-level language is its lack of flexibility to adapt to an ever-changing world. Sure, changes are made and the phrasebooks get rewritten, but how is that comparable with just adding new words to the dictionary and twisting the language without needing any "official" approval? Before the change happens, years will have passed and gigatons of poor software will have been written (poor simply because the language was not up to date with reality).

OK, your previous post left me with the impression you did; I must have misunderstood you.

Well if this is "the same" the way all human languages are "the same", I could agree. Only if so.

If a project which takes over a month of programming is "too large" in your book, then OK, I will agree with you that copying this and that and putting something together in a week or two is better done using a high-level language, yes. But one month of programming is nowhere near a "large project" in my book.

Dimiter

------------------------------------------------------
Dimiter Popoff, TGI

formatting link

------------------------------------------------------

formatting link

Reply to
Dimiter_Popoff

Of course I agree that C has evolved to what it is for a reason. It is just that the reason is much too distorted -- e.g. the messy x86 architecture was one of the main reasons most people (other than myself :-) ) abandoned further development of lower-level languages. Then good architectures simply came with barely usable assembly -- e.g. power, the best architecture I know of, with mnemonics no sane person would try to write much code in (which is why I did my vpa for power).

I understand that I am practically alone against the rest of the world, so I am not trying to convince anyone here. I am just stating my thoughts; someone, some day, might find something he is after. For now I just use my vpa to the advantage of what I design :-). (It owes a lot to 68k assembly, which I built on; in fact it can still "assemble" 68k sources and produce power code, though it can do more, especially when it comes to handling variables in the text, macro flexibility, etc.)

Dimiter

------------------------------------------------------
Dimiter Popoff, TGI

formatting link

------------------------------------------------------

formatting link

Reply to
Dimiter_Popoff
[]

That was an impressive study. They shot down pretty much every counter-argument, other than that Ada may not do as well for small projects.

THANKS. I look forward to their updates (data past 1994, and the other study on C++).

Reply to
Ed Prochak
[]
[]

Sounds like FORTH.

No language will ever be able to keep up as you propose and still be used by many developers. Applications reflect the reality; the language only helps.

[]
[]
[]

I guess that is why you keep your language to yourself. What do your customers do when you finish the project? Or is this the "code so complicated I can never be fired" approach to programming?

Especially when I program under contract, my goal is essentially to work myself out of the job. I have even had folks who took over my code compliment me on its maintainability. It sounds like you take the opposite approach. Good luck with that.

ed

Reply to
Ed Prochak

I have never said it is easy to learn to write good code in a lower-level language. It is not easy to write good literature in English either; few people learn the language that well. Yet English and other alphabet-based languages are distinctly superior to hieroglyph-based ones when it comes to literature.

I am sure you will find my vpa code easier to read than code written in C or whatever high-level language you are most familiar with. There is no phrasebook you have to learn for years, remember. Writing is another matter, of course, but maintaining code does not take a lot of it.

Like I said earlier, I am not trying to convince anyone. And now I must really turn a not-so-small thing in my vpa into something I can finish by tomorrow night, so I will have to stop posting here and get back to work :-).

Dimiter

------------------------------------------------------
Dimiter Popoff, TGI

formatting link

------------------------------------------------------

formatting link

Reply to
Dimiter_Popoff

+1

But, it can't prevent the programmer from abusing the language (in ways that the language designers may never have imagined) and still cranking out garbage.

Any tool/policy/process (ideally) wants to make "doing it right" (whatever that means) EASIER than "doing it wrong". So, experience trains the user to take the path of least resistance FOR HIS OWN INTERESTS.

Reply to
Don Y

I think that after the mid-1990s, the role in the regular IT industry where Ada might have displaced a lot of C and C++ was instead filled by Java. Ada was relegated to a defense/aerospace realtime niche. But here's an old article comparing C++ to Ada-95:

formatting link

Reply to
Paul Rubin

I haven't seen the thread in comp.arch, but do you have any particular situations in mind?

And what do you suggest gcc /should/ do about undefined behaviour? Make wild guesses about what it thinks the user actually intended?

In some cases, the compiler will allow the user to define the behaviour -- such as with compiler flags that make signed integers overflow as two's complement (even though code will almost never use such a "feature", and changing it removes some optimisation opportunities in good code). In many other cases, undefined behaviour is fairly obvious if the programmer thinks about it (and programmers /should/ think!) -- dividing by zero is undefined, so the compiler can assume that you don't care what will happen if you try it.
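
For example (a sketch; gcc's -fwrapv is the flag in question): with default options, gcc may assume signed overflow never happens and fold the test below to "false", while -fwrapv defines overflow as two's-complement wrapping and preserves it.

#include <limits.h>
#include <stdio.h>

/* Undefined behaviour when x == INT_MAX; without -fwrapv, gcc may
   compile this function to a constant "return 0". */
int will_wrap(int x)
{
    return x + 1 < x;
}

int main(void)
{
    printf("%d\n", will_wrap(INT_MAX));   /* may print 0 or 1 */
    return 0;
}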

Reply to
David Brown

"Not instrumenting your code" is a red herring. It can take me as much if not more time to configure a debugger as adding some output code to my program.

I don't get the "for free" part. There is always a *special* debugger pod that requires a unique connector on the board... a connector which is virtually never used in the intended app of the unit, while a serial port can serve double duty in minimal hardware designs, or a built-in display can serve double duty.

Not sure what you mean. The debugger interface can be used in production for programming the Flash, but my experience has been that production people don't want to deal with design tools in production.

--

Rick
Reply to
rickman

Yes, it is sad that the language "required" for American government contracts is popular mainly in Europe.

Maybe we should start another thread about languages and how they seem to be picked by popularity and not applicable features.

Reply to
Ed Prochak

Exactly. This is the argument I use with clients who don't want to specify how something should behave in a particular set of circumstances. "You want me to read your mind? Heck, why don't I just do what's EASIEST for me? Even if that's not what you would have wanted had you taken the time to consider the behavior that you WANTED, here!"

Really. Think about it. You're telling the compiler (in your code) to do "something"... but, how is it supposed to know what that "something" is -- when CONTRACTUALLY you've opted to do something that is not defined?

Gee, what a great catch-all for BUGS! "It's not MY fault the code is misbehaving! It didn't read my mind properly!!"

Reply to
Don Y

That's just not been my experience. Since the mid 80s, I can count the number of times I've felt like going to assembly on one hand.

But there really is a problem using "English-like" words. COBOL went that way and, while not exactly deprecated, isn't widely used outside of, say, banks.

Seems like punctuation marks are pretty useful.

I personally do not find this an impractical limitation.

I feel like 'C' is a better choice. The set of programmers for it is larger and it's modestly more expressive.

Ah - well, it takes some digging, and you have to be prepared to ignore differences that are smaller :), but all human languages can be arranged in a tree structure. It turns out there might be more in common than in difference. Differences tend to be things added after a population moved to a different place and the language evolved.

You won't get to a million lines in a month. Ten times what you get in a month won't take ten months; it'll likely take more - complexity is arguably O(n^2) or O(n log n) in the number of lines, using the term "complexity" to approximate cost. (If cost grows as n^2, ten times the lines means roughly a hundred times the cost.)

Nor mine. I just mean that a project should be small enough to make verification and validation tractable problems.

If you have, say, a Linux distro, which is hundreds of thousands or millions of lines, it's actually multiple smaller projects bundled together.

--
Les Cargill
Reply to
Les Cargill
[...]

It takes seconds to get a live display of dozens of variables, including structs and arrays. That is hardly possible by "adding output code", and it would likely create a runtime execution-speed problem. BTW, instrumenting code is usually more invasive than background debug.

Cheap these days; e.g. Segger J-Link comes with free eval boards. Better adapters start below 1000 EUR.

The port is usually necessary for in-system-programming.

A scriptable debugger enables automated tests in the target hardware.

Production people use production tools, but the same "connector" on the target. It can be just four pins.

Reply to
Oliver Betz

The subject of the thread was "If It Were Easy...".

IIRC, some of the things discussed in the thread were strict aliasing rules and other pointer-punning and type conversion issues -- apparently memcpy() is the only well-defined way, and the traditional "union" trick is not. But surprise! gcc may optimise away a memcpy() call, possibly just reusing the source data in situ.
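
A minimal sketch of the issue (function names are illustrative): reinterpreting the bits of a float as a 32-bit integer.

#include <stdint.h>
#include <string.h>

uint32_t bits_via_cast(float f)
{
    return *(uint32_t *)&f;     /* violates strict aliasing: UB */
}

uint32_t bits_via_memcpy(float f)
{
    uint32_t u;
    memcpy(&u, &f, sizeof u);   /* well defined; compilers typically
                                   optimise the copy to a plain move */
    return u;
}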

Also discussed was one case in which code in the Linux kernel first dereferenced a pointer, and then tested if the pointer was null -- gcc omitted the test, because the dereference would cause undefined behaviour for a null pointer, so only the non-null case needed to be compiled. The programmer apparently knew that, in the kernel context, a null pointer can be dereferenced without harm. But what was done after the test failed badly if the pointer was null.
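
A hedged sketch of that pattern (the struct and names are invented, not the actual kernel code):

struct device { int flags; };

int get_flags(struct device *dev)
{
    int f = dev->flags;     /* undefined behaviour if dev is null */
    if (dev == NULL)        /* gcc may delete this test: the dereference
                               above lets it assume dev != NULL */
        return -1;
    return f;
}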

I think there was also mention of gcc making loops eternal, or deleting them entirely (I don't remember which) if loop termination depends on signed-integer overflow.
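
Something like this (a constructed example): the loop is meant to stop when i reaches INT_MAX, but since signed overflow is undefined, the compiler may treat the condition as always true and emit an endless loop.

int count_to_max(int start)
{
    int n = 0;
    for (int i = start; i + 1 > i; i++)   /* overflow is UB, so "i + 1 > i"
                                             may be assumed always true */
        n++;
    return n;
}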

It was a long and bitterly argued thread, where the "traditionalist" C-is-a-portable-assembler advocates essentially claimed that the C standard committees and gcc maintainers are pushing C to become too much a high-level language and are destroying its predictability for low-level programming, except in the hands of very careful programmers.

I don't have much of an opinion (I avoid using C when I can).

I think I see the point on both sides of the argument. The traditionalists want C compilers that emit machine code that "does the same thing" as the source code: if there is a pointer dereference in the source, there should be an indirect load/store in machine code, even if the pointer might be null; if the source tests for a null pointer, there should be a comparison instruction and conditional branch in the code, even if earlier code has done something that makes the behaviour undefined if the pointer is null.

The modernists want C to have a well-defined standard and application portability where possible, which unfortunately (given what C is like) means that many things one can write in C, and even compile, will have undefined behaviour across implementations -- but whether the behaviour is defined or undefined (or something in between, such as implementation defined) often depends on run-time dynamic things, so the compiler cannot just reject such code.

What IMO is doubtful is for the compiler to latch on to the possibility of undefined behaviour of some part of the code, under some circumstances, and eagerly assume that in those circumstances it does not matter what the code does, in that part or following parts. The modernists claimed on comp.arch that this compiler behaviour follows from the general code optimisation methods, and that it would be hard to report, as warnings, optimisations that depend on the "no-undefined-behaviour" assumption. I'm not quite convinced about that.

But in some assembler programs, specific run-time errors such as divide-by-zero are sometimes triggered on purpose. Those who see C as a portable assembler would like the expression "1/0" to generate a division causing this error, even if the behaviour is undefined in the C standard.
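
A sketch of that intent (hypothetical): volatile keeps the compiler from folding the division away at compile time, but the standard still makes no promise that a trap occurs.

void force_divide_trap(void)
{
    volatile int zero = 0;
    volatile int result = 1 / zero;   /* UB: the traditionalist expects a
                                         hardware divide-by-zero trap here,
                                         but anything may happen */
    (void)result;
}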

I agree that, formally speaking, C was never a "portable assembler". It was just the simple compilers that made it appear so.

--
Niklas Holsti 
Tidorum Ltd 
niklas holsti tidorum fi 
       .      @       .
Reply to
Niklas Holsti
