Code metrics

By warning instance. They claim to be able to track issues even if the source changes: a new warning of the same kind at a different place counts as a new issue, but an existing one that moves stays the same issue.

Stefan

Reply to
Stefan Reuther

Dealing with that for a single compiler is not too bad; most let you insert a pragma to disable a warning. Just insert the disable before the line of code in question, re-enable after, and you're good.

Then you get to the second compiler, and ugh...

The worst is when you get a compiler that starts warning you about something perfectly legit, or even a standard idiom (MSVC, for example, likes to warn about the idiom where you wrap a macro body in "do {...} while(0)").
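For anyone who hasn't run into it, a minimal sketch of the idiom (the SWAP macro here is made up for illustration):

  /* do { ... } while (0) makes a multi-statement macro behave as
     one statement, so it composes safely with if/else: */
  #define SWAP(a, b) do { int t = (a); (a) = (b); (b) = t; } while (0)

MSVC's C4127 ("conditional expression is constant") fires on the constant while (0), even though the construct is deliberate.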

We have a no-warnings rule here, along with high warning levels, and it can certainly be a game of whack-a-mole at times.

At least something like PC-Lint lets you use the same annotations on multiple platforms.
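For illustration, a sketch of what such an annotation looks like (message 534, "ignoring return value", is used here as an example; check the lint manual for the exact numbers):

  /*lint -save -e534 */
  printf("status\n");   /* return value deliberately ignored */
  /*lint -restore */

The same comment syntax travels with the source to every platform, since it's the lint tool, not the compiler, that interprets it.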

Reply to
Robert Wessel

That's where this subthread started.

Then, please fix:

  someptr->somearray[someindex].m_u16++;

("warning: conversion from int to uint16_t may alter its value")

This sounds great in theory, but is not such a good idea if your project is behind schedule and HR is already bringing on people from across the continent. (Plus, workers' rights legislation probably does not allow "you programmed a warning" as a firing reason.)

The more sustainable solution is to educate people and develop best practices. Things like: make your Thread::run function return void, not int, or void* (POSIX), or DWORD (Win32). This avoids warnings ("no return statement") in daemon threads and also keeps people from taking the shortcut of returning fancy things through it. Or: do not put the "myenum_MAX" value at the end of enums; instead, make it a separate 'static const uint'. This avoids the "enum not handled" warning and also the mixed-type comparison warnings mandated by MISRA when you're checking whether a user-supplied integer fits into the enum. But that needs time (and resources) to develop, and permission to refactor.
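A minimal sketch of the enum suggestion (all names invented for the example):

  #include <stdint.h>

  typedef enum {
      COLOR_RED,
      COLOR_GREEN,
      COLOR_BLUE
      /* deliberately no COLOR_MAX sentinel */
  } color_t;

  /* The bound lives outside the enum, as an unsigned constant: */
  static const uint32_t COLOR_COUNT = 3u;

  /* Range-checking user input compares unsigned against unsigned, so
     no mixed-type warning; and switch statements over color_t never
     see an "unhandled" sentinel value. */
  static int is_valid_color(uint32_t raw)
  {
      return raw < COLOR_COUNT;
  }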

Stefan

Reply to
Stefan Reuther

No. That doesn't tell you which warning you've disabled checking for (at least not with the compilers and tools I have experience with). And it doesn't tell you when the compiler/tool no longer finds a reason for a warning there.

Yes. :-(

So far I usually don't have the pleasure of developing projects for compilation with multiple compilers, but it is definitely something that may make life "interesting" - even if it technically shouldn't be much worse than handling warnings from various tools (besides the compiler).

Greetings,

Jacob

--
"But I mean it's so unfair ... but I don't know who for yet" - Djibard
Reply to
Jacob Sparre Andersen

I can't without more context.

But I have to get through DO-178 certification. It does not help to just crank out code, even if it would work. I don't need to fire him/her, but that person should not be part of the project team without some real education. I agree with your next paragraph, of course. And with proper planning - and that _is_ part of the software development process - the sudden need for more manpower should not arise.

Totally agreed.

--
Reinhardt
Reply to
Reinhardt Behm

I think both of those criteria may prove to be elusive! You can't count on pragmas (portably).

You *could* use comments -- but how do you tie a particular comment to a particular line of code? What if a line throws multiple warnings? Especially if you want to explain why a warning is unnecessary! (If you're going to go to that length, why not just fix the code??)

You could wrap the offending term/expression/etc in a macro that doesn't alter the code generated -- but, isn't that the same effort as fixing the code?

Etc.

See above. At what point does the programmer decide it's just easier to silence the warning (perhaps INCORRECTLY doing so)?

You'd also face problems when the compiler was revised (or, in my case, when you move to a different compiler). You'd have to create a set of "named exceptions" and then find a way to map specific warnings from specific compilers *to* those exceptions. (this would also give you a way to handle compilers that don't warn about as many things)

No. *My* solution is to just not accept warnings. It means that each time I port code to a different platform, toolchain, etc. I end up having to tweak the code a bit. Sometimes, compilers catch behaviors that I was sloppy with (because older compilers gave me more "rope" -- with which to hang myself!). Other times, the compiler is just "awfully helpful".

The *language* needs to be fixed instead of inventing kludges to bolt on capabilities that the compiler should have inherently.

Personally, I'd love much stricter type checking, etc.

E.g., if foo() can return three values {A, B, C}, then I'd rather have a type called foo_return_type_t that is an enumeration of those three values (regardless of their *actual* "values") so statements like:

  if (foo(...) < 2) {}

get flagged. "What the hell is '2'?" -- even if, e.g., (effectively)

  #define B (2)

"Oh, you mean 'if (foo(...) < B)'?"
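A sketch of the idea in C (names taken from the example above; note that plain C will not flag the bare-integer comparison, which is exactly the complaint -- lint tools or C++'s enum class can):

  typedef enum { A, B, C } foo_return_type_t;

  static foo_return_type_t foo(void) { return B; }

  static void caller(void)
  {
      if (foo() < B) {      /* intent readable: named value, not '2' */
          /* ... */
      }
      /* "if (foo() < 2)" also compiles in plain C -- the stricter
         type checking wished for here would reject it. */
  }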

Reply to
Don Y

(snip on warnings, compilers, and debugging)

But what if the code isn't broken? Just because some compiler writer thought that you shouldn't do something doesn't mean that it is wrong.

Note that Java requires the compiler to figure out that an assignment is made to a scalar variable (not arrays, though) before the value is used. Compilers are getting better, but there are still some cases that the compiler can't figure out. I then put an initializer on it with a comment like

int i=0; /* the compiler didn't figure this out!!! */
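The C analog of that workaround shows up with flow-analysis warnings like GCC's -Wmaybe-uninitialized; a contrived sketch of a case the compiler may fail to prove:

  #include <stdio.h>
  #include <stdlib.h>

  int main(void)
  {
      int value;                  /* deliberately no initializer */
      int ready = 0;
      if (rand() & 1) {
          value = 42;
          ready = 1;
      }
      if (ready)                  /* value is set on every path that
                                     reaches this use... */
          printf("%d\n", value);  /* ...yet some compilers still warn */
      return 0;
  }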

Yes, if the code is right, I would rather go through the extra work to explain it than "fix" it.

-- glen

Reply to
glen herrmannsfeldt

In MSVC you'd do something like:

#pragma warning(disable : 4701)
...some code causing the warning
#pragma warning(default : 4701)

Or a tiny bit better:

#pragma warning(push)
#pragma warning(disable : 4701)
...some code causing the warning
#pragma warning(pop)

In GCC it's similar:

#pragma GCC diagnostic push
#pragma GCC diagnostic ignored "-Wuninitialized"
...some code causing the warning
#pragma GCC diagnostic pop
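One way to tame the per-compiler syntax is to hide it behind macros; a sketch (macro names invented here) using MSVC's __pragma and C99's _Pragma operator:

  #if defined(_MSC_VER)
  #define WARN_PUSH        __pragma(warning(push))
  #define WARN_POP         __pragma(warning(pop))
  #define WARN_OFF_UNINIT  __pragma(warning(disable : 4701))
  #elif defined(__GNUC__)
  #define WARN_PUSH        _Pragma("GCC diagnostic push")
  #define WARN_POP         _Pragma("GCC diagnostic pop")
  #define WARN_OFF_UNINIT  _Pragma("GCC diagnostic ignored \"-Wuninitialized\"")
  #else
  #define WARN_PUSH
  #define WARN_POP
  #define WARN_OFF_UNINIT  /* unknown compiler: suppress nothing */
  #endif

But the warning numbers and names still have to be curated per compiler.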

Reply to
Robert Wessel

That is one reason why I am also very picky when choosing new CPUs. If the available tools, compilers in particular, are merely adequate, I might not use the newest fancy CPU. It is just too much effort.

For example, I found a compiler from a CPU vendor which complained about a function defined with a uint8_t parameter being called with the constant 1. It saw the constant as an int and warned about an int being truncated to a uint8_t. After more of such nonsense, and with no other compiler available, the CPU was out.
--
Reinhardt
Reply to
Reinhardt Behm

Oh, you're in for *loads* of fun, then! :>

First thing you'll learn is how all those delightful, COMPILER-SPECIFIC enhancements now are *liabilities*!

Sort of like buying a new car and discovering the steering wheel is... IN THE BACK SEAT!

Reply to
Don Y

Yes, all 'C' compilers do...

Not all do that. Were there any options to turn that warning off?

I find that with ARM and PIC, no such problems.

--
Les Cargill
Reply to
Les Cargill

Not really. It only had several "warning levels" that could be selected. By lowering the level other useful warnings also went away.

--
Reinhardt
Reply to
Reinhardt Behm

Well, it's not as if the compiler writer is trying to enforce "coding guidelines" -- that mechanism should be part of a different HIGHLY CONFIGURABLE tool. Rather, it's (usually) trying to alert you to subtle behaviors of which many programmers may not be cognizant *or* particularly vigilant.

The question becomes, "why couldn't you have come up with an appropriate initializer?" Or, why is your program logic so "unpredictable" that the compiler can't sort this out with a static analysis? (i.e., if the compiler can't sort it out, are you sure *people* will be much better at the task?)

E.g., I *don't* like the UNIFORM practice of initializing variables at their declaration (this isn't even possible in all languages!). Instead, I prefer to initialize them closer to their first use.

It's tedious to have to scroll back to the top of a function to discover where the variable was declared & initialized. OTOH, if you initialize it "when it becomes of interest", then you are more likely to see that initialization in the "local" code.

Gratuitously adding extra nested blocks just so you can declare/define the variable more "locally" OUT OF HABIT clutters up the code. And, for folks not accustomed to this, it can confuse, especially if you reuse an identifier in this nested scope.

The real problem lies in the squishy nature of C inherently being at odds with the idea of "portability" (which requires CONSISTENCY across platforms, etc.)

Reply to
Don Y

+42

The same holds true when walking backwards through time: e.g., trying to support old hardware with old tools. You may not have a choice as to the capabilities of the compiler, etc.

But an explicit cast should have fixed that.

Would you complain if the compiler prodded you when you tried:

float x;
x = sqrt(6);

(two "warnings", there)

Reply to
Don Y

On 10.03.2015 at 23:24, Don Y wrote:

That's no solution. That's wishful thinking.

The basic problem is that C compilers are officially and explicitly allowed to "warn" about whatever they damn well please, and they make ample use of that leeway. Trying to keep a lid on that can of worms is an exercise in futility. For all you know, tomorrow's compiler update might add a "warning: no C code should be compiled on a Friday, 13th".

There's just no way to ensure that the same source will compile without warnings on more than one compiler at a time. The closest approach that might appear to work would be to turn all warnings off for every compiler you use ... but even setting aside that this would entirely defeat the purpose of the "no warnings!" approach, it may not even work: there's no requirement for compilers to offer an "all disabled" mode.

So you face a choice: either you completely give up on re-using source code verbatim from one project to the next (so: parallel maintenance of multiple versions of your "code library", forever!), or you give up on that strict "no warnings!" plan.

Well, actually an "accept no warnings" policy can be feasible, but not regarding compiler warnings. You'll have to pick a _different_ source analysis tool whose warnings you're going to actually care about, instead of the compiler's: one whose warnings can be controlled precisely and consistently for all target platforms. One sensible choice for such a tool, in my experience, is Gimpel's lint. Others apparently like QA-C for this.

This is not at all a question of capabilities (or lack thereof). It's a question of what do you do if, in the only available compiler for the platform someone else has decided you'll be using, there's simply no combination of compiler switches that still yields some truly necessary warnings (e.g. "function called without a prototype declaration"), but keeps quiet about perfectly sane constructs (e.g.: "global pointer initialized with address of a file-scope function/variable"). And of course the other compiler will have "warnings" of the exact opposite meaning ("object defined with excessively wide scope").

And that's before we begin looking at different compilers' predilections about how to flag a function argument as unused (void cast? self-assignment? none?), the "right" amount of parentheses, or actual differences caused by different native data types on different platforms.
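For reference, a sketch of the two unused-argument conventions mentioned (which one stays warning-free depends on the toolchain):

  static void handler(int sig)
  {
      (void)sig;       /* convention 1: cast to void */
  #if 0
      sig = sig;       /* convention 2: self-assignment -- some
                          compilers then warn about *that* instead */
  #endif
  }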

Reply to
Hans-Bernhard Bröker

(snip)

(snip)

(snip)

Another thing Java requires is casts for narrowing conversions.

Too late to add to C, but it doesn't seem a bad idea to me.
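For contrast, a minimal sketch of the implicit narrowing that plain C accepts (GCC, for instance, stays quiet unless -Wconversion is given):

  #include <stdint.h>

  int main(void)
  {
      int wide = 1000;
      uint8_t narrow = wide;   /* legal C: silently truncates to 232.
                                  Java would demand an explicit cast. */
      (void)narrow;
      return 0;
  }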

-- glen

Reply to
glen herrmannsfeldt

But they are. "There are rules" - Walter Sobchak.

Some of the more popular compilers just shaddup already about it unless the constraint violation is significant.

The compiler is, at its core, dumb. This is a good thing.

Oh, I think initialized variables are wonderful. But then again, I had to *implement* mutable initialized variables once - which is not nearly as cool as it sounds.

The toolchain simply located initialized variables to PROM, and uninitialized variables to RAM. So we made our own segments, then copied from the initialized (PROM) segment to a peer of BSS in the startup code. There was more to it than just that, but that's the gist.
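For the curious, a sketch of that startup copy (the __data_* symbol names are assumptions; real ones come from the linker script):

  extern const unsigned char __data_load[];  /* initializer image in PROM */
  extern unsigned char __data_start[];       /* run-time address in RAM   */
  extern unsigned char __data_end[];

  /* Called before main(), alongside the usual BSS zeroing: */
  static void copy_initialized_data(void)
  {
      const unsigned char *src = __data_load;
      unsigned char *dst = __data_start;
      while (dst < __data_end)
          *dst++ = *src++;
  }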

It's funny - I see this as an extension of the RAII principle, and then people go off on the exact history behind RAII in C++. Yes, but ...

Modern 'C' means never having to do this. Declare a block for all the temp variables surrounding a thing:

double tane = 0.0;
double angle = 42.0;
const double xepsi = ( 0.00000...1 );  // yadda yadda

// at the end of this, tane is completed.
{
    const double sinX = sin(angle);
    const double cosX = cos(angle);
    tane = (fabs(cosX) > xepsi) ? sinX / cosX : TOOBIG;  // or whatever
    ...
}

*NOT* doing this leads to all sort of pernicious headaches.

I vehemently, fundamentally and absolutely disagree. It organizes the code. The temps all go away at the bottom of the block. You will not get a sore wrist from a little scrolling. But having all references to an identifier be in the same general region *works*.

For stuff that's globalish state there is 'find . -name "*.[ch]" | xargs grep -n ...'

You need that anyway. No, your favorite IDE doesn't do that right either. :)

Code tells a story. Sequentially organized little paragraphs are as natural as reading. Having to find the subroutine is harder than this.

Of course, subroutines are the next logical step - if it's more than a few lines, the declarations are not the major point of it and/or it needs to be reused.

It should take seconds to get used to.

Nah. The travails of this are Vastly overrated. People get too excited and overdo it. Again - the declarations are the key.

--
Les Cargill
Reply to
Les Cargill

I assume a Lebowski reference will work here: "There are rules" - Walter Sobchak.

In fact, there is The 'C' Standard and if you use 'C' you should at least become partially familiar with the music of it.

It's not exactly Human but it'll help.

Nope.

Or just do the things that are necessary to make the warnings go away.

This will involve understanding the promotion and constraint rules. If you don't keep those in mind, you *will* make *ugly* mistakes that will fail at the worst possible times.
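As a concrete instance, a sketch of how the promotion rules produce the m_u16 warning discussed earlier in the thread:

  #include <stdint.h>

  void bump(uint16_t *p)
  {
      /* Usual arithmetic conversions: *p is promoted to int (assuming
         int is wider than 16 bits), the addition happens in int, and
         the assignment narrows int back to uint16_t -- which is what
         -Wconversion-style checkers flag. The cast states the intent: */
      *p = (uint16_t)(*p + 1);
  }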

Nobody said it was easy.

You really should have prototype declarations on general principle.

That's because it cannot know any better. A "lint" tool that does somewhat better probing might.

It depends. This goes back to the mores of project management and is extremely local.

--
Les Cargill
Reply to
Les Cargill

That reminds me of compilers that were at least error free. For example, I used C/80 for the Z80 intensively from '83 until about 2000 and did not find a single error. And it did not warn much. ;-) Ok, the compiler was just 40k, so there was just no space for errors. ;-)

In contrast, a compiler for another CPU that I used around 2000 was just a collection of bugs. We had to replace about 500 OTP chips because of a bug inserted by the compiler. It was from a company whose owner had several times insisted in this group that open source is not reliable but that his tools are thoroughly tested and verified.

func((uint8_t)1); looks nasty.

True, but in that case the compiler could have deduced that there is no loss from truncation.

--
Reinhardt
Reply to
Reinhardt Behm

All required context is in the line of code and the warning: m_u16 is a uint16_t structure member, and int has more than 16 bits. The only "fix" for this warning is

  someptr->somearray[someindex].m_u16 =
      (uint16_t)(someptr->somearray[someindex].m_u16 + 1U);

and I doubt this makes the code more readable.

DO-178 might be a better argument against schedule cuts and bad requirements than SPICE, but somehow I doubt it...

Stefan

Reply to
Stefan Reuther
