In article , "Wilco Dijkstra" writes:
|>
|> > |> It's certainly true the C standard is one of the worst specified. However most
|> > |> compiler writers agree about the major omissions and platforms have ABIs that
|> > |> specify everything else needed for binary compatibility (that includes features
|> > |> like volatile, bitfield details etc). So things are not as bad in reality.
|> >
|> > Er, no.  I have a LOT of experience with serious code porting, and
|> > am used as an expert of last resort.  Most niches have their own
|> > interpretations of C, but none of them use the same ones, and only
|> > programmers with a very wide experience can write portable code.
|>
|> Can you give examples of such different interpretations? There are a
|> few areas that people disagree about, but it often doesn't matter much.
It does as soon as you switch on serious optimisation, or use a CPU with unusual characteristics; both are common in HPC and rare outside it. Note that compilers like gcc do not have any options that count as serious optimisation.
I could send you my Objects diatribe, unless you already have it, which describes one aspect. You can also add anything involving sequence points (including functions in the library that may be implemented as macros), anything involving alignment, when a library function must return an error (if ever) and when it is allowed to flag no error and go bananas. And more.
|> Interestingly most code is widely portable despite most programmers
|> having little understanding about portability and violating the C standard in
|> almost every respect.
That is completely wrong, as you will discover if you ever need to port to a system that isn't just a variant of one you are familiar with. Perhaps 1% of even the better 'public domain' sources will compile and run on such systems - I got a lot of messages from people flabbergasted that my C did.
|> Actually you don't need any "autoconfiguring" in C. Much of that was
|> needed due to badly broken non-conformant Unix compilers. I do see
|> such terrible mess every now and again, with people declaring builtin
|> functions incorrectly as otherwise "it wouldn't compile on compiler X"...
Many of those are actually defects in the standard, if you look more closely.
|> Properly sized types like int32_t have finally been standardized, so the
|> only configuration you need is the selection between the various extensions
|> that have not yet been standardized (although things like __declspec are
|> widely accepted nowadays).
"Properly sized types like int32_t", forsooth! Those abominations are precisely the wrong way to achieve portability over a wide range of systems or over the long term. I shall be dead and buried when the 64->128 change hits, but people will discover their error then, oh, yes, they will!
int32_t should be used ONLY for external interfaces, and it doesn't help with them because it doesn't specify the endianness or overflow handling. And not all interfaces are the same. All internal types should be selected as to their function - e.g. array indices, file pointers, hash code values or whatever - so that they will match the system's properties. As in Fortran, K&R C etc.
|> > A simple question: have you ever ported a significant amount of
|> > code (say, > 250,000 lines in > 10 independent programs written
|> > by people you have no contact with) to a system with a conforming
|> > C system, based on different concepts to anything the authors
|> > were familiar with?  I have.
|>
|> I've done a lot of porting and know most of the problems. It's not nearly
|> as bad as you claim. Many "porting" issues are actually caused by bugs
|> and limitations in the underlying OS. I suggest that your experience is
|> partly colored by the fact that people ask you as a last resort.
Partly, yes. But I am pretty certain that my experience is a lot wider than yours. I really do mean different CONCEPTS - start with IBM MVS and move on to a Hitachi SR2201, just during the C era.
Note that I was involved in both the C89 and C99 standardisation process; and the BSI didn't vote "no" for no good reason.
Regards, Nick Maclaren.