Write it in the same language that you are compiling -- that way you
*know* you have THAT tool available wherever you happen to maintain the codebase (instead of having to have *two* tools).
E.g., if I'm embedding a large struct into a Limbo module, I write the "converter" in Limbo as well. Doesn't do me much good to write it in C if I can't be assured of having a C compiler alongside the Limbo compiler!
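For anyone who hasn't written one, such a "converter" is usually only a page of code. A rough sketch in C (file and symbol names invented here) that turns a binary blob into an array you can compile straight into the image -- the Limbo version is the same idea:

    /* Minimal converter sketch: binary file in, C source out.
     * "blob.bin" and the symbol name "blob" are illustrative only. */
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        FILE *in = fopen(argc > 1 ? argv[1] : "blob.bin", "rb");
        int c, n = 0;

        if (!in) { perror("fopen"); return 1; }

        printf("const unsigned char blob[] = {\n");
        while ((c = fgetc(in)) != EOF)
            printf("0x%02x,%s", c, (++n % 12) ? " " : "\n");
        printf("\n};\nconst unsigned long blob_len = %dUL;\n", n);

        fclose(in);
        return 0;
    }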
It also eases the requirements on the developer (he may know C but *not* Python, Scheme, etc. -- or, not as well!).
I typically don't include the tools in the version control system - gcc is not too bad, but trying to get something like CodeWarrior into subversion would be a serious pain. But we archive downloaded tools, install them in carefully named directories, and refer to those directories in makefiles. And I avoid updating tools - if I need a new version because of a serious bug fix, or simply to get the latest and greatest at the start of a new project, I install the new version in a new directory. We also make a point of avoiding tools that get locked to particular computers or have other such restrictions (floating licenses are a much better choice), and "archive" old development PC's.
If you have to compile the same code on multiple compilers, then you are restricted to a common subset (or perhaps messy macros and conditional compilation to deal with differences).
The same applies to multiple developers or reviewers, I suppose - you are limited to using a common subset that they are all happy with. That applies to standard features as well as compiler enhancements - just because code follows the "legal" C standard doesn't mean it will be comprehensible to others!
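To make the "messy macros" concrete, the usual shape is something like the sketch below (only the gcc branch is filled in; the macro name is invented for the example, and other compilers would get their own branch with whatever keyword their manual specifies):

    /* Compiler-difference shim -- the common-subset code uses the
     * macro and never mentions any one compiler's keyword directly. */
    #if defined(__GNUC__)
    #  define NORETURN __attribute__((noreturn))
    #else
    #  define NORETURN /* nothing: fall back to plain, portable C */
    #endif

    NORETURN void fatal_error(const char *msg);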
I firmly agree with stressing clarity over convenience. Some gcc enhancements, such as the array initialisation, case ranges, typeof, etc., can definitely improve clarity. Others such as function attributes can improve code quality while still being perfectly clear in function. But some, such as nested functions or "conditionals with omitted operands" are going to be far too confusing even in the few cases where they might be of interest.
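For anyone who hasn't met them, roughly what those look like in use (gcc-specific, so exactly the sort of thing you weigh against portability; identifiers invented here):

    /* Compiles with gcc, but is *not* portable C. */
    int lut[16] = { [0 ... 7] = -1, [8] = 42 };     /* designated/range initialisers */

    int classify(int c)
    {
        switch (c) {
        case '0' ... '9': return 1;                 /* case ranges */
        case 'a' ... 'z':
        case 'A' ... 'Z': return 2;
        default:          return 0;
        }
    }

    /* typeof lets a macro declare a temporary of the right type. */
    #define SWAP(a, b) do { typeof(a) tmp_ = (a); (a) = (b); (b) = tmp_; } while (0)

    /* A function attribute: gcc checks the arguments like printf's. */
    int log_msg(const char *fmt, ...) __attribute__((format(printf, 1, 2)));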
I still have my copy of Harbison & Steele from when I started my career, with the page marked so that I could easily look up the order of operator precedence and save on using all those confusing parentheses in my long mathematical or logical expressions.
I was cured of _that_ habit when I started working in an environment where my code got reviewed. Quickly and emphatically, I might add.
I have intentionally never learned the full precedence rules for C operators, to avoid that temptation!
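The classic trap, for anyone who wants a concrete example (identifiers and values invented here) -- == binds tighter than &, so the "obvious" reading is wrong:

    #include <stdio.h>

    #define MASK 0x0f
    #define FLAG 0x03

    int main(void)
    {
        int x = 0x23;

        /* Parsed as x & (MASK == FLAG), i.e. x & 0 here,
         * so this test silently never fires. */
        if (x & MASK == FLAG)
            printf("unparenthesised test fired\n");

        /* What was almost certainly meant; the parentheses cost nothing. */
        if ((x & MASK) == FLAG)
            printf("parenthesised test fired\n");

        return 0;
    }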
I found it quite interesting to look through the old code, both in what has changed and what has stayed the same. Some of it was obvious - this was pre-C99 code, so no // comments or mixed declarations and statements, and my own fixed-size integer types. There was also very little use of file-level "static", which I now use everywhere.
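For anyone who never had to roll their own, "fixed-size integer types" meant roughly this (the typedef names are the usual home-grown style, purely illustrative), versus what <stdint.h> hands you today:

    /* Pre-C99: per-target typedefs, right for that compiler only. */
    typedef unsigned char  u8;
    typedef unsigned short u16;
    typedef unsigned long  u32;

    /* C99 and later: the vendor's <stdint.h> gets it right for you. */
    #include <stdint.h>

    static uint32_t counter;    /* file-level static, as mentioned above */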
If you are using Linux then you will always have a native C compiler handy. Of course, you will also have Python, which is a much nicer language for this sort of scripting (it's quick to learn enough Python to write such scripts).
We currently have a setup to do nightly builds of all our code. We've seriously considered, but haven't yet pulled the trigger on, also setting up a build on a virtual machine. That build wouldn't happen as often, but the virtual machine snapshot would theoretically capture "everything", and the snapshot could then be checked into revision control.
Sounds like overkill, but in some industries, being able to faithfully rebuild something 5, 10, 15+ years down the line could be useful...
Regardless of host, the folks *maintaining* the code will be KNOWN to be knowledgeable in *that* (language). Not necessarily the case for C++, sh, perl, python, etc.
There are often little differences in languages to which users are completely oblivious -- that can make significant differences in their comprehension of an algorithm expressed in a language that they may only *casually* know.
[E.g., ARBNO() is a lazy matcher in SNOBOL. Folks coming from a C background (with the typically greedy matches in its regex library) will completely misunderstand the mechanics of ARBNO and incorrectly emulate its function.]
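[Not SNOBOL, but here is the greedy behaviour C programmers carry around as their mental model (pattern and input invented here) -- a lazy matcher in the ARBNO style would try the shortest extension first and stop at "aXb" instead:]

    /* POSIX regex: "a.*b" on "aXbYb" matches the whole string "aXbYb"
     * (leftmost-longest).  Pattern and input are purely illustrative. */
    #include <regex.h>
    #include <stdio.h>

    int main(void)
    {
        regex_t    re;
        regmatch_t m;
        const char *s = "aXbYb";

        if (regcomp(&re, "a.*b", REG_EXTENDED) != 0)
            return 1;

        if (regexec(&re, s, 1, &m, 0) == 0)
            printf("matched \"%.*s\"\n", (int)(m.rm_eo - m.rm_so), s + m.rm_so);

        regfree(&re);
        return 0;
    }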
Early in my career, I was "too clever, by half" and relied on my wider experience/tool base in crafting solutions to problems. Mixing various tools, languages, environments to give me an "optimal" (in terms of development effort) solution. E.g., I was running SysV UNIX w/X at home in the early 80's -- while others were fighting with MS, funky memory models, "overlays", and *waiting* for (the illusion of) a "multitasking, GUI" environment, etc.
Almost all of those solutions eventually trapped me into ongoing support roles ("But we don't have UNIX, here!" "But Joey doesn't know perl!" "But I don't want to have to purchase..."). And, so, after-the-fact, I found myself back-porting designs to the very same "crippled" environments from which I had originally "cleverly" freed myself (lest I be stuck in an ongoing support role... "life is way too short to spend in support!")
I'm in a similar predicament, currently: do I rely on expensive tools that I own and "force" others wanting to maintain my designs to also purchase them? Or, do I discard my tools and my experience with them *just* to make it LESS EXPENSIVE for others?
It doesn't guarantee that you have it for the host. But, neither does it guarantee that you have python, perl, sh, etc. for the host! What it *does* guarantee is that *you* will know how to write C for that host (more or less)! It doesn't guarantee that you will be able to write a perl script *if* you happened to have perl available to you on *that* host (e.g., none of my windows hosts have perl installed).
[I'll ignore the pedantic case where you can often use the target C compiler to generate the required output -- albeit in a round-about manner -- as an object for the target... that you manually misappropriate back into your development environment!]
In this day of package managers, it's practically irrelevant. If it isn't installed by default, you can type three words - "yum install gcc" - and you're done.
How sure are you that your virtual machine snapshot, taken in 2014 on your current PC and hypervisor, will run on your brand-new PC in the year 2029?
As I understand them, what are called "virtual machines" on PCs only virtualize as little of the machine as is necessary to support multiple OS's on the same hardware, but are not full emulations of the PC processor and I/O. I have not seen any promises from hypervisor vendors to support 15-year-old VM snapshots on future PC architectures, which may be quite different.
This question is of interest to me because I am working on projects with maintenance foreseen until the 2040's. Some people have suggested virtual machines as the solution for keeping the development tools operational so long, but I am doubtful.
"Sure"? How sure are you that the host OS, VM vendor, tool vendor, silicon vendor, etc. will be *around* at that time?
This is worthy of an entirely new thread -- and, some significant professional investment!
Long term support is tedious, at best. Try finding someone who can write 6502 code, today (do-able as the military relies on this processor; repeat the exercise with 2650 and you'll get far sparser results!). What media do you plan on storing the sources (and tools)? Will you be able to *read* that media 25 years hence? (when was the last time you saw a 5" floppy? 8"??
8" hard sectored???) What efforts have you made to document all of these requirements/dependencies/formats? Will people even *know* what they need to have available in order to perform that maintenance?