Using Makefile rather than GreenHills (GHS) Multi .GPJ files

Anyone using Make rather than gbuild/Multi? If so, do you have any advice concerning doing this? Also, are there any translation tools for migrating my .GPJ files to Makefiles?

Thanks!

Bill McCloskey

Reply to
zzmcclos

From second-hand experience I can say Green Hills' compilers don't need any special treatment by Makefiles. I.e. if you know your way around Make, there's nothing special to learn about using it with GHS. GHS even exports dependency information in a suitable format.
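
For illustration, a minimal sketch of how that typically plugs into make. The driver name (ccarm) and the GCC-style -MD switch are assumptions on my part, so check the toolchain manual for the exact spelling. (And remember that recipe lines must start with a tab.)

    # Assumed: the GHS driver is called "ccarm" and a -MD switch
    # writes main.d beside main.o -- check your manual.
    OBJS := main.o uart.o
    %.o: %.c
            ccarm $(CFLAGS) -MD -c $< -o $@
    # Pull in whatever dependency files exist so far.
    -include $(OBJS:.o=.d)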

The trickiest aspect may be the selection of a Make tool that works well in an MS Windows environment, where people habitually don't care about letter case in filenames. Cygwin's port of GNU make is the usual choice, but for this application it may not be the optimal one.

Reply to
Hans-Bernhard Bröker

What was really meant by this is that using Opus Make, there is no problem in an MS Windows environment. However, there is a considerable backlog of .gpj files (formerly .bld files) in use. These are not proving to be as extensible as make, where lint and other tools can be inserted into the build process. Is there a way to convert .gpj files for use with make, or otherwise have .gpj files support insertion of other tools in the processing of source files?

Reply to
zzmcclos

The "msys" *nix tools, such as make, are often a better choice than cygwin unless you really need a full posix environment. MSys/Mingw binaries are smaller, faster, more "native" (they handle things like incorrect case better), and don't have the dreaded "cygwin1.dll" problems. On the other hand, cygwin supports things like fork(), symbolic links, and a much more complete set of tools and libraries.

Note also that Green Hills is, I believe, a bit fussy about letter case - they changed the ordering of files between versions (i.e., does "a" come before or after "Z"?), which affected link ordering for a user I know.

Finally, a good idea with makefiles is to write generalised makefiles that compile *all* the ".c" files in your working directory, rather than trying to specify them individually. With a little effort you can write a makefile that will work for many different projects and does not need updating when you add new source files or change dependent headers.
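
A minimal generic makefile along those lines might look like this (GNU make syntax; the compiler and image names here are placeholders):

    CC   := ccarm                 # or whatever your driver is called
    SRCS := $(wildcard *.c)
    OBJS := $(SRCS:.c=.o)

    prog.elf: $(OBJS)
            $(CC) -o $@ $(OBJS)

    %.o: %.c
            $(CC) $(CFLAGS) -c $< -o $@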

Reply to
David Brown

No problem. The command line compiler looks and feels like any other *nix command line compiler.

Such a tool is shipped with the toolchain (I just don't recall its name). Look around your ghs directory or help file. However, last time I looked, the Makefiles it generated were far from maintainable. The .gpj option inheritance scheme just cannot be modelled easily in a Makefile. That's why we're using Make-based compilation only for small utility projects.

Stefan

Reply to
Stefan Reuther

How so? Did they actually have their link order controlled directly by a wildcarded file listing, or what?

I beg to differ. A random C file appearing in the source tree (e.g. a temporary copy of one of the source files) shouldn't become part of the build just like that. It's just way too easy to pick up dead code that way.

It would also make it unduly hard to create variant projects sharing most, but not all, of their source files.

Changing dependent headers shouldn't require changes to the makefile, yes. That's why Make can only really be used well with compilers that export their dependency information in a format usable by 'make', such as GCC or Green Hills' compilers.

But changing the set of source files to be used _should_ require a change of the makefile. The makefile should be considered the document defining which modules make up the program.
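
In that style, the heart of the makefile is simply the hand-maintained manifest, e.g. (module names invented):

    OBJS := main.o uart.o motor.o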

Reply to
Hans-Bernhard Bröker

I don't know the details, but I guess something like that. In general, you've done something weird if you are dependent on the order files are passed to the linker - it is better to make use of specific sections or specific linker control files as needed. But it can happen accidentally - things like the order of construction of global C++ objects can be affected by the link order. It's just another of the possible issues you can have when changing tool versions in the middle of a project - and another reason why such changes should not be taken lightly.

Here we have a difference of opinion - I believe that if a file is not part of the project's source, it should not be in the source tree. It's not exactly hard to make an extra directory for other files, such as experimental versions, old versions, or other temporary copies. Nor is it hard to mark a copy as "old" by appending a ".old" to the file name.

No, that's not hard - you just have to do it in a different way. Copy only the files that are of interest, or disable the ones you don't want in your project (such as by adding an extra filename extension). If you are often doing this with the same set of files, and want to disable different files for different projects, use "#if .. #endif" wrappers and define some "enable" preprocessor symbols. Relevant files are in the correct place, the symbol definitions are collected together in a single "configuration" header file, and it's easy to see which parts of the project are included in the compilation, and easy to change those choices.
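
The build half of that scheme could be as simple as this sketch (the variant and symbol names are invented); the sources then wrap optional modules in "#if ENABLE_DISPLAY ... #endif":

    ifeq ($(VARIANT),deluxe)
    CPPFLAGS += -DENABLE_DISPLAY=1
    else
    CPPFLAGS += -DENABLE_DISPLAY=0
    endif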

I've used gcc's preprocessor to generate dependency files for make while using a different compiler for the actual target compilation.
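
Something along these lines, where gcc's -MM and -MT options (which are standard) generate the .d files and a placeholder $(TARGET_CC) does the real compile:

    SRCS := $(wildcard *.c)
    %.d: %.c
            gcc -MM -MT '$*.o' $(CPPFLAGS) $< > $@
    %.o: %.c
            $(TARGET_CC) $(CFLAGS) -c $< -o $@
    -include $(SRCS:.c=.d)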

For many projects, however, I find it is good enough to set the makefile to say all ".c" files are to be compiled, and each compilation is dependent on the relevant ".c" file and *all* the ".h" files in the directory. Obviously that leads to a few unnecessary extra compilations, but it works well enough unless the project is particularly large.
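
The blunt version of that dependency rule is a one-liner:

    SRCS := $(wildcard *.c)
    # Every object depends on every header in the directory.
    $(SRCS:.c=.o): $(wildcard *.h)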

If you consider the makefile in this way, then fair enough. I consider the set of source files (for most projects) to be fully specified as precisely the ".c" files in the project source tree.

Reply to
David Brown

Not necessarily. There's a situation to be accounted for where you have a working build, but now you want to make a major modification (change tool versions, re-organize variant handling, ...) which must not have any effect on the final product just now. So you have to ensure identical program images before and after the change. A spontaneously changing link order is an immediate disaster in that situation.

This would tend to be an argument against compiling *.c in a Makefile setup --- wildcards can't be guaranteed to expand to constant link lists.
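
At least in GNU make the expansion can be pinned down by sorting it, which makes the list deterministic even though it is not hand-written:

    OBJS := $(sort $(patsubst %.c,%.o,$(wildcard *.c)))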

I consider that a strong argument against C++ as a language for embedded processing.

More to the point, linkage order is not something to be left to automatic handling by the IDE, Make or whatever. It _has_ to be manifestly present in the project definition, and editable by the user.

[ on Makefiles built to compile all *.c files in their directories...]

Looks like I didn't manage to explain what I had in mind. I wasn't talking about files to be disabled permanently in variant copies of the source tree, but rather about multiple project variants kept in a single source tree. I.e. without copying the entire tree, such that changes to the common parts of the code automatically apply to all variants.

Good point. As long as the main compiler doesn't rely on preprocessor features that even GCC, in its sometimes arcane featurism, doesn't have, that is.

Reply to
Hans-Bernhard Bröker

Again - if the correctness of your program depends on the order files are passed to the linker, you've done something wrong. Linker files are where you put any order-specific requirements - it's the logical place to put them, and it leaves your build independent of the ordering of the files.

It's certainly an argument against using C++ (for any program, embedded or otherwise) without understanding all the issues involved. Getting your global objects initialised correctly (when the order is important) is not hard, but it is easily forgotten - resulting in a program whose correctness is dependent on the link order.

Link order is normally irrelevant, and thus should not have to be specified. If it is not irrelevant to your project, and you can't fix the cause of your dependency on the order, then fix your linker file.

I think in that case, I'd go for #if..#endif on the files, or split the files into different directories (a common directory and variant-specific directories, for example). Another alternative is to build the common files as a library.
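
A sketch of the split-directory layout (the directory names are invented):

    VARIANT ?= a
    SRCS := $(wildcard common/*.c) $(wildcard variants/$(VARIANT)/*.c)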

Of course, there are going to be occasions when specifying files and/or directories explicitly in the makefile is the best policy. If you need to specify different compiler flags for some of the files, for example, the makefile is probably the best place to put this information.
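
GNU make's target-specific variables do exactly this; the file name here is just an example:

    fastmath.o: CFLAGS += -O3     # extra optimisation for this file only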

It's also possible to use gcc's warnings (which are not as powerful as a full lint, but better than most other compilers') by compiling your embedded code with gcc (again, it doesn't matter what gcc target). That can take a little more effort to handle target-specific features (and don't try linking!), but it can be a useful aid that is free after the initial setup.
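
For example, a warnings-only pass with -fsyntax-only (a standard gcc option) stops after the front end, so nothing is assembled or linked and the gcc target is irrelevant:

    SRCS ?= $(wildcard *.c)
    .PHONY: check
    check: $(SRCS)
            gcc -Wall -Wextra -fsyntax-only $(CPPFLAGS) $(SRCS)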

Reply to
David Brown

I firmly agree with Hans Bernhard that a Makefile should document what sources belong to a project.

If you are an old-school make user like me, you would say that even if a file is part of the project's source it shouldn't normally be in the source tree. The source tree should be empty most of the time, until make has fetched the sources from Your Favorite Source Control System. Every evening before you go home (or to bed), type "rcsclean" first.

So this is how it goes for "project":

    mkdir project
    cd project
    ln -s $RCS/myproject RCS    # ymmv
    make                        # gets Makefile using default rules, then makes default target
    ....                        # lots of fun
                                # myproject has been built
    make ultraclean
    ..
    ls

RCS is still YFSCS for me. Like in: "ALGOL 60 was a tremendous improvement over all its successors"

Groetjes Albert

--
Albert van der Horst, UTRECHT,THE NETHERLANDS
Economic growth -- like all pyramid schemes -- ultimately falters.
Reply to
Albert van der Horst

What part of the above scenario was so hard to understand? I'm not talking about correctness, but about bit-for-bit equality of the generated file, which is still the fastest way on earth to guarantee that there has been no side effect by a supposedly benign modification.

You seem to be assuming that all linkers are controlled by linker files. Well, some aren't --- they are run entirely via the command line, which has its natural place of definition in the Makefile. And even for those that need a linker script, it's generally easier and more straightforward to have make generate the module list than to maintain it in a separate file.
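
For a command-file-driven linker, the sketch is to have make write the list itself, so it can never go stale (the command-file name here is invented):

    OBJS := main.o uart.o         # however the list is produced
    objects.lnk: $(OBJS)
            printf '%s\n' $(OBJS) > $@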

Reply to
Hans-Bernhard Bröker

Ah, I didn't realise you were looking for bit-for-bit equality (which I agree is the most complete test for avoiding side effects). I have simply never been in the situation where I'm at such an advanced stage in development/testing/deployment that I'd need bit-for-bit equality, and at the same time contemplating a major change of tool versions. Changing the variant handling is more plausible, however.

In this case I agree that you want to be sure that the link order is kept the same. Of course, a wildcard-based system will probably keep it the same in practice. There may be no formal guarantee of ordering, but you are guaranteed that the same set of files is linked, and if the resulting binaries are bit-for-bit identical, you have your confirmation just as before.

If your linker is not controlled by a linker file (or if you prefer to add a few command line parameters rather than write a separate file), then obviously this information will be in the makefile. The makefile is already an important part of the source code for other things, such as compiler flags - I'm not suggesting that you don't treat it as an important part of the program. But I believe there is no point in manually writing out a list of modules for compilation or linking except when you have specific reason.

Reply to
David Brown
