On 03/12/2018 12:57, David Brown wrote:
> On 03/12/18 12:13, pozz wrote:
>> On 03/12/2018 11:06, David Brown wrote:
>>> On 03/12/18 09:18, pozz wrote:
>>>> What do you really use for embedded projects? Do you use "standard"
>>>> makefiles or do you rely on IDE functionality?
>>>>
>>>> Nowadays every MCU manufacturer provides an IDE, mostly for free,
>>>> usually based on Eclipse (Atmel Studio and Microchip are probably
>>>> the most important exceptions). Anyway, most of them use arm-gcc as
>>>> the compiler.
>>>>
>>>> I usually try to compile the same project for the embedded target
>>>> and for the development machine, so I can speed up development and
>>>> debugging. I usually use the native IDE from the manufacturer of the
>>>> target and Code::Blocks (with MinGW) for compilation on the
>>>> development machine. So I have two IDEs for a single project.
>>>>
>>>> I'm thinking of finally moving to a Makefile, but I don't know if
>>>> that is a good and modern choice. Do you use better alternatives?
>>>
>>> I sometimes use the IDE project management to start with, or on very
>>> small projects. But for anything serious, I always use makefiles. I
>>> see it as important to separate the production build process from
>>> development - I need to know that I can always pull up the source
>>> code for a project, do a "build", and get a bit-perfect binary image
>>> that is exactly the same as last time. This must work on different
>>> machines, preferably different OSes, and it must work over time. (My
>>> record is rebuilding a project that was a touch over 20 years old,
>>> and getting the same binary.)
>>>
>>> This means that the makefile specifies exactly which build toolchain
>>> (compiler, linker, library, etc.) is used - and that does not change
>>> during a project's lifetime without very good reason.
>>>
>>> The IDE, and debugger, however, may change - there I will often use
>>> newer versions with more features than the original version.
>>> And sometimes I might use a lighter editor for a small change,
>>> rather than the full IDE. So the IDE version and the build tools
>>> version are independent.
>>>
>>> With well-designed makefiles, you can have different targets for
>>> different purposes: "make bin" for making the embedded binary, "make
>>> pc" for making the PC version, "make tests" for running the test
>>> code on the PC, and so on.
>>
>> Fortunately, modern IDEs separate the toolchain well from the IDE
>> itself. Most manufacturers let us install the toolchain as a separate
>> setup. I remember some years ago the scenario was different and the
>> compiler was "included" in the IDE installation.
>
> You can do that to some extent, yes - you can choose which toolchain
> to use. But your build process is still tied to the IDE - your choice
> of directories, compiler flags, and so on is all handled by the IDE.
> So you still need the IDE to control the build, and different versions
> of the IDE, or different IDEs, do not necessarily handle everything in
> the same way.
>
>> However the problem here isn't the compiler (toolchain), which
>> nowadays is usually arm-gcc. The big issue is with the libraries and
>> includes that the manufacturer gives you to save some time in writing
>> drivers for the peripherals. I have to install the full IDE and copy
>> the interesting headers and libraries into my folders.
>
> That's fine. Copy the headers, libraries, SDK files, whatever, into
> your project folder. Then push everything to your version control
> system. Make the source code independent of the SDK, the IDE, and
> other files - you have your toolchain (and you archive the zip/tarball
> of the gnu-arm-embedded release) and your project folder, and that is
> all you need for the build.
>
>> Another small issue is the linker script file, which works like a
>> charm in the IDE when you start a new project from the wizard.
>> At least for me, it's very difficult to write a linker script from
>> scratch.
>> You need a deeper understanding of the C libraries (newlib, redlib,
>> ...) to write a correct linker script.
>> My solution is to start with the IDE wizard and copy the generated
>> linker script into my make-based project.
>
> Again, that's fine. IDEs and their wizards are great for getting
> started. They are just not great for long-term stability of the tools.
>
>>>> My major reason to move from IDE compilation to a Makefile is
>>>> testing. I would like to start adding unit tests to my project. I
>>>> understood that a good solution is to link all the object files of
>>>> the production code into a static library. In this way it is very
>>>> simple to replace production code with testing (mocking) code,
>>>> simply by prepending the testing object files to the static library
>>>> of production code during linking.
>>>
>>> I would not bother with that. I would have different variations of
>>> the build handled in different build tree directories.
>>
>> Could you explain?
>
> You have a tree something like this:
>
> Source tree:
>
>     project / src / main
>                     drivers
>
> Build trees:
>
>     project / build / target
>                       debug
>                       pctest
>
> Each build tree might have subtrees:
>
>     project / build / target / obj  / main
>                                       drivers
>     project / build / target / deps / main
>                                       drivers
>     project / build / target / lst  / main
>                                       drivers
>
> And so on.
>
> Your build trees are independent. So there is no mixing of object
> files built in the "target" directory for your final target board, the
> "debug" directory for the version with debugging code enabled, the
> "pctest" version for the code running on the PC, or whatever other
> builds you have for your project.

Ok, I got your point, and I usually arrange everything similar to your
description (even if I put the .o, .d and .lst files in the same
target-dependent directory). I also have to admit that all major IDEs
nowadays arrange their output files in this manner.
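For the record, the separated build trees you describe can be sketched in a makefile along these lines. This is only a rough sketch; the variant names, source list, compilers and flags are all assumptions for illustration, not anyone's real build:

```make
# Sketch: one object tree per build variant, so "target" and "pctest"
# objects can never get mixed. Paths and flags are only examples.
SRCS        := main.c modh.c modl.c

TARGET_CC   := arm-none-eabi-gcc
TARGET_OBJS := $(addprefix build/target/obj/,$(SRCS:.c=.o))
PCTEST_CC   := gcc
PCTEST_OBJS := $(addprefix build/pctest/obj/,$(SRCS:.c=.o))

build/target/obj/%.o: src/main/%.c
	@mkdir -p $(dir $@)
	$(TARGET_CC) -Os -c $< -o $@

build/pctest/obj/%.o: src/main/%.c
	@mkdir -p $(dir $@)
	$(PCTEST_CC) -O0 -g -DUNIT_TEST -c $< -o $@

bin: $(TARGET_OBJS)
	$(TARGET_CC) $^ -o build/target/project.elf

pc: $(PCTEST_OBJS)
	$(PCTEST_CC) $^ -o build/pctest/project
```

Each variant gets its own compiler, flags and object directory, which gives exactly the "make bin" / "make pc" split mentioned earlier in the thread.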
Anyway, testing is difficult, at least for me.
Suppose you have a simple project with three source files: main.c,
modh.c and modl.c (and of course modh.h and modl.h).
Now you want to write a unit test for the modh module, which depends on
modl. During the test, modl should be replaced with a dummy module, a
mock object. What is your approach?
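To make the question concrete, here is one way the mock itself could look. This is a sketch with hypothetical names and a made-up interface; in the real project the modl_value() prototype would come from modl.h, but it is inlined here to keep the example self-contained:

```c
/* tests/modl.c -- a hand-written mock for the modl module (sketch).
 * In the real project this prototype comes from modl.h; it is inlined
 * here only to keep the example self-contained. */
int modl_value(void);

static int mock_result;      /* value the mock will return             */
static int mock_call_count;  /* how many times modl_value() was called */

/* Test-side helpers (hypothetical names) to program and inspect the mock. */
void mock_modl_reset(int result)
{
    mock_result = result;
    mock_call_count = 0;
}

int mock_modl_calls(void)
{
    return mock_call_count;
}

/* Same signature as the production implementation in modl.c. */
int modl_value(void)
{
    mock_call_count++;
    return mock_result;
}
```

A test for modh would then call mock_modl_reset() before exercising the code under test, and check mock_modl_calls() (and the return values seen by modh) afterwards.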
In project/tests I create a test_modh.c source file that should be
linked against modh.o (the original production code) and
project/tests/modl.o, the mock object for modl.
One approach could be to re-compile modh.c during the test build.
However, it's difficult to replace the main modl.h with the modl.h of
the mock in the tests directory. modh.c will have a simple

#include "modl.h"

directive, and this will resolve to the modl.h in the *same* directory.
I haven't found a way to instruct the compiler to use the modl.h from
the tests directory.
Moreover, it could be useful to test the very same object code that is
generated for production. I found a good approach: all the production
code is compiled into a static library, libproduct.a, and the tests are
linked against that library. The following command, run in the
project/tests/ folder

gcc test_modh.o modl.o libproduct.a -o test_modh.exe

should generate a test_modh.exe with the mock object for modl and the
*same* modh object code as production.