I'm doing an embedded systems project which consists of taking input from the user via simple buttons and giving output by lighting LEDs. So far, I've written the program as fully portable C89, and I intend to keep it that way as much as possible. Obviously, there will be microcontroller-specific parts, such as:
void SetPinHigh(unsigned pin) { /* Must call microcontroller-specific library functions or something here */ }
but the rest of my program calls these "wrapper" functions, so the bulk of it stays fully portable.
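For context, the split looks roughly like this (the file names here are purely illustrative):

/* hal.h - the only interface the portable code sees */
void SetPinHigh(unsigned pin);
void SetPinLow(unsigned pin);

/* hal_mytarget.c - one of these per microcontroller */
#include "hal.h"
void SetPinHigh(unsigned pin)
{
    /* vendor-specific register writes go here */
}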
Anyway, I've come to a point where I need to introduce delays, and I again want this to be fully portable. The delays will be on the order of milliseconds (typically 250ms, e.g. for flashing LEDs).
I had considered using a macro which indicates the "Floating Point Operations per Second" for the given hardware setup, and then doing something like:
void Delay(unsigned const fraction_of_full_second)
{
    unsigned long amount_flop = FLOPS / fraction_of_full_second; /* FLOPS: per-target macro */
    float x = 56.3f;
    do x /= x; while (--amount_flop); /* burn roughly amount_flop floating-point divisions */
}
(I realise I'll also have to account for the overhead of the loop itself and so forth.)
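To make the intent concrete, a 250ms delay for flashing an LED would then be requested as a fraction of a full second:

Delay(4); /* 1/4 of a second = 250ms */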
Is this a bad idea? How should I go about introducing a delay? Must this part of my program be non-portable?
Martin