The datasheets I have read for PIC MCUs say the internal oscillator accuracy is 2% over the most common voltage and temperature ranges (and 1% in the sweet spot). I assume the AVR MCUs have similar characteristics.
Does this mean that, for say 10 clock periods at 4 MHz, the total time would vary between 2450 and 2550 ns (2500 ns ± 2%)?
Is the stability higher over a shorter timescale, where temperature and voltage do not have time to change as much? For example, between two points in time spaced 25 µs apart, would the duration of 10 clock cycles vary by less than the specified 2%?
For an RS-232 byte transmission, should the datasheet precision mean a 2% timing error per bit, accumulating to about 21% of a bit period by the end of the frame? (Maybe this is still acceptable for a PC serial port?)
Maybe it's possible to work around this by creating a bit-bang protocol that resynchronizes in software at every bit transition, so that any single bit period only deviates by 1-2%?
1 ns = 10^-9 s; 1 µs = 10^-6 s