Just a generic question regarding interrupt latency. I am using a periodic timer that generates an interrupt on every clock tick. The question I have is: will the interrupt latency cause the timer interrupt service routine to run at a reduced frequency? What I mean is the following. Consider a simple ISR as shown below,
void isr(void) { t = timestamp(); }  /* t is a global; timestamp() reads a free-running counter */
Assume the above code runs twice. The question I have is: will the time elapsed between the two calls to timestamp() in this ISR include the interrupt latency, or will it correspond to the original timer period (i.e. time_elapsed = 1/timer_freq)? It would make sense that the ISR still runs at the original timer frequency, and that interrupt latency only delays when the ISR is entered, not the rate at which it fires. However, from my measurements that does not appear to be the case. Any help is appreciated.
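For context, this is roughly how I am collecting the measurement, as a minimal sketch. The names timestamp(), last_t, delta[] and MAX_SAMPLES are just placeholders for illustration, not any specific platform API:

#include <stdint.h>

#define MAX_SAMPLES 128

extern uint64_t timestamp(void);              /* placeholder: read a free-running hardware counter */

static volatile uint64_t last_t;
static volatile uint64_t delta[MAX_SAMPLES];  /* interval between consecutive ticks */
static volatile unsigned n;

void isr(void)
{
    uint64_t t = timestamp();                 /* time at which the ISR actually ran */
    if (n > 0 && n < MAX_SAMPLES)
        delta[n] = t - last_t;                /* expected to be ~1/timer_freq if latency only
                                                 shifts the ISR in time rather than stretching it */
    last_t = t;
    n++;
}

Inspecting delta[] afterwards is how I concluded the intervals do not match the configured timer period.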