You can probably get that - tcpdump (well, windump) reports packet times in microseconds. All the digits change, but there's no telling what the increment or accuracy actually is.
The old 8253 chip in the PC is clocked at 1.193 MHz (the 4.77 MHz CPU clock divided by 4), giving a theoretical resolution of a bit less than a microsecond (or perhaps half that rate, depending on the minimum count), if you either reprogram the interrupt to happen that fast or figure out how to read out the running count. Usually the interrupt ran at the maximum 16-bit divisor of that clock (65536), i.e. 18.2 Hz, but many programs sped it up.
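To sanity-check those numbers, here's the arithmetic spelled out (using the exact historical PIT input of 14.31818 MHz / 12, which is what the "4.77 MHz / 4" figure approximates):

```python
# The PIT input clock on the original PC: the 14.31818 MHz crystal
# divided by 12 (equivalently, the 4.77 MHz CPU clock divided by 4).
PIT_HZ = 14_318_180 / 12           # ~1.193182 MHz

# One counter tick - the "a bit less than a microsecond" resolution.
tick_us = 1e6 / PIT_HZ
print(f"tick: {tick_us:.3f} us")   # ~0.838 us

# Default IRQ0 rate with the maximum 16-bit divisor (a count of 0
# acts as 65536).
max_divisor = 65536
irq_hz = PIT_HZ / max_divisor
print(f"default IRQ0 rate: {irq_hz:.1f} Hz")  # ~18.2 Hz
```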
There may be other clocks in more modern designs?
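There are - and for what it's worth, you can ask the OS what resolution it claims for its clocks without touching hardware registers. A quick sketch using Python's `time` module (works on both Windows and Unix; the reported resolutions vary by platform, so no specific output is assumed here):

```python
import time

# Query the advertised resolution of the standard Python clocks.
# On modern hardware these are typically backed by much finer sources
# than the old 8253 (e.g. TSC, HPET, or QueryPerformanceCounter).
for name in ("time", "monotonic", "perf_counter"):
    info = time.get_clock_info(name)
    print(f"{name}: resolution={info.resolution} s, monotonic={info.monotonic}")
```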