I've always suspected there was something clever that I could use for a thing like this, but ... I was too afraid to ask about it.
Suppose I am on an extremely deterministic platform.
Suppose I can measure the number of microseconds it takes to call "something"[1].
Suppose I can check and do arithmetic on that count once every second.
Suppose also that there are two cases for "something":
- There is nothing to do.
- There is something to do. "Something" is constrained to "one thing" by design.
Let's also assume I can (cheaply) calculate the minimum and maximum number of microseconds it takes to do either "nothing" or "something".
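To make that concrete, here is roughly the bookkeeping I have in mind; `now_us()` and `do_something_or_nothing()` are made-up names standing in for whatever the real platform provides:

```c
#include <stdint.h>
#include <stdio.h>

/* Made-up placeholders for the real platform calls. */
extern uint32_t now_us(void);                   /* microsecond tick counter */
extern void     do_something_or_nothing(void);  /* the call being timed     */

struct stats {
    uint32_t min_us;
    uint32_t max_us;
    uint64_t sum_us;
    uint32_t count;
};

static void stats_reset(struct stats *s)
{
    s->min_us = UINT32_MAX;
    s->max_us = 0;
    s->sum_us = 0;
    s->count  = 0;
}

static void stats_add(struct stats *s, uint32_t us)
{
    if (us < s->min_us) s->min_us = us;
    if (us > s->max_us) s->max_us = us;
    s->sum_us += us;
    s->count++;
}

/* Main loop: time one call and fold it into the running stats. */
static void sample_one_call(struct stats *s)
{
    uint32_t t0 = now_us();
    do_something_or_nothing();
    stats_add(s, now_us() - t0);
}

/* Once per second: report min/max/average, then start over. */
static void report_and_reset(struct stats *s)
{
    if (s->count)
        printf("min=%u max=%u avg=%u n=%u\n",
               s->min_us, s->max_us,
               (uint32_t)(s->sum_us / s->count), s->count);
    stats_reset(s);
}
```

That gives me min, max, and average per second for essentially no cost.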
Obviously a histogram is in order. But wait!
It seems to me that I should be able to estimate a histogram from just three numbers (naive sketch after this list):
- minimum time
- maximum time
- average time.
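For concreteness, the naive thing I picture doing with those three numbers: pretend the per-call time follows a triangular distribution, let min and max pin the endpoints and the mean pin the mode (mean = (min + max + mode) / 3), and read bucket weights off that. The triangular shape is pure assumption on my part; whether it is a defensible model is part of what I'm asking.

```c
#define NBUCKETS 16

/* Naive sketch: fit a triangular distribution to (min, max, mean)
 * and fill out[] with the approximate probability mass per bucket. */
void estimate_histogram(double min_us, double max_us, double avg_us,
                        double out[NBUCKETS])
{
    double a = min_us, b = max_us;
    double c = 3.0 * avg_us - a - b;        /* mode implied by the mean  */
    if (c < a) c = a;                       /* clamp if the mean is      */
    if (c > b) c = b;                       /* inconsistent with min/max */

    if (b <= a) {                           /* degenerate: constant time */
        for (int i = 0; i < NBUCKETS; i++) out[i] = 0.0;
        out[0] = 1.0;
        return;
    }

    double width = (b - a) / NBUCKETS;
    for (int i = 0; i < NBUCKETS; i++) {
        double x = a + (i + 0.5) * width;   /* bucket midpoint           */
        double f = (x <= c)
            ? 2.0 * (x - a) / ((b - a) * (c - a))
            : 2.0 * (b - x) / ((b - a) * (b - c));
        out[i] = f * width;                 /* density times bucket width */
    }
}
```

Any other distribution shape could slot into the same skeleton; the triangular one is just the simplest thing I could think of that actually uses all three numbers.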
What is this sort of ... inquiry called, and how do I stop being quite so dull about it?
[1] I suspect that one answer is to extend the number of buckets into finer-grained measurements of the sub-operations of "something". Example: a check for "serial port has data" vs. "retrieve one unit from the serial port" vs. the overhead from all the dance surrounding those two.
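Continuing that thought, the finer-grained version of the earlier sketch might just give each phase its own accumulator; serial_has_data() and serial_take_one() are made-up names, and struct stats / stats_add() / now_us() are the helpers from the first sketch:

```c
/* Hypothetical platform calls for the two sub-operations. */
extern int  serial_has_data(void);
extern void serial_take_one(void);

/* One accumulator per sub-operation instead of one for the whole call. */
static void sample_sub_operations(struct stats *poll, struct stats *take,
                                  struct stats *overhead)
{
    uint32_t t_start = now_us();

    uint32_t t0 = now_us();
    int ready = serial_has_data();          /* "serial port has data" */
    uint32_t poll_us = now_us() - t0;
    stats_add(poll, poll_us);

    uint32_t take_us = 0;
    if (ready) {
        t0 = now_us();
        serial_take_one();                  /* "retrieve one unit"    */
        take_us = now_us() - t0;
        stats_add(take, take_us);
    }

    /* the dance around the two phases: total time minus the timed parts */
    stats_add(overhead, (now_us() - t_start) - poll_us - take_us);
}
```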