Time

Hi,

In keeping with the spirit of posing challenging (interesting?) questions...

Expressing (calendar) *time* to the user presents a dilemma. Most devices that are aware of calendar time also include provisions for *setting* (i.e., altering!) the time. And, there are few restrictions as to how and when the user can alter that time setting!

However, in terms of the machine's viewpoint, time is (must be!) a monotonically increasing function. The "time setting" is just an arbitrary parameter established for the convenience of the user -- it bears no relationship to "real" time.

But, when the user expresses a temporal event, he does so in *some* context (this will prove to be the essence of this question). Yet, the machine may be operating in a *different* context -- or, simply impervious to the subtlety of the user's viewpoint!

For example, if the user schedules an "appointment" for "3:00PM Tuesday", chances are, he *means* "3:00PM" in some absolute sense. OTOH, if it is 2:00PM presently and he wants to do something "in about an hour", he *says* "3:00PM" but really intends that to be "60 minutes from NOW".

The problem lies in the fact that the time associated with "NOW" is always vulnerable to the user (re)setting the (time-of-day) clock! So, if he happens to realize he needs to set the clock forward one hour (Springtime DST processing), NOW has suddenly *become* 3:00PM -- time for that event that he scheduled for "in about an hour".

I've "solved" this problem by never letting the user change the machine's notion of "current time". Rather, he can determine the *bias* that should be applied to the time that the machine *reports* to him, but not the time itself.

(this eliminates a whole class of potential problems that can occur with the user altering the time setting wantonly -- at least the machine will be sane in how it deals with those temporally ordered events)

I log "time changes" (i.e., "bias adjustments") so I can translate the "system time" into whatever "local time" was in effect at the time. E.g., I can tell the user that he changed the "clock" at 2:34PM on January 13th to read "3:12PM", instead. And, that at 3:13PM (exactly *one* minute later!) he changed it to read "3:00PM" (for whatever reason).

From the machine's point of view, this is a boon. The machine's notion of time remains unaltered. It doesn't see those time changes as happening at 2:34, and 3:13 but, rather, some time X and some time X+1. And, the time one minute thereafter is not "3:01" but, rather, X+2.

The problem lies in relating these time changes to the user's "context".

Consider how you might relate a log of his "time changing activities" to him if he chooses to examine it at "3:01" on the day in question:

- the first change appears to have happened ~30 minutes ago (at 2:34) even though it actually was only two minutes ago!

- the second change appears to have happened ~10 minutes in the *future* (at 3:13) almost 40 minutes after the first change -- even though it happened just *one* minute ago.

Of course, these types of things happen all the time when you play with the time on a desktop machine (probably not as noticeable on Windows machines as they don't tend to generate ongoing logs of events that would manifest these "anomalies"). On a server, changing the time can cause you some unexpected grief (e.g., if a cron job doesn't run because you "skipped over" its scheduled activation time). But, chances are, you are skilled enough to compensate for this *if* it is important.

But, how do you deal with this in an *appliance* used by "nominal users"? To date, most such appliances/devices haven't had the level of sophistication to support scheduled events (other than, perhaps, reminding you of a pending birthday, meeting, etc.) of any substance. What happens when the *device* needs scheduled activities for its own maintenance/integrity?

One approach is to schedule all of those using "system time" so that changes to "user time bias" do not affect it. This works great if you can hide the activities from the user. But, if you have to disclose them to the user, you run the risk of adding the same sort of confusion.

An even bigger issue (that I won't get into as I haven't any *good* idea of how to solve it) is how you tell the user about his *past* actions and *future* plans in the face of varying "time biases".

As I said, an "interesting" problem without a clear cut answer. :-/

Reply to
D Yuniskis

[%X]

I am with you on the notion that the underlying "System Time" should be kept as constant as possible. I aim to keep this at UTC and it can help if the correctness of this is synchronised with external standard time references.

I am also with you on the user having whatever time-frame reference he/she desires. Quite pertinent for the travelling person who might visit many time-zones. I don't think that they would want anything other than the local time wherever they happen to be at the time. It would be helpful if the item had reference to a GPS and could determine the local offset for them; then the user need not be bothered with time settings at all.

As for logging of "Time-change" events, not something I would consider relevant if you can automatically determine correct UTC and local zone. It might be pertinent to server farms but I would expect them to be at UTC anyway.

--
********************************************************************
Paul E. Bennett...............
Forth based HIDECS Consultancy
Mob: +44 (0)7811-639972
Tel: +44 (0)1235-510979
Going Forth Safely ..... EBA. www.electric-boat-association.org.uk..
********************************************************************
Reply to
Paul E Bennett

Wow, I would hate to be the person who asks you to make a cup of tea in the office; must take hours.

Reply to
bigbrownbeastie

What to do with scheduled activities if the clock was adjusted depends on the nature of those activities. You can't make a universal solution at the system level. So, if the clock was adjusted, the OS issues a system message about it and lets the processes decide if they should do something.

Vladimir Vassilevsky DSP and Mixed Signal Design Consultant


Reply to
Vladimir Vassilevsky

There's a subtle difference here. I'm saying "system time" is totally irrelevant to "calendar time". I.e., system time has guarantees that calendar time does not (or NEED not).

For example, you *know* that system time N and N+1234567 are exactly 1234567 seconds apart. For *all* values of N.

I see synchronizing to UTC (or any other standard) as just another "user time bias" adjustment. UTC exists solely for the convenience of users who want to think in terms of UTC.

Yes, but this is still just a "user bias". Or, a user *preference* if you want to look at it that way. I.e., like picking what *font* to use to display the time (assuming you *do* display it).

The system shouldn't (can't?) care if the user wants to spend his entire day setting the clock forward, then back, then forward some *different* amount, etc.

If you don't log time change events, then you have to log the "current time" in any messages that you emit. And, the user has no way of *ever* knowing the temporal relationship between any two of those messages -- since the time can be changed between them (and you've decided not to log these events!).

Reply to
D Yuniskis

That's how I handle things currently. But, it only works for running processes.

What I was looking for (i.e., the purpose of my question) was insight in how people *think* about time (as users) and what characteristics of a "context" affect that thinking.

E.g., scheduling a (doctor's) appointment happens in an "absolute time context" but setting an alarm when baking *cookies* happens in a "relative alarm context". I can come up with numerous examples for each but haven't been able to come up with a criterion that can be used to categorize them.

For example, an initial criterion that I toyed with was the "temporal distance" -- things in the near future tended to be relative; those "further out" (whatever that means) tended to be absolute. But, deciding where that threshold was proved to be just as hard a problem to solve. And, I can come up with counterexamples of each.

Note that the application can't unilaterally make this decision, either. E.g., the two examples above would easily fit into the same application.

I.e., this is not a *technical* problem -- I've already "solved" that aspect of it. Rather, it's a "human problem"... figuring out how people *think* about time.
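(The "technical" side amounts to little more than a tagged representation -- something along these lines, with names invented for illustration; the hard part remains deciding which tag the user actually intended.)

#include <stdint.h>

typedef uint64_t systime_t;       /* monotonic system seconds */

typedef enum { TIME_ABSOLUTE, TIME_RELATIVE } time_kind_t;

typedef struct {
    time_kind_t kind;
    int64_t     value; /* ABSOLUTE: user-calendar seconds; RELATIVE: offset from "now" */
} user_time_t;

/* Resolve a user-supplied time to the system time at which it should fire.
 * 'bias' is the current (user time - system time) offset.
 * RELATIVE events are immune to clock setting; ABSOLUTE events must be
 * re-resolved whenever the bias changes.
 */
systime_t resolve(user_time_t spec, systime_t now, int64_t bias)
{
    if (spec.kind == TIME_RELATIVE)
        return now + (systime_t)spec.value;
    return (systime_t)(spec.value - bias);
}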

Reply to
D Yuniskis

The process tells the system that it should be notified about the time changes. So, even if the process is not running, the system could activate it.

Where is the problem? There are many "relative" and "absolute" qualities besides time. You deal with relative or absolute time just like you deal with relative or absolute paths, so to speak.

Vladimir Vassilevsky DSP and Mixed Signal Design Consultant


Reply to
Vladimir Vassilevsky

UTC is not a linear time scale due to the leap seconds that are added at irregular intervals (3, 6, 9, 12, 15 ... months) due to the slowing down of the rotation of the Earth. When a leap second is added, the clock should count 23:59:59, 23:59:60, 00:00:00 ....
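(To illustrate the point -- a toy sketch, not a real leap-second implementation: a UTC day may contain 86399, 86400, or 86401 seconds, so anything formatting "seconds into the day" has to be prepared to print the :60.)

#include <stdio.h>

/* Toy sketch: on a day with a positive leap second the day is 86401
 * seconds long, and second 86400 must display as 23:59:60, not as
 * 00:00:00 of the next day.
 */
static void format_utc(long second_of_day, char out[16])
{
    long h = second_of_day / 3600;
    long m = (second_of_day / 60) % 60;
    long s = second_of_day % 60;
    if (second_of_day == 86400) { h = 23; m = 59; s = 60; }  /* leap second */
    snprintf(out, 16, "%02ld:%02ld:%02ld", h, m, s);
}

int main(void)
{
    char buf[16];
    for (long t = 86398; t <= 86400; t++) {   /* 23:59:58, 23:59:59, 23:59:60 */
        format_utc(t, buf);
        printf("%s\n", buf);
    }
    return 0;
}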

Paul

Reply to
Paul Keinanen

Many, many bugs have been caused by this kind of assumption. Perhaps the most notable example is the lbolt variable inside traditionally structured Unix kernels. I'm most familiar with it from dealing with SCO Unix in the past but ISTR it affected other systems too, mostly with third party drivers where the developers were not wise to the problems.

On SCO the lbolt variable was initialised to 0 at boot and incremented every 10ms after that. After a little over 8 months the variable overflowed and turned negative. The problems this could cause for drivers ranged from garbling I/O during the overflow period to panicking the system. Of course, a problem that does not occur for 8 months is easy to slip through testing unless you are specifically looking for it.

Although the exact example you give is valid, closely related examples may not be. You can't assume that if N is now, then 12345.67 seconds from now will be represented by N+1234567. Nor can you even assume that if A is less than B, A must represent an earlier time, even if we know from context they can be no more than a few seconds apart.
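(One common defence -- this is a sketch of the idiom used by, e.g., Linux's jiffies time_after() macros, not of how SCO handled lbolt: compare free-running tick counters by the sign of their difference, which stays correct across the wrap as long as the two values are less than half the counter range apart.)

#include <stdint.h>
#include <stdio.h>

/* Wraparound-safe comparison of free-running tick counters.
 * Correct as long as the two values are less than 2^31 ticks apart.
 */
static int tick_after(uint32_t a, uint32_t b)
{
    return (int32_t)(a - b) > 0;
}

int main(void)
{
    uint32_t before = 0xFFFFFFF0u;   /* just before the counter wraps */
    uint32_t after  = 0x00000010u;   /* just after the wrap           */

    /* A naive "after > before" gets this wrong; the signed difference does not. */
    printf("naive: %d\n", after > before);             /* prints 0 (wrong) */
    printf("safe:  %d\n", tick_after(after, before));  /* prints 1 (right) */
    return 0;
}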

--
Andrew Smallshaw
andrews@sdf.lonestar.org
Reply to
Andrew Smallshaw

My point is that by *defining* system time to be a monotonically increasing, unresettable function, you *can* make this assumption for "system times". I.e., if you look at the system time at any arbitrary time, T1, and then look at it at *any* arbitrary time thereafter, T2, you *know* that T2 > T1. By definition. :>

That's a different problem. Pick a data type that is large enough to represent the data you intend to store in it! (e.g., you wouldn't pick a 16-bit integer to hold the world population!)

Yes: N+12345.67 (if you deal with fractional seconds)

But that is my point! In my scheme, these are invariants! The system time starts "whenever" (the instant the product enters final testing during manufacturing). Thereafter, it can not be altered -- except to add one second to its present value.

Barring hardware failures (batteries dying, alpha particles, device being run over by a car, etc.) all of the above "guarantees" exist.

Reply to
D Yuniskis

POSIX *used* to allow for *double* leap seconds. I think that capability has been removed.

Note that leap seconds can also *subtract* a second (though this has never? happened)

*All* "human formatted" time schemes are PITAs for machines. The only realistic way to track time is just to count seconds. (or some other time unit having the granularity you desire).

Barring distortions in space-time, one real second *takes* one real second to transpire. :>

Reply to
D Yuniskis

IMHO, people think about time in two ways: As intervals and as absolute date/time occurrences. Absolute times can be both near and far in temporal distance---as near as the starting time for a televised football game that will begin in a few minutes, or as far as your parent's 50th wedding anniversary in 12 years, 3 months and 14 days.

Intervals can also be long or short: From the 4-millisecond exposure time for a digital camera to the end of the 7-year warranty on your Prius battery pack. (For intervals above an undefined limit, it probably makes sense to convert the end of the interval to an absolute time.)

For short-term interval measurement, there's really no need to use a system time variable at all. It's often simpler to just increment a delay-specific variable of the appropriate bit length with a timer tick. After all, when you're setting the baking time for your brownies, you don't want to go to the trouble of adding 30 minutes to 6/11/2010 23:47:33 and set that value into the timer. You just want to twist the knob to the right until the arrow points at '30'. ;-) OTOH, if you want your alarm clock to go off at 7:00AM, you don't want to figure out the difference between now and that time, and set that into a countdown timer.
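(A sketch of that "twist the knob" style of timer, with invented names: the periodic tick just decrements a dedicated counter, and no calendar arithmetic is involved.)

#include <stdint.h>

/* Sketch: a short-term interval timer that never touches calendar time. */
static volatile uint32_t bake_seconds_left;

void bake_timer_start(uint32_t minutes)    /* "twist the knob to 30"      */
{
    bake_seconds_left = minutes * 60u;
}

void timer_tick_1hz(void)                  /* called once per second      */
{
    if (bake_seconds_left > 0 && --bake_seconds_left == 0) {
        /* sound_buzzer();  -- the brownies are done                      */
    }
}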

Then there is the subset of timing problems like a desire to have the alarm go off at 7:00AM every day. The first setting is an absolute time---but the following settings are intervals of 24 hours.

I guess my conclusion is that one time system won't work for all problems. As for picking the right system---it probably depends on whether the question you ask the user is "How Long...." or "When do you want to....". The second question implies that you and the user have some common time reference.

The technical problems only get worse when you think about multiple systems, each with a unique system time, having to accomplish some action at the same instant---without the capability to communicate with the other system(s). I face this problem often with moored oceanographic instruments. My solution has been to use low-drift clocks and synchronize the logger clocks to UTC.

Mark Borgerson

Reply to
Mark Borgerson

Exactly.

Of course! If I am flashing a lamp at 1Hz, I don't keep setting alarms for "now+1sec".

Yes. The problem lies in how you communicate this to the user (i.e., how you ascertain the *user's* idea of whether he is thinking in relative or absolute terms).

Generally, I don't ask users questions. Instead, I try to glean what they want from their actions. (this makes a friendlier user interface; one where the user feels in control and where the "device" isn't intruding on their way of doing things but, rather, *assisting*).

But, think of how *you* are often driven by other people and events. E.g., a doctor sets up an appointment for you to come back "in two weeks". Yet, they pen the appointment at a *specific* time/date. To you, that is an absolute time reference (you are expected there at that time/date). Yet, to "the system" (the system being the doctor, you, society, etc.) it really is a relative appointment. I.e., if the doctor had to reschedule, he would have knowledge (unbeknownst to his appointment book) that *exactly* two weeks from "now" is just a guideline...

Nope. What do you do when DST kicks in? The difference will be 24 hours nominally but may be 23 or 25 depending on whether it is Spring/Fall time change. (see my point? How easy it is to miscategorize time events as absolute vs. relative?? Imagine a machine trying to deduce what you mean...)
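(For what it's worth, the conventional way to get "7:00AM tomorrow, local time" right in C -- assuming the platform's timezone database knows the DST rule -- is to add a day to the broken-down local time and let mktime() renormalize it, rather than adding 24*3600 seconds.)

#include <time.h>
#include <stdio.h>

/* Sketch: compute "7:00AM tomorrow, local time" without assuming it is
 * exactly 24 hours away.  Across a DST change the interval is 23 or 25 hours.
 */
time_t next_7am(time_t now)
{
    struct tm tm_local = *localtime(&now);
    tm_local.tm_mday += 1;       /* tomorrow...                                */
    tm_local.tm_hour = 7;        /* ...at 7:00:00 wall-clock time              */
    tm_local.tm_min  = 0;
    tm_local.tm_sec  = 0;
    tm_local.tm_isdst = -1;      /* let mktime() decide whether DST applies    */
    return mktime(&tm_local);    /* renormalizes the date and applies the rule */
}

int main(void)
{
    time_t now = time(NULL);
    time_t alarm = next_7am(now);
    printf("alarm in %.1f hours\n", difftime(alarm, now) / 3600.0);
    return 0;
}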

What happens if the user answers the first question with a reply "until 3:00PM" (e.g., "keep the irrigation water flowing until 3pm" instead of the "for 2 hours" that your question was designed to elicit) or the second question with "in three hours" instead of the "at 2:00PM" that you were expecting?

(yes, I can see how to *use* either of these specifications. But, you have to be able to react to how the user wants to express the time and still not necessarily know if he is being absolute or relative. People can too easily think in either basis and express time either way. E.g., think in terms of relative time yet express an answer in absolute time or vice versa.)

I think these are easier to handle because they, by definition, are all thinking in terms of *some* time standard. Then it is just a question of how you provide that same time reference to each of them. In my scheme of things, this just looks like another "time bias".

E.g., the user can deliberately set the "user time bias" to be "five minutes fast" (how many folks do this with their alarm clocks in the morning?). Yet, you could keep your notion of "UTC time bias" independently of his. And, a "system time" that provides a common timescale against which both are metered.
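(In other words -- a sketch, using the "five minutes fast" figure from above -- everything displayed is derived from system time through a chain of offsets; nothing ever writes back into system time.)

#include <stdint.h>

typedef uint64_t systime_t;   /* monotonic seconds since manufacture          */

static int64_t utc_bias;      /* system time -> UTC (established once, e.g. via GPS/NTP) */
static int64_t user_bias;     /* UTC -> what this particular user wants shown  */

/* What the user's clock face shows; system time itself is never touched. */
int64_t displayed_time(systime_t sys)
{
    return (int64_t)sys + utc_bias + user_bias;
}

/* e.g. the user who keeps his clock "five minutes fast": user_bias = 5 * 60 */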

Reply to
D Yuniskis

Not at all. MET is not TOD. And you have not, in your little discourse, defined absolute rules about what "now" or "an hour from now" means. I might be sitting at my desk in NY at 0700 local TOD, making an appointment to meet you in CA at 1700 local TOD; I certainly don't mean "ten hours from NOW", I mean 13 hours from my current NOW, which is 10 hours from your current NOW.

Reply to
larwe

The executable may not be available, etc. Yes, there are ways around this. E.g., create wrappers for those things that aren't available, ensure the wrappers are *always* available and have *them* keep getting "reawakened" periodically *or* in response to an event that satisfies their prerequisites (e.g., "SD card inserted" -- perhaps this card will have the executable that I need??).

But, as my comments (elsewhere) should indicate, it may not even be possible for that application to *know* how to deal with the time change. E.g., the user's notion of current calendar time may have changed but *relative* time progresses at the same pace; how do you know which times are absolute vs. relative?

Which ones do users routinely specify in an embedded device? "My phone number is -32"? "My address is +Four roads"? "My father is -hxty"? "My favorite color is +300"? "My shoe size is -2"? "I need to buy -6 apples"? "My car gets +1.3MPG"?

Dealing with absolute vs. relative is trivial. *Knowing* which is which is hard -- unless you explicitly *ask* the user each time he specifies a time/date: "do you really mean XXXX or do you mean YYYY from *now*?"

Reply to
D Yuniskis

This is my point:

D Yuniskis wrote:

It depends. In embedded devices there is no such thing as "routinely".

How about "if a == b and if today is Friday, then if the LED#0 was turned on at 6:00, then turn on LED#1 in 5 minutes after LED#0 is off" ?

You cannot create a universal abstraction for time. Therefore each particular process has to deal with time explicitly, in its own way.

Vladimir Vassilevsky DSP and Mixed Signal Design Consultant


Reply to
Vladimir Vassilevsky

The end of the warranty on the battery pack was relative until you bought the Prius. Then it became absolute.

Mel.

Reply to
Mel

No, it isn't - precisely because the word is ill-defined. If I say I want you to do something at 9am tomorrow, and tonight is the DST changeover, I mean 9am.

Reply to
larwe

You're only *beginning* to see the depth of the problem!

NOW is always "now". "This instant". Everyone has a coincident "now" -- regardless of what the time on their wall clock says.

"An hour from now" is exactly that 3600 seconds from "now". We will all experience "an hour from now" at the same instant in time -- regardless of where we are (on this planet).

Your NY/CA example doesn't go far enough. If we agree to meet at 1700 in CA, then CA local time applies. Presumably, my timepiece would reflect CA local time when I am *in* CA. But, what happens if I opt not to travel to CA? What (local) time must I place a phone call to you if the meeting is to be held as a teleconference?

How do we know that our individual notions of PST and EST are synchronized? What if I tend to set my clock 5 minutes fast? Etc.

The point is, time only makes sense when given a reference along with the time. Whether that is relative or "absolute". But, people almost *never* express those references when they are speaking about "times". So, how do you *infer* it? Or, present an interface to the user whereby he can specify it without feeling encumbered?

"Do you mean relative time or absolute? Do you mean local time or for some particular timezone? Do you think of that time as referenced to *your* current notion of the 'current time' or some more universally acceptable reference?"

Reply to
D Yuniskis

[snipped]

/au contraire/ the difference is _nominally_ 24 hours and will change depending upon the Spring/Fall time change. That's why UTC is your friend ;-)

While we're 'eliciting' requirements from the user, in that space the two ways are as interchangeable as saying 19:15h or 'a quarter past seven'; once you have your _implementation_, one of them will be more feasible, or perhaps only one of them will be possible!

--
Cesar Rabak
GNU/Linux User 52247.
Get counted: http://counter.li.org/
Reply to
Cesar Rabak
