While the _mean_ solar day is exactly 24 hours (86400 seconds), the apparent solar time, as measured by a sundial or, more scientifically, by using a telescope to detect when the sun crosses the meridian (due south in the northern hemisphere, due north in the southern), is not constant. The length of the apparent solar day varies by up to roughly half a minute, and the cumulative difference between apparent and mean solar time (the "equation of time") swings between about +16 and -14 minutes over the year, with slightly smaller peaks in between.
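That yearly swing can be estimated with a common three-term textbook approximation of the equation of time; this is a rough sketch (the function name is mine, and the coefficients are only good to about a minute):

```python
import math

def equation_of_time_minutes(day_of_year):
    """Approximate apparent-minus-mean solar time, in minutes,
    for a day of the year (1..365). Common three-term
    approximation, accurate to about a minute."""
    b = 2.0 * math.pi * (day_of_year - 81) / 364.0
    return 9.87 * math.sin(2 * b) - 7.53 * math.cos(b) - 1.5 * math.sin(b)

values = [equation_of_time_minutes(d) for d in range(1, 366)]
# The sundial runs roughly 16 min fast around early November
# and roughly 14 min slow around mid-February.
```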
Before railroads and telegraphs, each city kept its own mean solar time. There was a small village outside London (Greenwich) with a royal observatory, whose own local _mean_ solar time later became GMT (after the definition changed from noon-to-noon to midnight-to-midnight :-).
Getting *a* date/time (UTC) is easy! It is then up to the devices themselves to adjust for "local conventions". They would have to periodically update their time zone tables to track geographic and legislative changes. E.g., here (US) there are regions of the country that do NOT observe daylight saving time. And there have been times when legislation has *altered* the observance of said time changes.
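As an illustration of why those tables matter, here's a sketch using the IANA tz database via Python's `zoneinfo` (assumes the zone data is installed on the host): Arizona stays on UTC-7 year-round, while neighboring Denver shifts for daylight saving:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo  # Python 3.9+; reads the installed tz database

phoenix = ZoneInfo("America/Phoenix")  # Arizona: no daylight saving time
denver = ZoneInfo("America/Denver")    # observes daylight saving time

winter = datetime(2023, 1, 15, 12, 0)
summer = datetime(2023, 7, 15, 12, 0)

# In winter both are UTC-7; in summer Denver springs forward, Phoenix doesn't.
phx_summer = summer.replace(tzinfo=phoenix).utcoffset()  # -7 hours
den_summer = summer.replace(tzinfo=denver).utcoffset()   # -6 hours
den_winter = winter.replace(tzinfo=denver).utcoffset()   # -7 hours
```

If the legislature changes the rules, the fix is a new tz database release, not new application code.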
In short, you can't code one solution and hope it will always work. Nor is there a "standard"/service that would provide this information to you.
As a (non-portable) *hack*, you could query google for "current time" and "" and parse the result. But, there's no guarantee that google will continue to support this query.
Either allow devices to submit a query to you, similar to the google one outlined above; or push updates to the time zone tables periodically (folks who don't connect to you "often enough" don't get the benefit of the most recent tables).
[BTW, this can also handle leap second scheduling]
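For leap seconds, the "table" is just a list of effective dates and the TAI-UTC offset in force after each. A minimal sketch (the two most recent real entries are shown; a pushed update would simply extend the list):

```python
from datetime import datetime

# (effective UTC date, TAI - UTC in seconds from that instant on).
# Truncated to the two most recent real leap seconds.
LEAP_TABLE = [
    (datetime(2015, 7, 1), 36),
    (datetime(2017, 1, 1), 37),
]

def tai_minus_utc(when):
    """Return TAI - UTC (seconds) at a given UTC datetime,
    per the (truncated) table above."""
    offset = 35  # value in force before the first table entry
    for effective, seconds in LEAP_TABLE:
        if when >= effective:
            offset = seconds
    return offset
```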
Personally, I find it easier to just maintain a monotonically increasing count of "elapsed time" (since some arbitrary epoch) that devices use to know "what time it is". I can tweak the timebase dynamically to track "the passage of time" more accurately (i.e., so one "count", on average, corresponds to one SI second, the time required for light to travel 299,792,458 meters) but always ensure that "future >= now >= past".
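A minimal sketch of that idea (the class and method names are mine, not any real library's API): a counter whose rate can be trimmed on the fly, rebased at each adjustment so readings slew rather than step, and clamped so they never run backwards:

```python
class DisciplinedClock:
    """Monotonically increasing 'elapsed time' whose rate can be
    tweaked on the fly (hypothetical sketch, not a real API)."""

    def __init__(self, raw_source):
        self._raw = raw_source   # callable returning the raw hardware count
        self._rate = 1.0         # counts-to-seconds scale factor
        self._offset = 0.0
        self._last = 0.0

    def now(self):
        t = self._offset + self._rate * self._raw()
        # Guarantee future >= now >= past, even across rate tweaks.
        if t < self._last:
            t = self._last
        self._last = t
        return t

    def adjust_rate(self, new_rate):
        # Rebase the offset so now() is continuous across the change:
        # no step, just a different slope from here on.
        raw = self._raw()
        self._offset += (self._rate - new_rate) * raw
        self._rate = new_rate

# Demo with a fake raw counter we can advance by hand:
raw = [0.0]
clock = DisciplinedClock(lambda: raw[0])
raw[0] = 10.0
t1 = clock.now()          # 10.0
clock.adjust_rate(0.5)    # run at half speed from here on, no jump
raw[0] = 12.0
t2 = clock.now()          # 11.0 (10.0 + 0.5 * 2 raw counts)
```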
How this notion of time is presented to the user is then completely arbitrary (i.e., defined by the application). So, the fact that user time exhibits "discontinuities" is an issue that the *user* deals with and not the device.
[E.g., user could decide -- erroneously -- to set the time/date to Jan 2, 2035 4:23PM at THIS INSTANT! Is this "wrong"? Is setting your alarm clock "5 minutes fast" also *wrong*??]