
clocks get used for two distinct purposes, often at odds:

- the measurement of durations.

- the presentation of some timestamp in a way that the reader has some intuition for.

that first purpose won’t be hurt by not tracking leap seconds. in fact, many applications will measure durations more accurately without them.

if leap seconds (or minutes) really are of critical importance, we’ll reintroduce them at the presentation layer. the thing is, very few people can tell the difference between 12:01 and 12:02 without being told the “real” time. so if you’re presenting a time which is “off” by a minute because there are no leap seconds… does it really matter?
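a concrete version of that split, as a sketch in Python (stdlib only): the monotonic clock handles durations, and the wall clock exists only to be formatted for a reader.

```python
import time
from datetime import datetime, timezone

# purpose 1: durations. the monotonic clock never jumps, regardless of
# leap seconds, NTP steps, or DST, so its differences are real elapsed time.
start = time.monotonic()
time.sleep(0.01)
elapsed = time.monotonic() - start  # always non-negative

# purpose 2: presentation. a wall-clock timestamp formatted for a reader,
# where being "off" by a second rarely matters.
stamp = datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S")
```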



There should really be three "layers" of time indirection.

1) Seconds since 00:00:00 UTC on 1970-01-01. This value increases by 1 each atomic second and never jumps forward or back. Call this Universal Monotonic Time.

2) The difference between when the sun is at its zenith at Greenwich and 12:00 UMT. Call this the Astronomic Drift.

3) The timezone: the offset from Greenwich that makes the local clock sync up with astronomic time, plus a DST offset if that location observes DST on that date.

By adding up 1) + 2) + 3) you end up with the "human time" at a given location at a given date.

A computer system should only ever store 1). Then, it can calculate human time when displaying it to humans.
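A sketch of that 1) + 2) + 3) addition in Python: layer 3 comes from the stdlib's zoneinfo (which applies DST automatically), while layer 2 is passed in by the caller, since in reality it would come from an IERS table rather than a constant. `human_time` is a made-up name for illustration.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def human_time(umt_seconds, drift_seconds, tz_name):
    """Layer 1 (Universal Monotonic Time) + layer 2 (astronomic drift)
    + layer 3 (timezone/DST offset) -> local wall-clock time.

    drift_seconds is supplied by the caller; a real system would look it
    up in a leap-second/UT1 table for the date in question.
    """
    utc = datetime.fromtimestamp(umt_seconds + drift_seconds, tz=timezone.utc)
    return utc.astimezone(ZoneInfo(tz_name))  # layer 3, incl. DST
```

For example, `human_time(0, 0, "Europe/Berlin")` yields 1970-01-01 01:00, since Berlin was UTC+1 at the epoch.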

I'm also a fan of having "local sun time", which would be the time according to the position of the sun in the sky, quantised to 15-minute slices (basically micro-timezones). It would be nice if office hours, school times, &c could be defined based on that, i.e. work starts at 9am local sun time, which would sync up better with people's biological clocks and cut down on the yearly stress DST causes.
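A sketch of the micro-timezone idea, using mean solar time only (ignoring the equation of time); `sun_time_offset_minutes` is a made-up name:

```python
def sun_time_offset_minutes(longitude_deg):
    """Offset of local mean solar time from Greenwich, quantised to
    15-minute micro-timezones. The Earth turns 360 degrees in 1440
    minutes, so each degree of longitude is worth 4 minutes."""
    exact_minutes = longitude_deg * 4.0
    return round(exact_minutes / 15) * 15
```

So Greenwich gets offset 0, 15°E gets +60 (one hour ahead), and every location's slice is within 7.5 minutes of its true mean solar time.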


I agree with your separation between "duration" and "absolute point in time". But it doesn't solve the issue, because durations are often computed as the difference between two absolute points in time. You could get over this on your local machine with a local counter, but across network boundaries you need to rely on absolute differences.
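A concrete case of that failure mode: two Unix timestamps straddling the 2016-12-31 positive leap second differ by one second, even though two atomic seconds elapsed, because Unix time has no representation for 23:59:60.

```python
# Unix timestamps for the instants around the 2016-12-31 positive leap second.
t_before = 1483228799  # 2016-12-31 23:59:59 UTC
t_after  = 1483228800  # 2017-01-01 00:00:00 UTC

unix_elapsed = t_after - t_before  # 1 second, according to Unix time
real_elapsed = unix_elapsed + 1    # 2 SI seconds actually passed (23:59:60 existed)
```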


> But it doesn't solve the issue, because durations are often computed as the difference between two absolute points in time.

other way around. e.g. unix time is determined by the number of seconds (as experienced by some point fixed to Earth, roughly) relative to some reference point. duration is the native format for most time systems, like UTC, because “absolute time” isn’t a thing that can be measured. faking the duration (e.g. adding leap seconds) within a time system is sort of nonsensical: we only do it to make translation across time systems (UTC, UT1, local/human timezones) simple. if that’s the justification for things like leap seconds, then better to keep the time system itself in its most native format (assuming that format is useful locally) and explicitly do those conversions only when translating, i.e. at the network boundary: when communicating with a system that doesn’t share/understand our time system(s).


I regret that the reality is such, but Unix timestamps are in sync with UTC, so they too have repeats and gaps (for positive and negative leap seconds, respectively)

The international standard for monotonic time is TAI, which has never had leap seconds, but which is also used by almost no one
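A sketch of the Unix-to-TAI translation, assuming the post-2017 offset of 37 seconds; code that is correct for arbitrary dates needs the full historical leap-second table (e.g. tzdata's leapseconds file):

```python
TAI_MINUS_UTC = 37  # seconds; correct from 2017-01-01 until the next leap second

def unix_to_tai(unix_seconds):
    # Only valid for timestamps after 2017-01-01; earlier ones need
    # the historical leap-second table, since the offset was smaller then.
    return unix_seconds + TAI_MINUS_UTC
```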



