Time in programming relies on two core concepts: durations (e.g., 9.69 s) and instants (specific moments in time), with instants typically measured as seconds elapsed since a reference epoch.
Absolute time (Unix time, Apollo Time, etc.) treats instants as numeric offsets from an epoch, simplifying arithmetic and comparisons.
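A minimal sketch of both ideas with Python's standard library (the instant chosen here is arbitrary and purely illustrative):

```python
from datetime import datetime, timedelta, timezone

# An instant: a specific moment, representable as seconds since the Unix epoch
# (1970-01-01T00:00:00 UTC).
start = datetime(2008, 8, 16, 22, 30, tzinfo=timezone.utc)
print(start.timestamp())                        # a plain number of seconds since the epoch

# A duration: a fixed span of time, independent of any calendar.
duration = timedelta(seconds=9.69)

# Arithmetic on absolute time is ordinary addition and subtraction of numbers.
end = start + duration
print((end - start).total_seconds())            # 9.69
```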
Civil time (like the Gregorian calendar) maps instants into human-readable datetimes (year, month, day, hour, minute, second) but introduces ambiguity because calendar periods (days, months) vary in their actual length in seconds.
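The same split shows up when an instant is mapped into civil fields and two "months" are compared in days; a small illustration with Python's datetime (the timestamp is arbitrary):

```python
from datetime import datetime, timezone

ts = 1_700_000_000  # an absolute instant: seconds since the Unix epoch
civil = datetime.fromtimestamp(ts, tz=timezone.utc)
print(civil.year, civil.month, civil.day, civil.hour, civil.minute, civil.second)

# "One month" is not a fixed number of seconds: compare January with February.
jan = datetime(2023, 2, 1, tzinfo=timezone.utc) - datetime(2023, 1, 1, tzinfo=timezone.utc)
feb = datetime(2023, 3, 1, tzinfo=timezone.utc) - datetime(2023, 2, 1, tzinfo=timezone.utc)
print(jan.days, feb.days)   # 31 vs 28
```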
Atomic clocks realize a stable SI second, and the UTC standard inserts leap seconds at irregular intervals to keep civil time aligned with Earth’s rotation, which complicates precise duration calculations.
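As a concrete consequence, Unix time and most datetime libraries simply ignore leap seconds; for example, Python's datetime reports 86,400 seconds for 2016-12-31 even though the leap second inserted at its end made that UTC day 86,401 SI seconds long:

```python
from datetime import datetime, timezone

# A leap second was inserted at 2016-12-31 23:59:60 UTC, but Unix time and
# Python's datetime pretend every day has exactly 86,400 seconds.
before = datetime(2016, 12, 31, 0, 0, tzinfo=timezone.utc)
after = datetime(2017, 1, 1, 0, 0, tzinfo=timezone.utc)
print((after - before).total_seconds())   # 86400.0, even though 86,401 SI seconds elapsed
```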
Timezones are rules mapping regions to UTC offsets; the rules change unpredictably (daylight saving time, political decisions), producing local times that never exist or occur twice.
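A short demonstration with Python's zoneinfo, using the 2021 US transitions in America/New_York as the example zone (PEP 495's fold attribute distinguishes the two occurrences of a repeated local time):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

ny = ZoneInfo("America/New_York")

# Duplicated local time: clocks fell back on 2021-11-07, so 01:30 occurred twice.
ambiguous = datetime(2021, 11, 7, 1, 30, tzinfo=ny)
print(ambiguous.tzname())                    # EDT (first occurrence, UTC-4)
print(ambiguous.replace(fold=1).tzname())    # EST (second occurrence, UTC-5)

# Non-existent local time: clocks sprang forward on 2021-03-14, skipping 02:00-03:00.
missing = datetime(2021, 3, 14, 2, 30, tzinfo=ny)
roundtrip = missing.astimezone(timezone.utc).astimezone(ny)
print(missing.hour, roundtrip.hour)          # 2 vs 3: that wall-clock time never existed
```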
Computers convert between absolute and local time in two directions: local_time = f(timezone, absolute_time), which always has exactly one answer, and the inverse absolute_time = f⁻¹(timezone, local_time), which can have zero, one, or two (or more) answers because DST and other offset shifts skip or repeat local times.
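A sketch of both directions in Python; to_local and to_absolute are hypothetical names, and the gap case (zero answers) would need an extra round-trip existence check that this sketch omits:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def to_local(tz_name: str, unix_seconds: float) -> datetime:
    """absolute -> local: always exactly one answer."""
    return datetime.fromtimestamp(unix_seconds, tz=ZoneInfo(tz_name))

def to_absolute(tz_name: str, local: datetime) -> list[float]:
    """local -> absolute: one answer, or two when clocks fall back."""
    tz = ZoneInfo(tz_name)
    candidates = {local.replace(tzinfo=tz, fold=f).timestamp() for f in (0, 1)}
    return sorted(candidates)

print(to_local("America/New_York", 1_636_263_000))                     # one local datetime
print(to_absolute("America/New_York", datetime(2021, 11, 7, 1, 30)))   # two instants, one hour apart
```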
The IANA Time Zone Database records each zone's rules (historical back to 1970, plus announced future changes) so conversions can be exact; each IANA zone groups locations that have shared identical rules from 1970 onward.
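For instance, Python's zoneinfo (backed by the IANA data) reproduces historical rule changes such as Moscow's October 2014 switch from UTC+4 to UTC+3:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# The same zone can have different offsets in different years because the
# database records the full history of rule changes since 1970.
moscow = ZoneInfo("Europe/Moscow")
print(datetime(2014, 1, 1, tzinfo=moscow).utcoffset())   # 4:00:00 (pre-2014 rules)
print(datetime(2015, 1, 1, tzinfo=moscow).utcoffset())   # 3:00:00 (after the 2014 change)
```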
When building software, bundle and periodically update the IANA database, store both the user-entered local datetime (with its timezone) and the computed UTC instant, and recompute the UTC value whenever the timezone rules change.
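One possible shape for such a record, sketched in Python with hypothetical names (Event, compute_utc, refresh_after_tzdata_update):

```python
from dataclasses import dataclass
from datetime import datetime
from zoneinfo import ZoneInfo

@dataclass
class Event:
    local_wall_time: datetime   # naive datetime exactly as the user entered it
    tz_name: str                # IANA zone, e.g. "Europe/Berlin"
    utc_timestamp: float        # derived value, cached for sorting and querying

def compute_utc(local_wall_time: datetime, tz_name: str) -> float:
    return local_wall_time.replace(tzinfo=ZoneInfo(tz_name)).timestamp()

def refresh_after_tzdata_update(event: Event) -> None:
    # Re-derive the UTC instant from the stored local intent under the new rules.
    event.utc_timestamp = compute_utc(event.local_wall_time, event.tz_name)

meeting = Event(datetime(2030, 6, 1, 9, 0), "Europe/Berlin",
                compute_utc(datetime(2030, 6, 1, 9, 0), "Europe/Berlin"))
refresh_after_tzdata_update(meeting)   # call again whenever the bundled tzdata changes
```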
Practical examples include chat timestamps (record UTC, render local), event planning (store local intent vs absolute instant based on user expectations), and personal tooling using libraries like Howard Hinnant’s date.
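For the chat-timestamp case, a minimal Python sketch of "record UTC, render local" (the viewer zones are illustrative):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Record: capture the instant in UTC when the message is sent.
sent_at = datetime.now(timezone.utc)

# Render: convert to each viewer's zone only at display time.
for viewer_tz in ("Asia/Tokyo", "America/Los_Angeles"):
    print(sent_at.astimezone(ZoneInfo(viewer_tz)).strftime("%Y-%m-%d %H:%M %Z"))
```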
Key takeaway: “Just use UTC” is oversimplified. Handling time correctly requires understanding epochs, civil time, leap seconds, timezones, and ambiguous local times, and it requires keeping timezone data up to date.