Publication | Legaltech News

Nervous System: Y2K Revisited

David Kalat

December 11, 2023

The original Y2K was resolved thanks to an estimated $100 billion worth of diligent effort by dedicated computer engineers dutifully rewriting affected code behind the scenes. A similar issue will arise in 2038—the Y2K38 problem, as it were—and will yield to the same solution.

For many people today, the notion of “Y2K” feels like a joke. As the twentieth century drew to a close, many systems still depended on decades-old software code that had allocated only two digits to hold the year. At midnight on December 31, 1999, those digits were set to roll over to “00,” leaving the year ambiguous. For those too young to remember it, the very idea that anyone feared a calendar rollover might trigger a computer apocalypse seems absurd. For those of us who lived through it, the memories may feel like an embarrassing hangover. Lost in the shuffle, though, is the fundamental oddness of how computers treat dates. It is one thing, as a human, to attach some sense of emotional or spiritual significance to a date like January 1, 2000. But what does it mean when the computer thinks the date is 946684800?

There is an almost unlimited number of ways for humans to record dates. In the US, people typically write them in MMDDYYYY format, whereas in Europe the convention is DDMMYYYY. Neither format is particularly helpful when it comes to sorting events chronologically, because the year comes last; dates written out in words fare no better, since July 1, 2021 sorts alphabetically before June 1, 1978 but falls chronologically after it.

Using a YYYYMMDD format would seem a more useful way of storing dates, since it ensures they sort chronologically. A closer look, however, reveals some unexpected inefficiencies of its own: the first four characters, YYYY, change only once a year, and the month changes only every thirty days or so.
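To make the sorting problem concrete, here is a minimal sketch in Python (the three sample dates are arbitrary, chosen only for illustration): the same dates are stored once as MMDDYYYY text and once as YYYYMMDD text, then sorted as plain strings.

```python
# Illustrative sketch: sorting the same three dates as text in two conventions.
dates = [("07011978", "19780701"),   # July 1, 1978
         ("01012021", "20210101"),   # January 1, 2021
         ("12311999", "19991231")]   # December 31, 1999

mmddyyyy = sorted(d[0] for d in dates)
yyyymmdd = sorted(d[1] for d in dates)

print(mmddyyyy)  # ['01012021', '07011978', '12311999'] -- 2021 sorts first, out of order
print(yyyymmdd)  # ['19780701', '19991231', '20210101'] -- chronological order
```

The year-first format sorts correctly precisely because its most significant digits come first, which is also why those digits change so rarely.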

If the system had a fixed reference point, it would really need only to count days. Any time it was queried for the date, it could do a quick calculation of how many days had elapsed from the starting point and then work out what month and year that would be. Computers are good at doing calculations quickly; the bigger drain on system resources is having to manage inefficiently stored data.
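As a rough sketch of that idea (the choice of January 1, 1970 as the reference point here is purely illustrative), the snippet below stores a date as nothing more than a day count and reconstructs the calendar date on demand:

```python
from datetime import date, timedelta

REFERENCE = date(1970, 1, 1)   # an arbitrary but consistent starting point (illustrative)

def to_day_count(d: date) -> int:
    """Store a date as the number of days elapsed since the reference point."""
    return (d - REFERENCE).days

def from_day_count(days: int) -> date:
    """Work out which calendar date a stored day count corresponds to."""
    return REFERENCE + timedelta(days=days)

stored = to_day_count(date(2000, 1, 1))
print(stored)                   # 10957 -- days elapsed since the reference point
print(from_day_count(stored))   # 2000-01-01
```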

Most date formats in modern computer systems use some variation of this method. Time is usually stored as a single numerical value representing how much time has elapsed from some arbitrary, but consistent, starting point called an “Epoch.”

The first implementation of this method was engineered for the UNIX operating system by Ken Thompson and Dennis Ritchie. Working in 1969, Thompson and Ritchie settled on 00:00:00 January 1, 1970 as their Epoch reference point. UNIX systems would count time in seconds from that moment forward (or backward, as needed).
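That convention answers the question posed at the outset. As a minimal sketch, Python’s standard library can decode 946684800, read as seconds since the UNIX Epoch, back into a calendar date:

```python
from datetime import datetime, timezone

epoch_seconds = 946_684_800   # seconds counted from 00:00:00 January 1, 1970 (UTC)
moment = datetime.fromtimestamp(epoch_seconds, tz=timezone.utc)
print(moment)                 # 2000-01-01 00:00:00+00:00 -- midnight, New Year 2000
```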

Other operating systems later engineered epochs of their own. For instance, Microsoft NTFS timestamps are stored as a 64-bit integer counting 100-nanosecond intervals since January 1, 1601. Thanks to the broad reach of UNIX at a critical formative moment in the emerging computer age, however, the UNIX Epoch settled in as a default standard for time measurement across many different platforms and systems.
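For illustration only, the sketch below converts between the two conventions; the single ingredient needed is the offset of 11,644,473,600 seconds separating the 1601 epoch from the 1970 epoch.

```python
SECONDS_1601_TO_1970 = 11_644_473_600   # seconds between the NTFS and UNIX epochs
TICKS_PER_SECOND = 10_000_000           # NTFS counts 100-nanosecond intervals

def ntfs_to_unix(ntfs_ticks: int) -> float:
    """Convert an NTFS timestamp (ticks since 1601) to UNIX seconds (since 1970)."""
    return ntfs_ticks / TICKS_PER_SECOND - SECONDS_1601_TO_1970

def unix_to_ntfs(unix_seconds: float) -> int:
    """Convert UNIX seconds (since 1970) to an NTFS timestamp (ticks since 1601)."""
    return int((unix_seconds + SECONDS_1601_TO_1970) * TICKS_PER_SECOND)

print(unix_to_ntfs(0))                        # 116444736000000000 -- the UNIX epoch in NTFS ticks
print(ntfs_to_unix(116_444_736_000_000_000))  # 0.0 -- back again
```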

The notion of counting time from a common starting Epoch is not an invention of computer science, but the basis of human calendars. Y2K happened at the specific point in time that it did because the Gregorian calendar used in most of the world had arrived at a major rollover point. The value of “2000” for the year results from the Gregorian calendar’s way of counting an average of 365.2425 days a year from a somewhat arbitrary Epoch intended to represent, give or take, the birth of Jesus Christ (although the calendar itself was only instituted in 1582, more than fifteen centuries after that starting point).

Other cultures around the world had different ideas of what the “Y2K” New Year actually was. The Hebrew calendar celebrated the arrival of the year 5760; the Chinese lunisolar calendar anticipated 4697; in Ethiopia it was about to turn 1993.

Crucially, however, whereas humans can just keep adding new digits to expand the calendar’s reach, a computer Epoch system has a maximum value available for storing the date. The UNIX Epoch uses a signed 32-bit integer to count seconds from January 1, 1970. In mathematical terms, this means there are at most 2 to the power of 31 values available to count non-negative seconds. That is an hourglass that runs out after about sixty-eight years. At 3:14:07 AM GMT on January 19, 2038, the UNIX Epoch timestamp runs out of new values and wraps around to a negative number, which affected systems would interpret as a date back in December 1901.
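As a quick check on those figures (a sketch using ordinary date arithmetic, not any affected system’s actual code), the largest value a signed 32-bit counter can hold works out to roughly sixty-eight years of seconds and lands exactly on the 2038 deadline:

```python
from datetime import datetime, timedelta, timezone

max_signed_32 = 2**31 - 1                             # 2,147,483,647 seconds
unix_epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)

print(max_signed_32 / (365.2425 * 86_400))            # ~68.05 -- years' worth of seconds
print(unix_epoch + timedelta(seconds=max_signed_32))  # 2038-01-19 03:14:07+00:00
```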

This raises the prospect of Y2K happening all over again.

The original Y2K was resolved thanks to an estimated $100 billion worth of diligent effort by dedicated computer engineers behind the scenes dutifully rewriting affected code. The Y2K38 problem, as it were, will yield to the same solution. Before the 32-bit integer is exhausted, the relevant software needs to switch to something else, such as a 64-bit integer.

Speaking mathematically, there are 2 to the power of 63 non-negative seconds one could count using a 64-bit architecture. That provides for over 9.2 quintillion seconds, or over 292 billion years. Since that is more than twenty times the estimated age of the known universe, it is safe to say that switching to 64-bit timekeeping will not require another fix anytime soon.
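The arithmetic behind those figures is easy to verify; a brief sketch:

```python
max_signed_64 = 2**63 - 1                       # largest signed 64-bit value, read as seconds
seconds_per_year = 365.2425 * 86_400            # average Gregorian year, in seconds

print(max_signed_64)                            # 9223372036854775807 -- over 9.2 quintillion
print(max_signed_64 / seconds_per_year / 1e9)   # ~292.3 -- billions of years
```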

The views and opinions expressed in this article are those of the author and do not necessarily reflect the opinions, position, or policy of Berkeley Research Group, LLC or its other employees and affiliates.
