Timekeeping in the world's computer systems is not infinite. In January 2038 it will reach its limit, like water rising to the brim of a glass. Does this mean computers are predicting the end of the world, or is it merely a technical glitch? Apocalyptic events are usually foretold by mystics, fortune tellers, astrologers, and occasionally scientists, but now, it seems, computers are predicting one on their own, without human intervention. The anticipated failure has even earned a nickname: the Epochalypse. But what does it mean for time to run out?
When you leave work, you shut down your computer (a step many skip, wasting electricity), and in the morning you revive it again, unsurprised that it shows the correct time. But this machine time will run out on January 19, 2038, at 03:14:07 UTC. Not just on your computer: everywhere, universally. From that moment on, machine time cannot be bought, taken, or borrowed. And there appears to be no mysticism involved.
Inside every device, whatever the operating system (Windows or iOS, desktop, laptop, or smartphone), there is a clock: a small chip that counts off the seconds. It ticks day and night, through every season, kept alive by a battery. It can sync with the correct time over the network (as phones do) or run offline (as older computers did). But ultimately, none of this will matter.
What truly matters is this: all computer clocks in the world count seconds from midnight UTC on January 1, 1970. It's not as if the internal counter starts from zero when you buy a new laptop. New device or old, the count always runs from that same historic date.
Computers convert the seconds into years, months, weeks, and days, and display the date in a format we recognize. Internally, though, they track only seconds. Even a machine manufactured yesterday calculates today's date from the seconds elapsed since that same 1970 starting point.
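To make this concrete, here is a minimal sketch of that bookkeeping in C, using nothing beyond the standard library: the machine holds one bare count of seconds and derives the human-readable date from it on demand.

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    /* The machine's entire notion of "now": a single count of
       seconds elapsed since January 1, 1970, 00:00:00 UTC. */
    time_t now = time(NULL);

    /* gmtime() turns the bare count into calendar fields (UTC);
       strftime() renders them in a familiar format. */
    char buf[64];
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", gmtime(&now));
    printf("%lld seconds -> %s\n", (long long)now, buf);
    return 0;
}
```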
This internal count is stored as a signed 32-bit number, which means it can reach approximately 2.15 billion seconds (exactly 2,147,483,647) and no further. Interestingly, on September 9, 2001, a billion seconds had already elapsed since January 1, 1970, though that milestone passed largely unnoticed.
What computers will do at the limit is less mysterious than it sounds. When a signed 32-bit counter overflows, it wraps around to its most negative value, which corresponds to December 13, 1901, more than 2 billion seconds before January 1, 1970. Some systems will display that date; others will simply crash or show an error screen.
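The wraparound is easy to demonstrate. Here is a small sketch, assuming a two's-complement machine (in practice, all of them), showing the last representable second and what one more tick produces when time is kept in 32 bits:

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

static void show(const char *label, int32_t t32) {
    char buf[64];
    time_t wide = t32;  /* widen for printing on a modern system */
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", gmtime(&wide));
    printf("%s %11ld -> %s\n", label, (long)t32, buf);
}

int main(void) {
    int32_t last = INT32_MAX;  /* 2,147,483,647 */
    show("last second:", last);          /* 2038-01-19 03:14:07 UTC */

    /* One more tick: done in unsigned arithmetic to keep the wrap
       well-defined, then reinterpreted as a signed 32-bit value. */
    int32_t wrapped = (int32_t)((uint32_t)last + 1u);
    show("one tick on:", wrapped);       /* 1901-12-13 20:45:52 UTC */
    return 0;
}
```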
Have we been through this?
Perhaps we haven't alarmed you enough. After all, the infamous Y2K problem, so often mocked today, ended well, though at the time no one was laughing. It is commonly said that computer scientists fabricated the crisis to extract money from companies. That isn't true.
Programming conventions established in the 1970s (before the era of personal computers, but when computers were already powerful and programming well advanced) dictated coding the year by its last two digits: 1991, for instance, was stored as '91'. That this scheme would fail after the year 2000 was not seriously considered; it seemed too distant, a problem for the future to solve. But as the new millennium approached, a swift fix became imperative. Repairing the issue cost companies worldwide a staggering $300 billion (in the dollars of the day).
Computer experts set out to revise the code. The task was straightforward but monotonous. To conserve precious memory, dates had been recorded in six digits (e.g., August 20, 1980, was '082080') rather than the eight digits actually needed ('08201980'). The challenge was to work the missing two digits back into systems everywhere.
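Where storage could not be widened, a second repair was common (not mentioned above, but widely used): so-called windowing, which interprets two-digit years relative to a pivot. A minimal sketch, with an illustrative pivot of 70:

```c
#include <stdio.h>

/* "Windowing": two-digit years below the pivot are read as 20xx,
   the rest as 19xx. The pivot value (70) is illustrative only. */
static int expand_year(int yy) {
    return (yy < 70) ? 2000 + yy : 1900 + yy;
}

int main(void) {
    /* August 20, 1980, stored six digits wide as MMDDYY: "082080" */
    int mm = 8, dd = 20, yy = 80;
    printf("%02d%02d%02d -> %02d/%02d/%04d\n",
           mm, dd, yy, mm, dd, expand_year(yy));  /* 08/20/1980 */
    return 0;
}
```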
On December 31, 1999, John Koskinen, chair of the U.S. President's Council on Year 2000 Conversion, boarded a flight with journalists to regions where the year 2000 had already arrived. Apart from minor incidents, such as a heating-system failure in South Korea (where heating was computer-controlled even then), major disruptions were avoided.
The Year 2000 problem was manageable: it posed no mystery, only a great deal of tedious work. The Year 2038 problem is different. Two decades ago the fix meant correcting digits; this one demands genuine understanding, and engineering time diverted from other tasks.
Yet is this truly a concern? Spring-driven clocks will keep ticking. Time as a physical phenomenon will carry on unchanged; no singularity will occur. And still, yes, it is a problem, for the following reasons.
The entire system requires a transformation
What computers count is known as "Unix time," and its 1970 starting point as the "Unix epoch." Unix is a venerable family of operating systems, developed in the early 1970s by Ken Thompson and Dennis Ritchie at Bell Labs, and a 32-bit second counter sat at the core of its timekeeping.
The fix that suggests itself is a switch to 64 bits. A simple calculation shows that a 64-bit counter's seconds would last roughly 292 billion years, more than twenty times the current age of the universe.
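The arithmetic behind that figure is easy to check; a minimal sketch:

```c
#include <stdio.h>

int main(void) {
    /* Capacity of a signed 64-bit second counter: 2^63 - 1. */
    const double max_seconds = 9223372036854775807.0;
    const double secs_per_year = 365.25 * 24 * 60 * 60;  /* ~31,557,600 */

    /* Roughly 292 billion years before it overflows. */
    printf("~%.0f billion years\n", max_seconds / secs_per_year / 1e9);
    return 0;
}
```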
But the transition is not as simple as opening the back cover and swapping a 32-bit chip for a 64-bit one. A great deal of code, file formats, and settings are built around the 32-bit convention. This is not 1999, when the problem could be solved by adding two digits. Still, every problem has a solution.
New operating systems would have to be written: a different Windows, a different iOS, different everything. New hardware would have to be produced, since not every computer can physically support a 64-bit architecture, and then sold or given to the billions of people who need it. Programmers and network administrators would have to be trained, with clear instructions in hand. All the world's computers would have to be stopped at once, with planes landed and ships and trains brought to a halt; then the old systems removed, the new ones installed, everything switched back on, and life resumed, accepting that errors will occur.
Is that realistic? On this planet, as it currently runs, you understand that it is not.
Let’s ask the stars
If you own a modern computer, its operating system almost certainly supports 64 bits, and its counter will keep ticking past 2038. But will it still display the years and months? The seconds must still be converted into dates, and computer scientists are not optimistic: even on a 64-bit system, much of the machinery that turns seconds into years still follows the 32-bit convention. It's like gasoline with additives or milk with boosted fat content: the numbers look better, but the substance underneath is unchanged. Naturally, this is a simplification, so experts, please hold your kilobits.
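Here is the kind of trap that hides under a 64-bit system. The record format below is hypothetical, invented for illustration, but typical of formats designed in the 32-bit era: the machine's clock is wide, yet the stored value is silently truncated the moment it is written.

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

/* A hypothetical on-disk record: the timestamp field stays 32 bits
   regardless of how wide time_t is on the machine writing it. */
struct record {
    int32_t created;  /* seconds since 1970, frozen at 32 bits */
};

int main(void) {
    time_t after_2038 = 2147483648LL;  /* one second past the limit */

    struct record r;
    r.created = (int32_t)after_2038;   /* silent truncation on store */

    char buf[32];
    time_t readback = r.created;       /* widened again when read */
    strftime(buf, sizeof buf, "%Y-%m-%d", gmtime(&readback));
    printf("stored date: %s\n", buf);  /* 1901-12-13, not 2038 */
    return 0;
}
```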
Every problem has a solution, and this one is no exception. Many operating systems are devising and deploying hybrid workarounds for the incompatibility. Something, after all, is better than nothing.
Certain techniques merit discussion. The creators of the OpenVMS operating system, for instance, borrowed a trick from astronomy. Did you know that astronomers faced and solved a similar problem long ago? They frequently need the time elapsed between two dates, say, January 3, 675, and September 13, 2024. So every day, starting from a point in the distant past (January 1, 4713 BC), receives a sequential number. Once kept in printed tables, these numbers are now computed digitally. To find the interval between two dates, you take their two numbers and subtract one from the other.
The year 4713 BC was chosen because the 16th-century scholar Joseph Scaliger noticed that three calendrical cycles, of 15, 19, and 28 years, all begin together at that date. Their product, 15 × 19 × 28 = 7,980 years, gave him a period long enough to assign a unique day number to any moment up to AD 3267, without any actual counting of historical days.
The developers of OpenVMS found this approach ingenious. They selected November 17, 1858, the day that begins at Julian day 2,400,000.5, as the zero point of their system. By dropping the leading digits of the Julian day number, they could fit it into a 32-bit field for some seven centuries, with room left over for hours, minutes, and seconds. After later adjustments, the system can count time all the way to the year 31,086.
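The astronomers' trick is easy to reproduce. A standard integer-arithmetic formula converts a calendar date to its Julian day number (treating both dates as proleptic Gregorian, for simplicity); the span between the two example dates above is then a single subtraction:

```c
#include <stdio.h>

/* Julian day number for a (proleptic) Gregorian calendar date,
   using a standard integer-arithmetic formula. */
static long jdn(int y, int m, int d) {
    long a  = (14 - m) / 12;     /* 1 for January/February, else 0 */
    long yy = y + 4800 - a;
    long mm = m + 12 * a - 3;
    return d + (153 * mm + 2) / 5 + 365 * yy
             + yy / 4 - yy / 100 + yy / 400 - 32045;
}

int main(void) {
    long a = jdn(675, 1, 3);    /* January 3, 675     */
    long b = jdn(2024, 9, 13);  /* September 13, 2024 */
    printf("days between: %ld\n", b - a);  /* 492966 */
    return 0;
}
```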
All in all, the computer scientists of 1970 should have consulted the astronomers. They gambled that the far future would never arrive, and the cost of that gamble has now reached astronomical proportions.
Does NASA possess undisclosed knowledge?
What if our rational explanations are entirely incorrect? What if computers, as electronic devices, have designated 2038 as the end of time for a specific reason? Could it be that time will cease to exist? Are all these explanations merely attempts to rationalize an inevitable event?
Conspiracy theories abound, and one such theory exists regarding the year 2038. If one searches for “the end of the world in 2038,” numerous results appear, some even suggesting NASA has predicted this apocalyptic event.
NASA recently ran a simulation to assess how Earth might respond to an asteroid collision; the scenario had a large asteroid striking on July 12, 2038. Given NASA's history of less-than-clear communication, some people took this for a prediction of an actual impact in 2038, and NASA had to issue clarifications.
This led to speculation that computers, in refusing to count beyond a certain date, must know something. And NASA's involvement seemed, to some, to confirm that a supercomputer had predicted a catastrophic asteroid impact in 2038, the end of time.
This conspiracy theory, of course, holds no merit. Yet it leaves a question: why did NASA pick that particular year for its simulation? Mere coincidence, or is there more to the 2038 problem than bits and seconds? Either way, those bits and seconds are not to be underestimated.