The Y2K 'millennium bug' is about to repeat

In 2038, many computers and cell phones may experience the same kind of error as the famous Y2K incident, in which computers misread the time and reset to the year 1900. However, computer experts believe this problem will not be as serious as Y2K, because the technology community has 24 years to fix it.

The Y2K incident (also known as the Year 2000 problem) came to light in the late 1990s, when computer experts realized that in 2000, computer clocks would read the date as 1900. The cause was that programs used only two digits to represent the year. As a result, January 1, 2000 would be stored exactly the same way as January 1, 1900.
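As a hypothetical sketch (not taken from any specific system), the snippet below shows how software that keeps only the last two digits of the year and assumes a 1900 base ends up confusing 2000 with 1900.

```c
#include <stdio.h>

/* A hypothetical sketch of the Y2K flaw: a program stores only the
 * last two digits of the year and assumes the century is 1900. The
 * digits "00" entered for the year 2000 are then read back as 1900. */
int main(void) {
    int stored_digits = 0;                  /* "00", meant to be the year 2000 */
    int interpreted_year = 1900 + stored_digits;

    printf("Stored digits: %02d -> interpreted as %d\n",
           stored_digits, interpreted_year);
    return 0;
}
```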

Before 2000, there was a great deal of speculation about the "catastrophes" the Y2K bug could cause, such as trains derailing or aircraft crashing because of malfunctioning computer systems. Some even predicted that corrupted data would cause food shortages and that nuclear missiles would launch automatically. Fortunately, experts around the world managed to fix the affected computer systems, and no significant disaster occurred when the world entered 2000.

But 24 years later, people will have to overcome a similar problem, known as the Year 2038 problem. It affects software that uses 32-bit integers to store time. In 1970, the engineers developing the first UNIX operating system decided to represent time as a 32-bit integer counting the seconds elapsed since January 1, 1970. This UNIX time encoding was later adopted by virtually every software and hardware system that needs to measure time.
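The sketch below, in C, illustrates that representation: a signed 32-bit integer holding the number of seconds since January 1, 1970 (the variable names are purely illustrative).

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

/* A minimal sketch of classic UNIX timekeeping: the current time is
 * stored as a signed 32-bit count of seconds since January 1, 1970
 * (the UNIX epoch). Variable names are illustrative only. */
int main(void) {
    time_t now = time(NULL);                 /* seconds since the epoch */
    int32_t unix_seconds = (int32_t)now;     /* truncated to 32 bits, as on older systems */

    printf("Seconds since Jan 1, 1970: %ld\n", (long)unix_seconds);
    printf("Largest storable value:    %ld\n", (long)INT32_MAX); /* 2,147,483,647 */
    return 0;
}
```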


However, a signed 32-bit integer can only represent values up to 2,147,483,647. On January 19, 2038, exactly 2,147,483,647 seconds will have elapsed since January 1, 1970. At that moment, 32-bit computer systems that keep time this way will overflow and reset their clocks back toward the 1970 starting point.
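A small sketch, again in C, shows where the counter runs out: converting 2,147,483,647 seconds after the epoch into a calendar date lands on January 19, 2038, and on two's-complement hardware one more second would wrap the counter to a large negative value.

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

/* A sketch of the overflow moment: 2,147,483,647 seconds after
 * January 1, 1970 falls on January 19, 2038 (03:14:07 UTC). */
int main(void) {
    time_t last_valid = 2147483647;   /* INT32_MAX seconds after the epoch */
    struct tm *utc = gmtime(&last_valid);

    printf("Last representable 32-bit time: %s", asctime(utc));
    /* One second later a signed 32-bit counter cannot grow any further;
     * on two's-complement hardware it wraps to this value instead: */
    printf("Value after wrap-around: %ld\n", (long)INT32_MIN);
    return 0;
}
```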

To get a feel for the number 2,147,483,647, consider this: the largest number that can be written with 1 digit is 9, and the largest with 2 digits is 99. We write numbers in base 10 (decimal), so 2 digits can represent every number from 0 up to (10x10) - 1, i.e. 99, and 3 digits can represent numbers up to (10x10x10) - 1, i.e. 999.

The binary system that computers use works the same way, except that instead of 10 digits it has only 2 (base 2), namely 0 and 1. So with a 32-bit binary number, one bit of which is reserved for the sign, the largest value that can be represented is 2 multiplied by itself 31 times, minus 1, which equals 2,147,483,647.
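The short sketch below simply spells out that arithmetic: 10^2 - 1 and 10^3 - 1 for the decimal examples above, and 2^31 - 1 for a signed 32-bit integer.

```c
#include <stdio.h>

/* Spelling out the counting argument: n decimal digits can represent
 * at most 10^n - 1, and 31 value bits (the 32nd bit holds the sign)
 * can represent at most 2^31 - 1. */
int main(void) {
    long long two_digits      = 10LL * 10LL - 1;           /* 99 */
    long long three_digits    = 10LL * 10LL * 10LL - 1;    /* 999 */
    long long thirty_one_bits = (1LL << 31) - 1;           /* 2,147,483,647 */

    printf("Largest 2-digit decimal number: %lld\n", two_digits);
    printf("Largest 3-digit decimal number: %lld\n", three_digits);
    printf("Largest signed 32-bit integer:  %lld\n", thirty_one_bits);
    return 0;
}
```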

According to Professor Jonathan Smith of the University of Pennsylvania's Department of Computer and Information Science, this is a real problem: "Most UNIX-based systems use a 32-bit clock that starts counting from January 1, 1970. So 68 years later, in 2038, that time value will overflow. At that point clocks may stop working, scheduling and appointment applications may fail, and payment transactions may not be processed correctly."

However, overcoming this problem is not technically difficult. Software and hardware systems simply need to be moved to a wider representation, such as 64 bits, to push the end date much further out. Over the past few years, many personal computers have already switched to 64-bit platforms, and many software companies have made the shift because of the nature of their work. Banks, for example, need to handle mortgage loans running more than 30 years, with dates well beyond 2038.

In fact, a 64-bit system does more than just avoid the 2038 problem: the maximum time it can store reaches roughly 292 billion years into the future, an impressive figure that should put our minds at ease. Moreover, computer experts have 24 years to fix this potential problem.
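As a rough sketch of why the fix works, the example below compares the 32-bit and 64-bit limits; dividing the 64-bit maximum by the number of seconds in a year gives a figure on the order of 292 billion years.

```c
#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>

/* A rough sketch of the fix: widening the seconds counter from 32 to
 * 64 bits. The 64-bit maximum, divided by the seconds in a year,
 * comes out to roughly 292 billion years after 1970. */
int main(void) {
    int64_t old_limit = INT32_MAX;   /* overflows on January 19, 2038 */
    int64_t new_limit = INT64_MAX;   /* the 64-bit ceiling */
    int64_t seconds_per_year = 60LL * 60 * 24 * 365;

    printf("32-bit limit: %" PRId64 " seconds\n", old_limit);
    printf("64-bit limit: %" PRId64 " seconds\n", new_limit);
    printf("64-bit limit in years: about %" PRId64 " billion\n",
           new_limit / seconds_per_year / 1000000000LL);
    return 0;
}
```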