
What was the Y2K bug and why did it scare the world?



  A 90s desktop PC.
Vladimir Sukhachev / Shutterstock

Billions of dollars have been spent addressing the Y2K bug. Government, military and corporate systems were all at risk, but we survived more or less unscathed. So, was the threat even real?

How We Planted Our Own Time Bomb

In the 1950s and 1960s, representing years with two digits became the norm. One reason for this was to save space. The earliest computers had little storage capacity and only a fraction of the RAM of modern machines. Programs had to be as compact and efficient as possible. Programs were read from punch cards, which had a fixed finite width (typically 80 columns). You couldn't type past the end of the line on a punch card.

Where space could be saved, it was. An easy – and therefore common – trick was to store year values as two digits. For example, someone would enter 66 instead of 1966. Because the software treated all dates as being in the 20th century, 66 was assumed to mean 1966.

Eventually, hardware capabilities improved. There were faster processors and more RAM, and computer terminals replaced punched cards and tape. Magnetic media, such as tapes and hard drives, were used to store data and programs. By then, however, there was a large amount of existing data.

Computer technology moved on, but the functions of the departments using these systems remained the same. Even when software was revamped or replaced, the data format often stayed unchanged. Software remained in use and kept expecting two-digit years. As more data accumulated, the problem compounded. In some cases, the volume of data was enormous.

This made the data format itself something of a sacred cow. All new software had to accommodate that data, which was never converted to four-digit years.

Storage and memory limitations arise in today's systems, too. Embedded systems, such as the firmware in routers and firewalls, are naturally constrained by space.

Programmable logic controllers (PLCs), automated machinery, robotic production lines, and industrial control systems were all programmed to use a date representation that was as compact as possible.

Shortening four digits to two is a significant space saving – it's a quick way to cut your date-storage needs in half. And the more dates you hold, the greater the benefit.

The Eventual Gotcha

  A date flipboard with the year 2000.
gazanfer / Shutterstock

If you use only two digits for year values, you can't distinguish between dates in different centuries. The software was written to treat all dates as though they were in the 20th century, which gives false results when you reach the next one. The year 2000 would be stored as 00, so the program would interpret it as 1900; 2015 would be treated as 1915, and so on.
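The failure mode is easy to reproduce. Here's a minimal sketch (the function name is illustrative, not from any real system) of the 20th-century assumption baked into much old software:

```python
def expand_year(two_digit_year: int) -> int:
    """Expand a stored two-digit year, assuming the 20th century."""
    return 1900 + two_digit_year

# Dates before 2000 come out fine...
assert expand_year(66) == 1966
assert expand_year(99) == 1999

# ...but the year 2000, stored as 00, turns into 1900.
assert expand_year(0) == 1900    # should be 2000
assert expand_year(15) == 1915   # 2015 misread as 1915
```

The code never crashes; it just quietly produces dates a century off, which is exactly why the bug was so pervasive.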

At midnight on December 31, 1999, any computer – and any device with a microprocessor and embedded software – that stored and processed dates as two digits would run into this problem. Perhaps the software would accept the wrong date and carry on, producing garbage output. Or perhaps it would throw an error and keep going – or choke and crash completely.

This didn't apply only to mainframes, minicomputers, networks, and desktops. Microprocessors ran in planes, factories, power stations, missile control systems, and communication satellites. Virtually everything that was automated, electronic, or configurable contained code. The scale of the problem was monumental.

What would happen if all these systems flipped from 1999 one second to 1900 the next?

Predictably, some quarters forecast the end of days and the collapse of society. In scenes that will resonate with many during the current pandemic, some took to stockpiling essential supplies. Others dismissed the whole thing as a hoax, but it was undeniably big news. It became known as the "Millennium," "Year 2000," and "Y2K" bug.

There were secondary concerns, too. The year 2000 was a leap year, and many computers – even otherwise leap-year-savvy systems – didn't account for it. If a year is divisible by four, it is a leap year; if it is divisible by 100, it is not.

According to another (not so well-known) rule, if a year is divisible by 400, it is a leap year after all. Much of the software ever written hadn't applied that last rule, and so wouldn't recognize the year 2000 as a leap year. As a result, its behavior on February 29, 2000, was unpredictable.
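The three rules above are short enough to show side by side with the common flawed version that omits the divisible-by-400 exception:

```python
def is_leap_year(year: int) -> bool:
    """Full Gregorian rule: /4 yes, /100 no, /400 yes after all."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def is_leap_year_buggy(year: int) -> bool:
    """Flawed version missing the divisible-by-400 exception."""
    return year % 4 == 0 and year % 100 != 0

assert is_leap_year(2000) is True         # divisible by 400: leap year
assert is_leap_year_buggy(2000) is False  # buggy code says it isn't
assert is_leap_year(1900) is False        # both versions agree on 1900
```

Because 1900 and 2100 aren't leap years, the buggy version works for every year in living memory except 2000 – which is exactly why the mistake survived so long.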

In President Bill Clinton's 1999 State of the Union address, he said:

"We need every state and local government, every business, large and small, to work with us to make sure that this Y2K computer bug will be remembered as the last headache of the 20th century, not the first crisis of the 21st century."

The previous October, Clinton had signed the Year 2000 Information and Readiness Disclosure Act into law.

This Will Take Some Time

Long before 1999, governments and companies worldwide had worked hard to find solutions and implement workarounds for Y2K.

At first glance, the simplest solution seemed to be expanding the date or year field by two more digits, adding 1900 to each year value, and – ta-da! – you had four-digit years. Your old data would be preserved correctly, and new data would slot in nicely.

Unfortunately, in many cases this solution wasn't feasible due to the cost, the perceived risk to the data, and the sheer size of the task. Where it was possible, though, it was the best option: your systems would be date-safe up to 9999.

Of course, this only corrected the data. The software also had to be converted to process, calculate, store, and display four-digit years. Some creative solutions appeared that removed the need to increase the storage for years at all. Month values can't exceed 12, but two digits can hold values up to 99 – so the month value could double as a flag.

One such scheme worked as follows:

  • For a month between 1 and 12, add 1900 to the year value.
  • For a month between 41 and 52, add 2000 to the year value, then subtract 40 from the month.
  • For a month between 21 and 32, add 1800 to the year value, then subtract 20 from the month.

Of course, the programs had to be modified to encode and decode these slightly obfuscated dates, and the logic in data-verification routines had to be tweaked to accept otherwise-crazy values (like 44 for a month). Other schemes used variations on this approach. Encoding the dates as 14-bit binary numbers and storing the integer representations in the date fields was a similar bit-level approach.
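A minimal sketch of such an encode/decode pair might look like this (illustrative function names; real systems implemented this in COBOL or assembler against fixed-width record fields):

```python
def decode(month: int, yy: int) -> tuple[int, int]:
    """Decode a month-flagged date to (real month, full year)."""
    if 41 <= month <= 52:
        return month - 40, 2000 + yy   # 21st-century flag
    if 21 <= month <= 32:
        return month - 20, 1800 + yy   # 19th-century flag
    return month, 1900 + yy            # plain 1-12 month: 20th century

def encode(month: int, year: int) -> tuple[int, int]:
    """Encode (month, full year) into the two existing fields."""
    century, yy = divmod(year, 100)
    if century == 20:
        return month + 40, yy
    if century == 18:
        return month + 20, yy
    return month, yy                   # 19xx stored unchanged

assert decode(*encode(3, 2000)) == (3, 2000)
assert decode(*encode(12, 1999)) == (12, 1999)
assert encode(1, 2000) == (41, 0)      # month "41" flags the new century
```

The appeal was that the record layout – and therefore the files on disk – never had to change; only the code that read and wrote the fields did.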

Another scheme reused the six digits that stored dates but dispensed with months entirely. Instead of storing MMDDYY, these systems switched to a DDDCYY format:

  • DDD: The day of the year (1 to 365, or 366 for leap years)
  • C: A flag representing the century.
  • YY: The year.
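The same trick in sketch form, assuming a century flag of 0 for 19xx and 1 for 20xx (the article doesn't specify the flag's values, so that mapping is an assumption):

```python
import datetime

def to_dddcyy(d: datetime.date) -> str:
    """Pack a date as DDDCYY: day-of-year, century flag, 2-digit year."""
    ddd = d.timetuple().tm_yday          # 1..365, or 366 in leap years
    c = 0 if d.year < 2000 else 1        # assumed flag: 0 = 19xx, 1 = 20xx
    return f"{ddd:03d}{c}{d.year % 100:02d}"

def from_dddcyy(s: str) -> datetime.date:
    """Unpack a DDDCYY string back into a real date."""
    ddd, c, yy = int(s[:3]), int(s[3]), int(s[4:])
    year = (1900 if c == 0 else 2000) + yy
    return datetime.date(year, 1, 1) + datetime.timedelta(days=ddd - 1)

assert to_dddcyy(datetime.date(1999, 12, 31)) == "365099"
assert from_dddcyy("001100") == datetime.date(2000, 1, 1)
```

Note that this format still fits the original six columns, which is the whole point: no file or record needed to grow.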

Work-arounds were also plentiful. One method was to pick a year to serve as a pivot. If all your existing data was newer than 1921, you could use 1920 as the pivot year. Any two-digit year from 00 to 20 was taken to mean 2000 to 2020; anything from 21 to 99 meant 1921 to 1999.
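Pivot-year windowing is a one-liner, which is precisely why it was such a tempting patch. A sketch using the article's example pivot of 1920:

```python
PIVOT = 20  # two-digit years at or below this are read as 20xx

def window_year(yy: int) -> int:
    """Map a two-digit year to a full year via a pivot window."""
    return 2000 + yy if yy <= PIVOT else 1900 + yy

assert window_year(0) == 2000
assert window_year(20) == 2020
assert window_year(21) == 1921
assert window_year(99) == 1999
```

The catch is visible in the code: the window slides shut. Once real dates pass 2020, `window_year` starts wrapping them back to the 1920s – the very failure the New York parking meters hit decades later.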

These were, of course, short-term repairs. They bought you a few decades in which to implement a real fix or migrate to a newer system. Yeah, right! Unfortunately, society doesn't work that way – just look at all the COBOL applications still in widespread use.

RELATED: What Is COBOL, and Why Do So Many Institutions Rely on It?

Y2K Compliant? Prove It!

Repairing internal systems was one thing. Fixing code and distributing patches to all the customers' devices in the field was another matter entirely. And what about software development tools, such as software libraries? Had they compromised your product? Had you used development partners or suppliers for any part of your product's code? Was their code safe and Y2K compliant? Who was liable if a customer or client had a problem?

Companies found themselves in the middle of a paper storm. They fell over themselves requesting legally binding declarations of conformity from software vendors and development partners. They wanted to see your overarching Y2K preparedness plan and your system-specific Y2K code review and remediation reports.

They also wanted a statement that your code was Y2K-safe and that, if something bad did happen on or after January 1, 2000, you would accept liability and they would be indemnified.

In 1999, I was working as a development manager for a UK-based software house. We made products that interfaced with business telephone systems. Our products provided the automated call handling that professional call centers rely on daily. Our customers were major players in this field, including BT, Nortel, and Avaya. They sold our rebadged products to untold numbers of their own customers around the world.

On the backs of these giants, our software ran in 97 different countries. Thanks to different time zones, the software would also pass through midnight more than 30 times on New Year's Eve 1999! Needless to say, these market leaders felt somewhat exposed. They wanted hard evidence that our code was compliant. They also wanted to know that the methodology behind our code reviews and test suites was sound, and that the test results were repeatable. We were put through the wringer but came through with a clean bill of health. This took time and money, of course. Even though our code was compliant, we had to absorb the financial hit of proving it.

Still, we got off lighter than most. The total global cost of preparing for Y2K was estimated at $300 billion to $600 billion by Gartner, and at $825 billion by Capgemini. The U.S. alone spent more than $100 billion. It has also been calculated that thousands of man-years were devoted to addressing the Y2K bug.

The Millennium Dawns

  A commercial airplane in the sky.
Lukas Gojda / Shutterstock

Nothing beats putting your money where your mouth is. On New Year's Eve 1999, John Koskinen, chairman of the President's Council on Year 2000 Conversion, boarded a flight that would be in the air at midnight. Koskinen wanted to show the public his faith in the hugely expensive, multi-year cleanup required to get America Millennium-ready. He landed safely.

It's easy for non-techies to look back and conclude that the millennium bug was overblown, overhyped, and just a way for people to make money. Nothing happened, right? So what was all the fuss about?

Imagine there's a dam in the mountains holding back a lake. Below it sits a village. A shepherd warns the village that he has seen cracks in the dam, and that it won't last more than a year. A plan is drawn up, and work begins to stabilize the dam. The construction is finished in time, and the predicted failure date passes without incident.

Some villagers might mutter that they knew nothing was ever wrong with the dam – and look, nothing happened. It's as if they have a blind spot for the period in which the threat was identified, addressed, and eliminated.

The shepherd's Y2K equivalent was Peter de Jager, the man who brought the issue into the public consciousness in a 1993 Computerworld magazine article. He kept campaigning until the problem was taken seriously.

When the new millennium dawned, de Jager was also on a flight – from Chicago to London. And, just like Koskinen's, de Jager's flight arrived safely and without incident.

What happened?

Despite the enormous effort to keep Y2K from affecting computer systems, some cases slipped through the net. The situation the world would have faced without that effort, though, would have been unthinkable.

Aircraft didn't drop from the sky and nuclear missiles didn't launch themselves, despite the doom-mongers' predictions. Personnel at a U.S. tracking station did get a slight frisson, though, when they observed the launch of three missiles from Russia.

This, however, turned out to be a human-ordered launch of three SCUD missiles as the Russian-Chechen conflict continued to escalate. Still, it raised eyebrows and heart rates.

A handful of other minor incidents did occur despite the preparations.

The Legacy: 20 Years Later

Remember those pivot years we mentioned? They were the workaround that bought people and companies a few decades in which to implement a real solution to Y2K. Some systems that relied on that workaround are still in use today, and we've already started to see failures.

Earlier this year, parking meters in New York City stopped accepting credit card payments. This was attributed to their hitting the upper limit of their pivot-year window. All 14,000 parking meters had to be visited and updated individually.

In other words, the big time bomb caused many small time bombs.

