azurelunatic: Vivid pink Alaskan wild rose. (Default)
Azure Jane Lunatic (Azz) 🌺 ([personal profile] azurelunatic) wrote2003-04-02 04:26 pm

Development paper (hacked out beginning)



Development: Y2K
<name>




"The Y2K problem" was a household word for a while. Almost everybody who knew about computers knew that computers were going to do something bad when the year 2000 rolled around in their interal clocks. Why? Because someone, somewhere, in their infinite wisdom, had decided that the year could and should be represented by the final two digits, instead of the full four digits needed to represent the difference between 1800, 1900, 2000, 2100...

One of the early warnings of the issue came in the 1950s, when IBM worked with a group from the Church of Latter-Day Saints on a genealogy program. Discovering that the computer could not distinguish between 1800 and 1700, or 1850 and 1950, was a setback for the genealogy work. Programmer Bob Bemer found a way around it, but he realized that this “could become a crisis in all sorts of applications at the turn of the century” (Williamson).

The most typical excuse I have heard for this is "saving space," citing the legendary programmer penchant for doing things in the quickest, easiest fashion possible. You can, after all, save a whopping 1,024 bytes in a 512-record database by storing the year as two digits instead of four. "It would be just like programmers to shorten 'the year 2000 problem' to 'Y2K' -- exactly the kind of thinking that created this situation in the first place." (Meyer)
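A rough illustration of that arithmetic (the record layouts below are hypothetical, not from any real system): two characters saved per record, across 512 records, is exactly 1,024 bytes.

    #include <stdio.h>

    /* Hypothetical fixed-width records, invented purely to show the savings. */
    struct record_yy   { char name[30]; char year[2]; };   /* two-digit year  */
    struct record_yyyy { char name[30]; char year[4]; };   /* four-digit year */

    int main(void) {
        size_t records = 512;
        size_t saved = records *
            (sizeof(struct record_yyyy) - sizeof(struct record_yy));
        printf("bytes saved: %zu\n", saved);   /* 512 * 2 = 1024 */
        return 0;
    }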

Other reasons include the "We've always done it this way" attitude. My father recalled arguing with a fellow programmer about date storage: my father wanted to go with the four-digit year, but his colleague (a renowned name in the study of the aurora borealis) didn't see why they couldn't just go with the two-digit year, because they'd always done it that way and it had worked out fine.

Why had it always happened that way? In 1959, the computer language COBOL was released, with a two-byte space for the year (Williamson). It quickly became a standard language and found its way into a lot of very big systems. Early sightings of the future problem were shot down. A programmer named White, who worked with the Department of Defense in the 1960s, commented, "We worked to establish a four-digit standard in the DOD. ...The only thing they would accept was two digits. They were not flexible..."

Some programmers were told that surely any system they were writing now would be replaced by the time the two-byte year became a problem (Williamson). But as early as 1969, programs working with "long-term financial instruments" such as mortgages began having problems (Downey). By the 1990s, everyone knew about the problem with the two-byte year, though how severe its effects would be was debated. People guessed at everything from a few wrong numbers on bills to widespread technology failure and a return to the Dark Ages (Williamson).
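A rough sketch of the kind of arithmetic that tripped up those long-term instruments (the loan terms below are invented for illustration): once a maturity year computed with only two digits wraps past 99, it appears to fall before the year the loan was written.

    #include <stdio.h>

    int main(void) {
        int issued_yy  = 69;                              /* written in 1969 */
        int term_years = 35;                              /* matures in 2004 */
        int matures_yy = (issued_yy + term_years) % 100;  /* 104 % 100 = 4   */

        if (matures_yy < issued_yy)
            printf("Maturity year %02d sorts before issue year %02d!\n",
                   matures_yy, issued_yy);
        return 0;
    }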

A developer should not assume that a system will be replaced in time for obsolescence to defuse the time bombs it contains. To write software that will not have end users cursing your name fifty years down the road, the developer should try to anticipate the issues the system will face after many years in service. If the system should still be in use in the year 9999, will it be feasible to switch to the five-byte year? If a solution to a distant future problem is not readily forthcoming, it would be silly to throw time, money, and people at solving it immediately, but it would be wise to leave warnings in the documentation and source code, so that yesterday's coding standards can be untangled while they are still today's problems, before they become tomorrow's disaster.



References:

Downey, Greg. (Date). Y2K in Social Context: Timeline. Retrieved April 02, 2003 from the World Wide Web: http://www.journalism.wisc.edu/~downey/classes/y2k/timeline.html

Meyer, Steven C. (date). Y2K. In Geek Quotes, compiled by ...? Retrieved April 02, 2003 from the World Wide Web: http://nand.net/~demaria/geek_quotes.txt

Williamson, Doug. (1999, June 25). Y2K: Computer glitch came as no surprise. Naples Daily News. Retrieved April 02, 2003 from the World Wide Web: http://www.bobbemer.com/SCRIPPS.HTM





APA format: http://webster.commnet.edu/apa/apa_index.htm

good

[identity profile] popefelix.livejournal.com 2003-04-02 03:28 pm (UTC)(link)
one thing: s/"two-bit"/"two-byte"/g and s/"four-bit"/"four-byte"/g

A two bit year only goes up to 3. :)

[identity profile] boojum.livejournal.com 2003-04-02 03:47 pm (UTC)(link)
It's not just Cobol that started the two-digit concept. A lot of checks and other forms where you have to put dates used to print 19__ for the year, so you only had to write the relevant digits. There's also the 4/2/03 date format. I suspect "humans are lazy writers" is actually a bigger reason than "humans are lazy coders" for the beginning of the concept; then the lazy coders just perpetuated it. (And, to be fair, if everyone else is doing it wrong, it's often better to be consistent than to be right.)

[identity profile] godai.livejournal.com 2003-04-02 05:27 pm (UTC)(link)
The funny thing is a lot of the programs won't make it to 9999.

Because of the 2038 problem. It stems from the way the time library in a lot of systems works: it counts time from an "epoch" date like Jan 1st 1970.

So it stores the number of seconds since the epoch.

In 2038, that number will hit its maximum and flip to negative.
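A minimal sketch of that rollover, using an explicit int32_t rather than the platform's time_t just to make the 32-bit width obvious:

    #include <stdio.h>
    #include <inttypes.h>

    int main(void) {
        /* Seconds since 1970-01-01 in a signed 32-bit counter, the way a
           classic time_t holds them; the maximum value corresponds to
           2038-01-19 03:14:07 UTC.                                       */
        int32_t seconds = INT32_MAX;
        printf("before the tick: %" PRId32 "\n", seconds);
        /* Add one second via unsigned arithmetic (to sidestep signed
           overflow); the stored value wraps to the most negative number. */
        seconds = (int32_t)((uint32_t)seconds + 1u);
        printf("after the tick:  %" PRId32 "\n", seconds);  /* -2147483648 */
        return 0;
    }

One second after 2,147,483,647 seconds past the epoch, the counter wraps negative, which systems interpret as a date back in 1901.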

http://www.neosoft.com/~scholars/ws/2038.htm

and
http://www.howstuffworks.com/question75.htm

heh. such fun. :)