03-19-2011, 10:04 AM
Pturner

Quote:
Originally Posted by Golf-Tinker
The Y2K issue applied only to systems using a two-digit year representation. Before the year 2000, large systems (mostly COBOL) used 98 to represent the year 1998, and so on. Most systems were designed around that assumption, so when the year 2000 arrived, the year 00 would be read as 1900, throwing all financial calculations out the window.
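A minimal sketch of that failure mode, in Python rather than the original COBOL (the function name and figures are illustrative only):

Code:
def years_elapsed(current_yy, start_yy):
    # Naive subtraction on stored two-digit years, as pre-Y2K code did.
    return current_yy - start_yy

print(years_elapsed(99, 98))  # 1   -> correct in 1999
print(years_elapsed(0, 98))   # -98 -> in 2000, "00" is read as 1900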

There were two solutions: use a 4-digit year (not practical without a major software rewrite), or change the assumptions, a trick known as windowing:
Anything “50” or greater means the 20th century (i.e. 50 is 1950, 60 is 1960 and so on).
Anything less than “50” means the 21st century (i.e. 00 is 2000, 10 is 2010 and so on).
Many companies chose the latter when faced with the major cost of a complete systems rewrite.
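A sketch of that pivot scheme in Python (the real systems coded this in COBOL, and the exact pivot varied by vendor; the function name is illustrative):

Code:
def expand_year(yy, pivot=50):
    # Two-digit years at or above the pivot fall in the 1900s,
    # anything below it in the 2000s.
    return 1900 + yy if yy >= pivot else 2000 + yy

assert expand_year(98) == 1998  # 98 -> 1998
assert expand_year(60) == 1960  # 60 -> 1960
assert expand_year(0) == 2000   # 00 -> 2000
assert expand_year(10) == 2010  # 10 -> 2010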

Those who used 4-digit years (option #1) are covered, and will remain covered until we roll over to 5-digit years, assuming the systems built today are still running then. For those using option #2, however, the ghost of Y2K is not yet gone. In the year 2050, when “50” once again reads as 1950, the ghost will come back to haunt everyone.
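Repeating the illustrative expand_year from the sketch above, the window holds right up to the pivot and then wraps:

Code:
def expand_year(yy, pivot=50):
    return 1900 + yy if yy >= pivot else 2000 + yy

print(expand_year(49))  # 2049 -- the last good year
print(expand_year(50))  # 1950, not 2050 -- the ghost returns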
So, why the heck would electronic devices know or care whether 00 meant (to us) 1900, 2000 or 600?
"How would they know, why would they care?" (Please don't bother trying to fix her. She's not there.)
All my stuff used the two-digit year and didn't blink an eye when 99 became 00.
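In sketch form (a hypothetical gadget, not real firmware): anything that only stores and shows two digits never assigns a century, so 99 rolling to 00 is just another tick:

Code:
def tick_year(yy):
    # Increment a two-digit year; 99 wraps back to 00.
    return (yy + 1) % 100

year = 99
year = tick_year(year)
print("%02d" % year)  # "00" -- displayed as-is, no century math, nothing to break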

Of course I could be wrong. Fortunately for me, though, all my old hardware, software and other electronic devices took my side in the matter.