Originally Posted by Pturner
(Post 240786)
Hi neighbor,
As you said, this is a hijack, but, as you are an expert, I can't help asking you more about Y2K. I thought it was a hoax too, for a few reasons:
- In my unknowledgeable "logic," I thought code was just a bunch of "1"s and "0"s to a computer. I didn't think computers knew or cared whether one number (1999) was higher or lower than another number (2000).
- I couldn't understand why a computer would be confused about 2000 following 1999, but not 20 following 19.
You are correct about the "1"s and "0"s, but that is internal to the process. Back in the "old days", punched cards were used to input data. Because a card was limited to 80 characters, only the last two digits of the year were used to indicate the date. So if you were born in 1950, the year entered was 50. As long as no one was born in 2000 or later, this was no problem: the dates always sorted upward or downward, and date calculations were fine. Once we reached 2000, people born in 2000 and above would land at the wrong end of the sort list and really mess up a calculation formula.

As an aside - because I started programming in 1965, I actually had problems with people born in the 1800's, but those were special programs (sales leads, for those that understand this) so I could program around it.

With 00 as the birth year, many computer programs would do a calculation and come up with either a negative number or a number outside the range of any rate tables, etc. Without changing the programs or increasing the date field to 4 digits - which is what most applications did - everything would have been screwed up. This is probably not the best explanation but I can do better in a PM. Also, I don't want to be arrested by the hijack police.
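To make the two-digit-year arithmetic above concrete, here is a minimal sketch (in Python, with hypothetical function and variable names) of how an age calculation and a sort behave when only the last two digits of the year are stored:

```python
# Sketch of the two-digit-year problem: years are stored as just
# two digits (50 means 1950), so arithmetic breaks at the rollover.

def age_two_digit(birth_yy, current_yy):
    """Age as an old program would compute it:
    current two-digit year minus birth two-digit year."""
    return current_yy - birth_yy

# Works fine through 1999: someone born in 1950 is 49 in 1999.
print(age_two_digit(50, 99))   # -> 49

# Breaks in 2000 (stored as 00): the same person comes out
# as -50 years old, a negative number outside any rate table.
print(age_two_digit(50, 0))    # -> -50

# Sorting by two-digit year puts a year-2000 birth (00) at the
# wrong end of the list, before everyone born in the 1900s:
births = [50, 65, 99, 0]       # 1950, 1965, 1999, 2000
print(sorted(births))          # -> [0, 50, 65, 99]
```

The fix most applications chose, as noted above, was widening the field to four digits, after which both the subtraction and the sort come out right.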
- As I recall, on April x, 2000 (I forget the exact date) something also was supposed to happen, because the digits for that date meant something else somewhere else in the computer code. But I couldn't understand how one line of code governed by one set of coded parameters would confuse another line of code governed by a different set of coded parameters.
I don't remember that date being any problem, but then that was 10 years ago and I have a hard time remembering last week.
- Convinced there was nothing to it, I did nothing to my home computer, nothing to my digital clocks and nothing to my appliances that had digital clocks. I didn't even change the date display on my home PC to show the 4-digit vs. 2-digit year. Come Jan. 1, 2000, all my home stuff worked the same as it had on Dec. 31, 1999 and before. Excel '97 (I think?) worked flawlessly, although I did not upgrade or run any program to correct it. Quicken '98 worked flawlessly, although I bravely and defiantly didn't upgrade or run any program to correct it. (Ok, not that brave and defiant-- I always backed it up.) Outlook, Word, and all my other non-Y2K-compliant programs worked flawlessly.
Unless you worked internally with folders and files using Windows Explorer, you probably would have not noticed any problem. Based on what I know and see, any application written for the PC uses the full date (2000 instead of 00). In EXCEL, even when the display is 09, the hidden data is 2009.
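The display-versus-storage point above can be illustrated with a short sketch (Python here, standing in for any PC application that formats dates): the same stored value can be shown with a two-digit year while all calculations use the full four-digit year.

```python
# Illustration of display vs. storage: the stored date carries the
# full four-digit year even when the display format shows only two.

from datetime import date

d = date(2009, 7, 4)

displayed = d.strftime("%y")   # what the user sees: "09"
stored = d.year                # what the software computes with: 2009

print(displayed, stored)       # -> 09 2009
```

So a cell displaying "09" is purely a formatting choice; the underlying value is unambiguous, which is why applications storing full dates sailed through the rollover.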
- Whatever that date was in April (I think there was one in February too?) when something was supposed to happen to my non-Y2K-compliant hardware, again, nothing did.
Again - I'm not aware of a February date.
So, zcaveman, any explanation you can provide demonstrating it wasn't a hoax, I would like. I realize I don't know what I don't know, and I enjoy trying to expand what I do know. Thanks.