The problem Y2K embodied was that years were stored as 2 digits in many software systems: 1999 was abbreviated as 99. When the year rolled over to 2000, nobody was quite sure what would happen to those systems. As a result, many dates were expanded to 4 digits, or systems were re-architected to handle the transition in other ways.
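To see concretely why the rollover was dangerous, here is a minimal, illustrative sketch (the function name is invented for this example) of the kind of naive 2-digit year arithmetic many pre-Y2K systems relied on:

```python
def age_from_two_digit_years(birth_yy, current_yy):
    """Naive subtraction of 2-digit years, as many pre-Y2K systems did it."""
    return current_yy - birth_yy

# Before the rollover this works fine:
print(age_from_two_digit_years(65, 99))   # born '65, year '99 -> 34

# After the rollover the same logic breaks:
print(age_from_two_digit_years(65, 0))    # born '65, year '00 -> -65
```

The subtraction itself never changes; it is the silent assumption that both years share the same century that fails at midnight.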
Anyone who was in the IT industry in the year 2000 knows that Y2K was a significant problem. That it amounted to very little in the end was due to our diligence and the solutions put in place.
Now that we're approaching 2010 and double digits can once more be used to designate the year in a reasonably clear fashion, is it safe for new projects to utilize double digit years?
The main reason years were stored as two digits was to save storage space. Space is cheap nowadays, so that argument is far less compelling. Still, for the rare date-oriented embedded or mobile app, or for enormously large databases with many date fields, it may remain a valid consideration.
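For a sense of scale, a little back-of-the-envelope arithmetic (the row and field counts below are made up purely for illustration):

```python
# Hypothetical large table: how much do 2-digit years actually save?
rows = 1_000_000_000          # a billion-row table (assumed)
date_fields = 10              # date columns per row (assumed)
bytes_saved_per_field = 2     # "1999" (4 chars) vs. "99" (2 chars)

total_saved = rows * date_fields * bytes_saved_per_field
print(total_saved / 10**9, "GB")   # real savings, but rarely decisive today
```

Even at this exaggerated scale the savings come to about 20 GB, which mattered in the mainframe era but is trivial on modern storage.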
Y2.1K Is A Looong Way Off
The year 2100 would effectively be another Y2K for any system that utilized 2-digit years. But considering how fast technology changes, I can't imagine any data created now persisting for 90 years. Who knows what the face of technology will be then? Maybe we'll be storing our data in laser-stimulated protein chains by then.
Looking at it from the other side, some programs that were problematic during Y2K were 30+ years old and still in use, and it was somewhat difficult to find enough skilled mainframe programmers to repair them. If you develop a system in the year 2050, who's to say it won't be in use 50 years later?
Reduced Internet Traffic
There is a recent movement toward URL shortening. Shorter URLs mean very-high-traffic sites can save significant bandwidth by not transmitting full URLs in HTTP headers or in page content. By the same logic, 2-digit dates would shave a little off overall internet traffic.
If your current system utilizes 4-digit dates, it is doubtful that you have a compelling reason to re-architect your system to use 2-digit dates. Such a move would be costly and perhaps completely unworkable.
Many libraries (if not most) use 4-digit years. If you switch to 2 digits, you may have to upconvert in your code to use those libraries, or write custom libraries, creating another point of failure for your software. If you use date math heavily, 2-digit dates are probably not for you.
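The upconversion step usually means picking a pivot year and mapping each 2-digit year to a century window. A minimal sketch, assuming a pivot of 69 (the convention POSIX strptime uses for %y, where 69-99 map to the 1900s and 00-68 to the 2000s); the function name and pivot choice are this example's assumptions, not a standard API:

```python
def expand_year(yy, pivot=69):
    """Map a 2-digit year to 4 digits: pivot..99 -> 1900s, 0..pivot-1 -> 2000s."""
    if not 0 <= yy <= 99:
        raise ValueError("expected a 2-digit year")
    return 1900 + yy if yy >= pivot else 2000 + yy

print(expand_year(99))  # -> 1999
print(expand_year(10))  # -> 2010
```

Note that this just moves the ambiguity around: every pivot choice misinterprets some real-world dates (a birth year of '10 could mean 1910), which is exactly the extra point of failure described above.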
Repeating Mistakes Of The Past
Whether or not there are compelling arguments for either side of the discussion, there is always the nebulous "we did it before and look what happened" factor to consider. Generally speaking, abbreviating data that does not need to be abbreviated is rarely a good idea.
In general, it is probably a poor idea to utilize 2-digit years. There may be rare valid applications, but even those should probably be limited to greenfield projects.
I have not seen this topic elsewhere so I thought it may make for some interesting discussion. What do you think?