Check the documentation. This is not a Y2K bug - it's the lack of a Y2K bug! The decision was originally made in C, was copied into Perl, apparently into JavaScript, and possibly into several other languages. At one time it seemed desirable to use two-digit years, but, to the great credit of whoever developed this interface, they had enough forethought to realize they needed to think about what would happen in 2000 and beyond, so instead of simply providing the last two digits, they provided the number of years since 1900. You could use the last two digits if you were in a hurry or wanted to be risky. Or, if you wanted your program to keep working, you could add 1900 to the result and use full four-digit years.
I remember the first time I did date manipulation in Perl. Oddly enough, I read the docs; this is apparently not a common thing. A year or two later, I was called into the office on December 31, 1999, to fix a bug discovered at the last minute in some Perl contract code I had never been involved with. It was this exact problem: the standard date call returned years since 1900, and the programmers had treated it as a two-digit year. (They assumed they would get "00" in 2000.) As a young, inexperienced programmer, it struck me that we had paid good money for "professional" work, and these people hadn't even bothered to read the documentation. That was the beginning of many years of disappointment; now I'm old and cynical. :)
In 2000, the annual Perl conference YAPC was nicknamed "YAPC 19100" after this frequently reported non-bug.
Nowadays, at least in the Perl world, it makes sense to use a standard date-handling module that works with real four-digit years. I'm not sure what might be available for JavaScript.
skiphoppy Sep 19 '08 at 0:42