I am reading up on the min / max values of JavaScript Date objects in various implementations.
Mozilla's docs say that JavaScript supports "-100,000,000 days to +100,000,000 days" on either side of the UNIX epoch. If my math is correct, that is 8.64e15 ms on each side.
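A quick sanity check of that arithmetic (plain JavaScript, nothing implementation-specific):

    var MS_PER_DAY = 24 * 60 * 60 * 1000;           // 86,400,000 ms in a day
    var MAX_DAYS = 1e8;                             // 100,000,000 days each way
    console.log(MS_PER_DAY * MAX_DAYS);             // 8640000000000000
    console.log(MS_PER_DAY * MAX_DAYS === 8.64e15); // true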
Microsoft's MSDN says that JScript supports "approximately 285,616 years on both sides of" the UNIX epoch.
Unit tests for Google V8 check +/- 1e8 days from the epoch.
ECMAScript 5.1 puts it a bit more precisely:
Time is measured in ECMAScript in milliseconds since 01 January, 1970 UTC. In time values leap seconds are ignored. It is assumed that there are exactly 86,400,000 milliseconds per day. ECMAScript Number values can represent all integers from -9,007,199,254,740,992 to 9,007,199,254,740,992; this range suffices to measure times to millisecond precision for any instant that is within approximately 285,616 years, either forward or backward, from 01 January, 1970 UTC.
The actual range of times supported by ECMAScript Date objects is slightly smaller: exactly -100,000,000 days to 100,000,000 days measured relative to midnight at the beginning of 01 January, 1970 UTC. This gives a range of 8,640,000,000,000,000 milliseconds to either side of 01 January, 1970 UTC.
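On a conforming engine you can probe those exact limits directly; per the spec, one millisecond past either end must produce an invalid Date. The commented outputs below are what I would expect from a conforming implementation, not something I have verified everywhere:

    var LIMIT = 8.64e15;                          // 100,000,000 days in ms
    console.log(new Date(LIMIT).toISOString());   // "+275760-09-13T00:00:00.000Z"
    console.log(new Date(-LIMIT).toISOString());  // "-271821-04-20T00:00:00.000Z"
    console.log(new Date(LIMIT + 1).getTime());   // NaN -- one ms out of range is invalid
    console.log(new Date(-LIMIT - 1).getTime());  // NaN

Using getTime() for the out-of-range cases avoids the RangeError that toISOString() throws on an invalid Date.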
I am curious whether anyone knows of an implementation that does not actually support this range of "+/- 1e8 days from the epoch"?