It depends on various aspects. When using the standard "seconds since the epoch" representation with a signed 32-bit integer, the representable range only runs from 1970 to 2038 (the "Year 2038 problem").
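A quick sketch of where that 32-bit limit falls (the constant is the maximum signed 32-bit value; the exact calendar date follows from it):

```python
from datetime import datetime, timezone

# Largest value a signed 32-bit counter of seconds-since-epoch can hold.
MAX_32BIT = 2**31 - 1

# Converting it back to a calendar date shows where 32-bit Unix time runs out.
limit = datetime.fromtimestamp(MAX_32BIT, tz=timezone.utc)
print(limit)  # 2038-01-19 03:14:07+00:00
```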
But there is also a problem with accuracy. For example, Unix time ignores leap seconds: every day is defined as having the same number of seconds. Therefore, when computing time deltas between Unix timestamps, you accumulate a small error.
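A concrete case: a leap second (23:59:60 UTC) was inserted at the end of 2016, so the two instants below are 2 real (SI) seconds apart, yet their Unix timestamps differ by only 1:

```python
from datetime import datetime, timezone

# The last second of 2016 and the first second of 2017.
before = datetime(2016, 12, 31, 23, 59, 59, tzinfo=timezone.utc)
after = datetime(2017, 1, 1, 0, 0, 0, tzinfo=timezone.utc)

# Unix time pretends the inserted leap second never happened.
print(after.timestamp() - before.timestamp())  # 1.0, though 2 SI seconds elapsed
```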
Even more important is that this representation assumes all your dates are fully known, since it has no way to store partially known dates. In practice, there are many events that you do not know to the nearest second (or even millisecond). It is therefore a useful feature if the representation lets you specify, say, day-only precision. Ideally, you store dates together with their precision information.
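One way to keep dates together with their precision is to pair the timestamp with an explicit precision marker. This is only an illustrative sketch; the names `Precision` and `ImpreciseDate` are hypothetical, not a standard API:

```python
from dataclasses import dataclass
from datetime import datetime
from enum import Enum

class Precision(Enum):
    """How much of the stored timestamp is actually known."""
    YEAR = 1
    MONTH = 2
    DAY = 3
    SECOND = 4

@dataclass
class ImpreciseDate:
    """A timestamp paired with the precision to which it is known (sketch)."""
    value: datetime
    precision: Precision

    def __str__(self):
        # Render only the part of the date that is genuinely known.
        formats = {
            Precision.YEAR: "%Y",
            Precision.MONTH: "%Y-%m",
            Precision.DAY: "%Y-%m-%d",
            Precision.SECOND: "%Y-%m-%d %H:%M:%S",
        }
        return self.value.strftime(formats[self.precision])

# An event known only to the day is not falsely reported to the second.
d = ImpreciseDate(datetime(1969, 7, 20), Precision.DAY)
print(d)  # 1969-07-20
```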
Also, let's say you create a calendar application. There is UTC, but there is also local time, and quite often you need both. When scheduling deadlines, you are of course best off working in a synchronized time, so UTC is fine there. But if you also want to avoid planning events outside of, say, 9:00-20:00 local time, you always need to store the time zone information as well. For anything that spans more than one place, you really need to include the time zone in your date representation. Assuming you can just convert all the dates you see into the current local time is pretty naive.
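A minimal sketch of that idea: the event is stored as a UTC instant, but the business-hours check needs the named time zone (the function name and the 9:00-20:00 window are illustrative assumptions, not from any particular library):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def within_local_hours(utc_instant: datetime, tz_name: str) -> bool:
    """Check whether a UTC instant falls within 09:00-20:00 in a given zone."""
    local = utc_instant.astimezone(ZoneInfo(tz_name))
    return 9 <= local.hour < 20

# 17:00 UTC on 2024-03-15, checked against three different local clocks.
meeting = datetime(2024, 3, 15, 17, 0, tzinfo=ZoneInfo("UTC"))
print(within_local_hours(meeting, "Europe/Berlin"))     # 18:00 local -> True
print(within_local_hours(meeting, "America/New_York"))  # 13:00 local -> True
print(within_local_hours(meeting, "Asia/Tokyo"))        # 02:00 local -> False
```

Without the zone name, the same UTC instant could not be validated against anyone's local working hours.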
Note that dates in SQL can lead to odd situations. One of my favorites is the following MySQL absurdity:
SELECT * FROM Dates WHERE date IS NULL AND date IS NOT NULL;
can return records with a date of 0000-00-00 00:00:00, even though this violates the common understanding of logic.
Anony-mousse