In my experience, I have not come across a more "established" standard than the Unix epoch. Some architectural / technological aspects of time storage were also discussed earlier in: Timestamps and time zones in Java and MySQL
I would ask why you would risk using your own convention. It is a risk because: if at some point you want to add hours to your day count (say, to order people by the time of day they were born), or more generally to measure coarser or finer-grained moments, you will have to translate your entire convention, possibly across many layers of your application, into a more general mechanism / agreement. Another (related) question: will you always measure once-in-a-lifetime events for the people in your database, or will they be able to create new events without limit? As the number of events grows, the risk of collision grows, and a day count will not serve you as well as a timestamp measured in seconds or milliseconds.
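To make that migration cost concrete, here is a minimal sketch, assuming a hypothetical custom convention of whole days since 1900-01-01; the epoch choice, the helper name, and the sample values are mine, not from the question:

```csharp
using System;

class DayCountMigration
{
    // Hypothetical custom epoch: whole days since 1900-01-01 (an assumption
    // for illustration; your own convention may differ).
    static readonly DateTime CustomEpoch =
        new DateTime(1900, 1, 1, 0, 0, 0, DateTimeKind.Utc);

    // Converting an existing day count to Unix milliseconds: every stored
    // value (and every layer that reads it) has to go through a step like this.
    static long DaysToUnixMillis(int daysSinceCustomEpoch)
    {
        DateTime utc = CustomEpoch.AddDays(daysSinceCustomEpoch);
        return new DateTimeOffset(utc).ToUnixTimeMilliseconds();
    }

    static void Main()
    {
        Console.WriteLine(DaysToUnixMillis(0));     // 1900-01-01 => -2208988800000
        Console.WriteLine(DaysToUnixMillis(25567)); // 1970-01-01 => 0
    }
}
```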
Unix time is almost ubiquitous; most programming languages have dedicated methods for obtaining it. The architecture for storing time that I will always advocate and implement in my projects is described here: http://www.currentmillis.com/tutorials/system-currentTimeMillis.html
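For instance, a small sketch of obtaining the current Unix time in milliseconds in C# (the counterpart of Java's System.currentTimeMillis()); DateTimeOffset.ToUnixTimeMilliseconds is built in from .NET Framework 4.6 / .NET Core onward, and the manual subtraction works on older runtimes:

```csharp
using System;

class CurrentMillis
{
    static void Main()
    {
        // Built-in since .NET Framework 4.6 / .NET Core:
        long millis = DateTimeOffset.UtcNow.ToUnixTimeMilliseconds();

        // Manual fallback for older runtimes:
        long manual = (long)(DateTime.UtcNow
            - new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc)).TotalMilliseconds;

        Console.WriteLine(millis);
        Console.WriteLine(manual);
    }
}
```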

As also pointed out in my answer to the question mentioned above, the advantages of storing time as milliseconds since the Unix epoch are:
- Clarity of architecture: the server side works with UTC; the client side presents time through its local time zone
- Simplicity of the database: you store a number (milliseconds), not complex data structures like DateTimes
- Efficiency of programming: most programming languages have date / time objects that can be constructed from milliseconds since the Epoch (which lets you convert to the client's time zone automatically; see the sketch after this list)
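A minimal sketch of that last point in C#; the stored millisecond value is assumed to come from your server / database:

```csharp
using System;

class ClientSideDisplay
{
    static void Main()
    {
        // Assumed to come from the server / database: Unix milliseconds (UTC).
        long storedMillis = 1234567890123L;

        // Reconstruct the moment in time, then render it in the client's zone.
        DateTimeOffset utc = DateTimeOffset.FromUnixTimeMilliseconds(storedMillis);
        DateTimeOffset local = utc.ToLocalTime();

        Console.WriteLine(utc);   // 2009-02-13 23:31:30 +00:00
        Console.WriteLine(local); // the same instant, rendered in the local zone
    }
}
```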
Since you mentioned C#, DateTime.MinValue comes to mind. It is essentially year 1: midnight, January 1, 0001.
In addition, here is some code that will give you the milliseconds since your chosen key date (whatever that is); note that 1900 is still different from the .NET "epoch" (DateTime.MinValue):
```csharp
// Milliseconds since the Unix epoch (1970-01-01)
(DateTime.UtcNow - new DateTime(1970, 1, 1)).TotalMilliseconds

// Milliseconds since the NTP epoch (1900-01-01)
(DateTime.UtcNow - new DateTime(1900, 1, 1)).TotalMilliseconds
```
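For reference, the two expressions above differ by a fixed amount: the NTP epoch (1900) precedes the Unix epoch (1970) by 2,208,988,800 seconds, so converting between them is a single subtraction. A minimal sketch; the helper name is my own:

```csharp
static class EpochConversion
{
    // Fixed offset between the NTP epoch (1900-01-01) and the Unix epoch
    // (1970-01-01): 2,208,988,800 seconds.
    const long NtpToUnixOffsetSeconds = 2208988800L;

    // Hypothetical helper: convert an NTP-era millisecond count to Unix milliseconds.
    public static long NtpMillisToUnixMillis(long ntpMillis) =>
        ntpMillis - NtpToUnixOffsetSeconds * 1000L;
}
```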