How does JavaScript represent DateTime values internally?

I am working on a program that uses JavaScript for scripting. Like most scripting systems, it has a predefined library of native functions that scripts can call, and I have just added a new one.

The new function accepts a DateTime (the host is written in Delphi, where TDateTime is represented internally as a Double), a string, and a boolean. The last two parameters come through fine, but somewhere along the way the time value gets mangled. Instead of a recognizable DateTime, I receive 1362394800000, which makes no sense under Delphi's timestamp scheme.

Where can I find information on how JavaScript represents DateTime values, so I can figure out how to translate this into something my Delphi code can use? (This uses the Microsoft JScript engine that ships with Windows 7, in case that matters.)

+4
3 answers

JavaScript represents Date objects as the number of milliseconds since the Unix epoch (January 1, 1970, 00:00:00 UTC). This matters because most other systems and languages use whole seconds.

So, assuming Delphi is one of those seconds-based systems, you should be able to divide the number by 1000 and pass that.
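A minimal sketch of that conversion on the JavaScript side (how the seconds value would then be handed to the native function depends on your host and is not shown):

```javascript
// JavaScript stores a Date internally as milliseconds since
// 1970-01-01T00:00:00Z; getTime() exposes that raw number.
var ms = new Date().getTime();

// Divide by 1000 and truncate to get whole seconds, which is what
// most second-based systems expect.
var seconds = Math.floor(ms / 1000);
```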

+5

It is stored as the number of milliseconds since 1/1/1970 00:00:00.000 (UTC).
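As a quick sanity check, the mysterious value from the question decodes to a perfectly sensible date when treated as milliseconds since that epoch:

```javascript
// The value the asker received, interpreted as a JavaScript time value
// (milliseconds since 1970-01-01T00:00:00Z).
var mystery = new Date(1362394800000);

// toISOString() renders it in UTC: a real, plausible date, which
// confirms the millisecond interpretation.
console.log(mystery.toISOString()); // "2013-03-04T11:00:00.000Z"
```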

+1

JavaScript Date values are internally represented as milliseconds since the Unix epoch; that is the value you get back from .getTime(), which is what distinguishes the Date object from the plain number.
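Since Delphi's TDateTime is a Double counting fractional days since December 30, 1899, the JavaScript time value can also be rescaled into a TDateTime directly, rather than going through seconds. This is a sketch of the arithmetic only; 86,400,000 is milliseconds per day, and 25569 is the well-known day offset between the Delphi epoch (1899-12-30) and the Unix epoch (1970-01-01):

```javascript
// A JavaScript time value: milliseconds since 1970-01-01T00:00:00Z.
var ms = 1362394800000;

// Delphi's TDateTime counts fractional days since 1899-12-30, so:
//   days since 1970  =  ms / 86400000
//   TDateTime        =  days since 1970 + 25569
var delphiDateTime = ms / 86400000 + 25569;
```

Note that the JavaScript value is UTC-based, while TDateTime values are usually interpreted as local time, so a timezone adjustment may still be needed on the Delphi side.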

+1
