How is the virtual machine's uptime measured, compared to how the system time is measured? I know that I can get the current AVM run time by calling:
getTimer();
And I can get the current Unix system time (in milliseconds) by doing:
new Date().getTime();
I know that the Timer class and Event.ENTER_FRAME each have their drawbacks, but I figured that the two values I am comparing should stay constant relative to each other. This is how I test it:
private var _appRunTime:int;
private var _appStartTime:int;
private var _systemTime:int;
private var _systemCurrentTime:int;
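(The rest of my test code is omitted above. The following is only a minimal sketch of the kind of comparison I mean, assuming an ENTER_FRAME handler that captures both clocks at start-up and traces the difference of the two elapsed times every frame. The class name DriftTest and the handler name onFrame are just placeholders, and I store the system time as a Number because getTime() returns milliseconds since the epoch, which does not fit in an int.)

package {
    import flash.display.Sprite;
    import flash.events.Event;
    import flash.utils.getTimer;

    public class DriftTest extends Sprite {
        private var _appStartTime:int;   // getTimer() value captured at start-up
        private var _systemTime:Number;  // new Date().getTime() captured at start-up

        public function DriftTest() {
            _appStartTime = getTimer();
            _systemTime = new Date().getTime();
            addEventListener(Event.ENTER_FRAME, onFrame);
        }

        private function onFrame(e:Event):void {
            var appRunTime:int = getTimer() - _appStartTime;              // elapsed AVM time
            var systemElapsed:Number = new Date().getTime() - _systemTime; // elapsed system time
            // If both clocks ticked at exactly the same rate, this difference would
            // stay constant; in practice it grows by a few milliseconds per minute.
            trace("drift (system - AVM):", systemElapsed - appRunTime);
        }
    }
}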
I do not understand why these numbers slowly drift apart. Using this code I found that, at least on my machine, the two elapsed times diverge by about 3 milliseconds per minute, with the value derived from the system time always being the larger of the two and the value from the AVM (getTimer()) lagging behind.
Can someone explain how each of these is measured and why there is such a small but steadily growing gap between their values over time?