I am trying to measure the duration of a period with both System.nanoTime() and System.currentTimeMillis(). The longer the period, the greater the difference between the two measurements becomes.
Here is a small snippet demonstrating the problem:
public class TimeTest {
    public static void main(String[] args) throws Exception {
        long startNanos = System.nanoTime();
        long startMillis = System.currentTimeMillis();
        while (true) {
            long nowNanos = System.nanoTime();
            long nowMillis = System.currentTimeMillis();
            System.out.println((nowMillis - startMillis) - (nowNanos - startNanos) / 1000000);
            Thread.sleep(100);
        }
    }
}
When running on Mac OS with jdk 1.8.0_74, there is a clear tendency for the printed values to decrease by about 2 ms per minute. That is, at first I see only 0 and 1, but after 10 minutes the values are around -20.
I have only observed this behavior on Mac with jdk 8; I could not reproduce it on Linux or on Windows with jdk 7 or 8.
So the question is: which clock is lying? I know that nanoTime() is the one that should be used to measure duration, but in this case I'm not sure that is true.
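For reference, this is the pattern I would normally rely on for a single elapsed-time measurement (a minimal sketch; the TimeUnit conversion and the 250 ms sleep are only illustrative, not part of the test above):

import java.util.concurrent.TimeUnit;

public class ElapsedSketch {
    public static void main(String[] args) throws InterruptedException {
        // Take a single monotonic start point; nanoTime() is not tied to wall-clock time.
        long start = System.nanoTime();

        Thread.sleep(250); // stand-in for the work being timed

        long elapsedNanos = System.nanoTime() - start;
        // Convert only for display; keep the raw nanoseconds for any arithmetic.
        System.out.println("elapsed: " + TimeUnit.NANOSECONDS.toMillis(elapsedNanos) + " ms");
    }
}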
Can someone clarify this topic?