I want to measure the runtime of my Java program. When I use the following method:
long startTime = System.currentTimeMillis();
...my program...
long endTime = System.currentTimeMillis();
long totalTime = endTime - startTime;
System.out.println(totalTime);
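(As an aside, I've read that System.nanoTime() is the recommended way to measure elapsed time, since System.currentTimeMillis() can jump if the system clock is adjusted. A minimal variant using it; myProgram() here is just a hypothetical stand-in for my code:

long start = System.nanoTime();
myProgram(); // hypothetical method wrapping "...my program..."
long elapsed = System.nanoTime() - start;
System.out.println(elapsed / 1_000_000 + " ms"); // convert nanoseconds to milliseconds
)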
Either way, the result is different each time I run it, so I take a number of samples (say, 100) and average them to get a more stable value:
long total = 0;
for (int i = 0; i < 100; i++) {
    long s = System.currentTimeMillis();
    ...my program...
    long e = System.currentTimeMillis();
    total += (e - s);
}
So I get the total runtime: with the first method it is usually about 600 ms, and with the second method the total is about 30,000 ms, i.e. an average of 30000 / 100 = 300 ms per run.
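To see where the per-run difference comes from, I suppose I could also print the time for each iteration; a sketch, with myProgram() again standing in for my code:

for (int i = 0; i < 100; i++) {
    long s = System.currentTimeMillis();
    myProgram(); // hypothetical stand-in for "...my program..."
    long e = System.currentTimeMillis();
    // if the JVM needs warm-up, the first iterations should be noticeably slower
    System.out.println("iteration " + i + ": " + (e - s) + " ms");
}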
I think this may be due to the JVM: it takes some time to load the class file and convert/compile the bytecode, and the second method only pays that cost once and then just runs the code 100 times?
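If that explanation is right, running the code a few times before measuring should make a single measurement come out close to the 300 ms average. A sketch of that idea, again with the hypothetical myProgram():

// warm-up: let class loading and JIT compilation happen before timing
for (int i = 0; i < 10; i++) {
    myProgram();
}
// measure a single run after warm-up
long start = System.currentTimeMillis();
myProgram();
System.out.println((System.currentTimeMillis() - start) + " ms");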
I am not sure about my explanation, so please correct me if I am wrong.
: "" , ?
!