Probably a dumb question. I noticed a difference in runtime between runs of a simple Hello World program written in C on a Linux machine (this is not really language-specific, though).
Program:
#include <stdio.h>
#include <time.h>

int main(void)
{
    clock_t begin, end;
    double time_spent;

    begin = clock();
    printf("%s", "Hello World\n");
    end = clock();

    time_spent = (double)(end - begin) / CLOCKS_PER_SEC;
    printf("%f\n", time_spent);
    return 0;
}
Output:
$ ./hello
Hello World
0.000061
$ ./hello
Hello World
0.000057
$ ./hello
Hello World
0.000099
This was tested on a quad-core machine with an average load of 0.4 and plenty of free memory. The differences are quite small, but what could be the reason for them?
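For reference, here is a minimal variation of the same measurement that repeats the timed printf in a loop (the count of 1000 is just an arbitrary choice), in case averaging over more iterations makes the spread easier to see:

#include <stdio.h>
#include <time.h>

int main(void)
{
    clock_t begin, end;
    double time_spent;
    int i;

    begin = clock();
    /* Same printf as above, repeated so the timed region is longer. */
    for (i = 0; i < 1000; i++)
        printf("%s", "Hello World\n");
    end = clock();

    /* Average CPU time per printf call, in seconds. */
    time_spent = (double)(end - begin) / CLOCKS_PER_SEC / 1000;
    printf("%f\n", time_spent);
    return 0;
}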