Why is there a difference in runtime while running the same program?

Probably a dumb question. I noticed a difference in runtime when running a simple Hello World program, written in C, on a Linux machine (this is not really a language-specific question; it just happens to be C).

Program:

#include<stdio.h>
#include<time.h>

int main()
{
    clock_t begin, end;
    double time_spent;

    begin = clock();

    printf("%s", "Hello World\n");
    end = clock();
    time_spent = (double)(end - begin) / CLOCKS_PER_SEC;
    printf("%f\n", time_spent);
    return 0;
}

Output:

$ ./hello 
Hello World
0.000061
$ ./hello 
Hello World
0.000057
$ ./hello 
Hello World
0.000099 

This is tested on a quad-core machine with an average load of 0.4 and ample free memory. Although the difference is quite small, what could be the reason for this?

+4
3 answers

Your program does not have the machine to itself. The operating system is constantly switching between processes, handling interrupts, doing housekeeping, and so on, and any of that can happen between your two clock() calls.

A spread of about 0.04 ms between runs is tiny and completely normal.

In short, this is expected behaviour, not something wrong with your program.
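
If you want a steadier figure than a single run gives you, one common trick is to repeat the work many times and report the average per iteration. A minimal sketch (the ITERATIONS count of 1000 is an arbitrary choice for illustration):

#include <stdio.h>
#include <time.h>

#define ITERATIONS 1000   /* arbitrary repeat count, just for illustration */

int main(void)
{
    clock_t begin = clock();

    for (int i = 0; i < ITERATIONS; ++i)
        printf("%s", "Hello World\n");

    clock_t end = clock();
    double total = (double)(end - begin) / CLOCKS_PER_SEC;

    /* The average per call fluctuates far less than a single
       ~60-microsecond measurement, although it still includes
       whatever else the system happened to be doing. */
    printf("average per call: %f seconds\n", total / ITERATIONS);
    return 0;
}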

+6

Short answer: your program is not the only thing the machine is doing.

The computer is always doing "stuff": handling interrupts; scheduling other processes; running kernel housekeeping; flushing buffers! None of it takes exactly the same amount of time twice, and the timer interrupt alone can fire on the order of 1000 times a second. So when you measure something that lasts only a few dozen microseconds, the number is mostly noise. If you want to time something meaningfully, make it do a lot more work, for example:

unsigned i, j;
...
// Wait a LONG time!
for (i=0;i<5u;++i) { // 5 is about a minute on my machine
    for (j=0;j<~0u;++j) {
        // Twiddle thumbs!
    } // for
} // for

Even then, the elapsed time will not be exactly the same from one run to the next.
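
To see the effect in practice, you could wrap that loop in the same clock() calls as the program in the question. A rough sketch (build with -O0, otherwise the compiler will simply delete the empty loop):

#include <stdio.h>
#include <time.h>

int main(void)
{
    clock_t begin = clock();

    /* Busy-wait: roughly 5 * UINT_MAX empty iterations. */
    unsigned i, j;
    for (i = 0; i < 5u; ++i)
        for (j = 0; j < ~0u; ++j)
            ;

    printf("%f\n", (double)(clock() - begin) / CLOCKS_PER_SEC);
    return 0;
}

Run it a few times: the printed times will still differ from run to run, but the differences are a far smaller fraction of the total than in a microsecond-scale measurement.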

+1

Broadly, the variation comes from two places:

  • Factors outside your program: other processes, interrupts, and the scheduler all compete for the CPU (one way to reduce this a little is sketched below).

  • Factors inside the machine's state: (caches, branch predictors, and where things land in memory are never quite the same from one run to the next).
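
For what it's worth, here is a sketch of how you might reduce the first kind of noise a little on Linux: pin the process to a single core and time with a monotonic wall clock instead of clock(). This is only an illustration (the choice of CPU 0 is arbitrary), and it will not make the numbers identical:

#define _GNU_SOURCE          /* for CPU_ZERO, CPU_SET, sched_setaffinity */
#include <sched.h>
#include <stdio.h>
#include <time.h>

int main(void)
{
    /* Pin the process to CPU 0 so the scheduler cannot migrate it. */
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(0, &set);
    if (sched_setaffinity(0, sizeof(set), &set) != 0)
        perror("sched_setaffinity");

    /* Time with a monotonic wall clock rather than CPU time. */
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);

    printf("%s", "Hello World\n");

    clock_gettime(CLOCK_MONOTONIC, &t1);
    double elapsed = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    printf("%f\n", elapsed);
    return 0;
}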

0
