Confusion about the time unit of getTickCount

In an answer to a question on Stack Overflow, and in a book (page 52), I found that the usual getTickCount/getTickFrequency combination for measuring runtime gives the time in milliseconds. However, the OpenCV site says the time is in seconds. I'm confused. Please help...

+4
2 answers

There is no room for confusion; all the links you have indicated point to the same thing.

getTickCount gives you the number of clock cycles (ticks) since a certain reference event, for example, since the machine was turned on.

    A = getTickCount()  // A = no. of clock cycles since the reference event, say 100
    process(image)      // do whatever processing you want
    B = getTickCount()  // B = no. of clock cycles since the reference event, say 150
    C = B - A           // C = no. of clock cycles spent on processing: 150 - 100 = 50

Now you want to know how many seconds these clock cycles correspond to. For that, you need to know how long a single tick takes, i.e. the clock_time_period. Once you have that, just multiply it by 50 to get the total processing time.

For this, OpenCV provides a second function, getTickFrequency(). It gives you the frequency, i.e. the number of ticks per second. Take its reciprocal to get the tick period:

    time_period = 1 / frequency

Now that you have the time_period of one tick, multiply it by 50 to get the total time in seconds.
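Putting it together, here is a minimal C++ sketch of that recipe using OpenCV's real API (process(image) from the pseudocode above is a stand-in for whatever work you want to time):

    #include <opencv2/core.hpp>
    #include <iostream>

    int main()
    {
        double t = (double)cv::getTickCount();   // ticks at the start

        // ... process(image): whatever work you want to time goes here ...

        // (ticks now - ticks at start) / (ticks per second) = seconds
        t = ((double)cv::getTickCount() - t) / cv::getTickFrequency();
        std::cout << "Elapsed time: " << t << " seconds" << std::endl;
        return 0;
    }

Note that the result is already in seconds; multiply by 1000 yourself if you want milliseconds.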

Now read all the links you mentioned again; they should make sense.

+20
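(Note: the code below uses the Win32 GetTickCount(), which already returns milliseconds elapsed since system startup. It is a different function from OpenCV's cv::getTickCount() and needs no division by a frequency. It is shown here lightly cleaned up, wrapped in a hypothetical DelayWithMessagePump() helper with the missing declarations assumed:)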
    #include <windows.h>

    // Busy-wait for wDelay milliseconds while still pumping window messages.
    void DelayWithMessagePump(DWORD wDelay)    // delay is 5000 milliseconds
    {
        DWORD dwStartTimer = GetTickCount();   // milliseconds since system start
        DWORD dwEndTimer   = GetTickCount();
        MSG uMsg;

        while ((dwEndTimer - dwStartTimer) < wDelay)
        {
            Sleep(200);
            dwEndTimer = GetTickCount();
            if (PeekMessage(&uMsg, NULL, 0, 0, PM_REMOVE) > 0)
            {
                TranslateMessage(&uMsg);
                DispatchMessage(&uMsg);
            }
        }
    }
-1
