QTimer vs timerEvent - which one has less overhead?

I need a timer that fires every 1 ms. This report states that calling a slot can be much slower than even calling a virtual function.

But if we compare signals/slots with events, which mechanism is faster and more efficient, with less overhead: QTimer with its timeout() signal connected to a slot, or bare QObject::startTimer() / QObject::killTimer() with QObject::timerEvent()?

Will the answer to the above question be the same for Windows and Linux?

+7
2 answers

QTimer is really just a signal/slot wrapper around the QObject::startTimer() functionality, so it will undoubtedly have more overhead on all platforms: it internally reimplements QObject::timerEvent(), and its implementation of that function does nothing but emit the timeout() signal.
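
For illustration, here is a minimal sketch of the two mechanisms being compared, using Qt 4-style connect syntax; the class and slot names (SignalSlotTicker, TimerEventTicker, onTick) are my own, not taken from the question:

    #include <QObject>
    #include <QTimer>
    #include <QTimerEvent>

    // Approach 1: QTimer with its timeout() signal connected to a slot.
    class SignalSlotTicker : public QObject
    {
        Q_OBJECT
    public:
        explicit SignalSlotTicker(QObject *parent = 0) : QObject(parent)
        {
            connect(&m_timer, SIGNAL(timeout()), this, SLOT(onTick()));
            m_timer.start(1);           // 1 ms interval
        }

    private slots:
        void onTick() { /* periodic work here */ }

    private:
        QTimer m_timer;
    };

    // Approach 2: bare QObject::startTimer()/killTimer() with a timerEvent() override.
    class TimerEventTicker : public QObject
    {
        Q_OBJECT
    public:
        explicit TimerEventTicker(QObject *parent = 0) : QObject(parent)
        {
            m_timerId = startTimer(1);  // 1 ms interval; returns the timer id
        }
        ~TimerEventTicker()
        {
            if (m_timerId)
                killTimer(m_timerId);
        }

    protected:
        virtual void timerEvent(QTimerEvent *event)
        {
            if (event->timerId() == m_timerId) {
                /* periodic work here */
            }
        }

    private:
        int m_timerId;
    };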

It should be noted that QBasicTimer is a lighter wrapper around the QObject::startTimer() functionality. If you use QBasicTimer, you still need to implement QObject::timerEvent(), but it manages the timer id for you. Thus, QBasicTimer combines some of the ease of use of QTimer with the efficiency of the QObject::startTimer() mechanism.
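
A corresponding sketch with QBasicTimer (again, the class name BasicTimerTicker is mine). QBasicTimer stops itself in its destructor, so there is no explicit killTimer() call:

    #include <QObject>
    #include <QBasicTimer>
    #include <QTimerEvent>

    // QBasicTimer still requires a timerEvent() override, but it stores the
    // timer id for you and stops the timer when it is destroyed.
    class BasicTimerTicker : public QObject
    {
        Q_OBJECT
    public:
        explicit BasicTimerTicker(QObject *parent = 0) : QObject(parent)
        {
            m_timer.start(1, this);     // 1 ms interval, events delivered to this object
        }

    protected:
        virtual void timerEvent(QTimerEvent *event)
        {
            if (event->timerId() == m_timer.timerId()) {
                /* periodic work here */
            }
        }

    private:
        QBasicTimer m_timer;
    };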

+8

In fact, if you need accuracy, be aware that Qt does not guarantee that your timer will fire exactly every 1 ms.

At least up to Qt 4.7.x, timer expirations are checked (and the corresponding signals raised) inside the event loop, as part of normal event processing. In other words, they are not delivered as OS-level interrupts that preempt other work.

So you can end up with a timer that fires after 1.5 seconds if, say, three other events in your loop each take 0.5 seconds to process.
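
If you want to observe this drift on your own system, here is a rough, hypothetical sketch (QElapsedTimer requires Qt 4.7 or later; the IntervalProbe class name and the 1 ms interval are just illustrative). When other event handlers in the same loop are slow, the printed interval grows well beyond 1 ms:

    #include <QCoreApplication>
    #include <QDebug>
    #include <QElapsedTimer>
    #include <QObject>
    #include <QTimerEvent>

    // Prints the real time between 1 ms timer events.
    // (Q_OBJECT is omitted because the class declares no signals or slots.)
    class IntervalProbe : public QObject
    {
    public:
        explicit IntervalProbe(QObject *parent = 0) : QObject(parent)
        {
            m_clock.start();
            startTimer(1);              // request a 1 ms timer
        }

    protected:
        virtual void timerEvent(QTimerEvent *)
        {
            // restart() returns the elapsed milliseconds and resets the clock
            qDebug() << "elapsed since last tick:" << m_clock.restart() << "ms";
        }

    private:
        QElapsedTimer m_clock;
    };

    int main(int argc, char *argv[])
    {
        QCoreApplication app(argc, argv);
        IntervalProbe probe;
        return app.exec();
    }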

I hope my memory is not failing me: I looked at the Qt timer code a few months ago, and I can no longer remember whether timer events are processed before or after other events.

Hope this helps you a bit more.

+5
