Timer overhead in a C# application

How much overhead do timers add to an application if they run constantly in the background (regardless of the interval)?

I’m not asking about the work done in the timer’s Tick handlers themselves, but about the performance impact of simply having timers running in an application where performance is paramount, and which kinds of overhead exist.

+7
c#
3 answers

Between ticks, a timer adds extremely little cost to the application. It relies on the OS's own dispatch machinery (which is active regardless of what you do), rather than, as one might intuitively assume, polling the system clock.

Apart from the added memory and context switching (both minor in this case; it shouldn't cost more than adding a button to your form), there is no additional overhead.
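To illustrate the point above, here is a minimal sketch (assuming WinForms, where `System.Windows.Forms.Timer` is driven by WM_TIMER messages in the ordinary message loop). Creating and starting the timer does not spin up a polling thread; between ticks it is just another message source:

```csharp
using System;
using System.Windows.Forms;

class TimerDemoForm : Form
{
    private readonly Timer _timer = new Timer();

    public TimerDemoForm()
    {
        _timer.Interval = 1000; // milliseconds
        // The Tick handler runs on the GUI thread via the message loop.
        _timer.Tick += (s, e) => Text = DateTime.Now.ToLongTimeString();
        _timer.Start(); // no dedicated thread is created for this timer
    }
}
```

While the timer is idle between ticks, the only cost is the pending timer registration in the OS; the application thread is free to process other messages.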

+8

The event raised by the timer executes on the thread the timer belongs to, so that thread is blocked while the handler's logic runs. This means that if the Timer belongs to the GUI layer, the Timer.Tick handler will block the GUI while it runs.

To keep the main thread responsive, I suggest using a BackgroundWorker instead, which runs its work on a separate thread.
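A minimal sketch of the BackgroundWorker approach (the `ExpensiveComputation` method and `resultLabel` control are hypothetical placeholders): the DoWork handler runs on a thread-pool thread, while RunWorkerCompleted is raised back on the GUI thread, where it is safe to touch controls.

```csharp
using System.ComponentModel;

var worker = new BackgroundWorker();

worker.DoWork += (s, e) =>
{
    // Runs on a thread-pool thread; the GUI stays responsive.
    e.Result = ExpensiveComputation(); // hypothetical long-running method
};

worker.RunWorkerCompleted += (s, e) =>
{
    // Raised on the GUI thread; safe to update controls here.
    resultLabel.Text = e.Result.ToString(); // hypothetical label
};

worker.RunWorkerAsync();
```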

0

To add to the above: timers are invaluable for GUI programming, but pretty much useless for high-performance tasks. Some problems with timers:

  • they are not millisecond-accurate (practically nothing in Windows is) - a timer fires not exactly at the scheduled time, but only after all other messages (mouse and keyboard events, control updates) have been processed, because timer messages are serialized with the other messages to and from the GUI
  • I don't know the .NET implementation, but in MFC they consume window handles
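The irregularity described above is easy to observe even without a GUI message loop. A small sketch using `System.Threading.Timer` and a `Stopwatch`: a timer asked for a 10 ms period typically fires at intervals governed by the system timer resolution (often around 15 ms on default Windows settings), not exactly every 10 ms.

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class JitterDemo
{
    static void Main()
    {
        var sw = Stopwatch.StartNew();
        long last = 0;

        // Request a tick every 10 ms and print the actual elapsed gap.
        using var timer = new Timer(_ =>
        {
            long now = sw.ElapsedMilliseconds;
            Console.WriteLine($"tick after {now - last} ms");
            last = now;
        }, null, dueTime: 10, period: 10);

        Thread.Sleep(200); // observe a handful of ticks, then exit
    }
}
```

The printed gaps illustrate that timers are fine for periodic housekeeping but not for precise scheduling.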

If you are considering a separate thread for any operation, make sure you never touch a GUI component from it. Either use Invoke(), or push the GUI updates onto a queue and drain it with a timer on the main GUI thread.
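A minimal sketch of the Invoke() pattern mentioned above (assuming WinForms; `statusLabel` is a hypothetical control on the form): the method can be called from any thread, and marshals the update to the GUI thread only when needed.

```csharp
using System;
using System.Windows.Forms;

partial class MainForm : Form
{
    // Safe to call from a worker thread or the GUI thread.
    void UpdateStatus(string message)
    {
        if (statusLabel.InvokeRequired)
        {
            // Marshal the update onto the GUI thread.
            statusLabel.Invoke(new Action(() => statusLabel.Text = message));
        }
        else
        {
            statusLabel.Text = message; // already on the GUI thread
        }
    }
}
```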

0
