The best way to count how many times per second a method is called

I have a DLL method that should be "QoSed" - this method should be called at most 100 times per second:

private static extern int ExecTrans(int connectionId); 

This method is called from only one place in the program, so that is a convenient place to apply the limit. I need a separate QoS counter for each connectionId, so ExecTrans(1) and ExecTrans(2) must go to different counters.

As a first iteration, I would like to measure how often the method is called (for each connectionId). That is, I want "live statistics". There are two approaches:

 - allow the limit to be exceeded for short periods: for example, allow "100 transactions from 0 to 1 second, 100 transactions from 1 to 2 seconds, and therefore 200 transactions from 0.5 to 1.5 seconds".
 - in any one-second interval, transactions must not exceed 100.

At the moment I don't care which of these approaches to use, but I would choose the one that needs less "utility" code. I want the QoS to add as little "extra work" as possible, because this is trading software that is sensitive to every 0.1 ms.

Regarding the first approach, I think I can use something like this (pseudo-code; maybe stats and curStats should be thread-safe):

    private int[] stats;    // statistics displayed to the user
    private int[] curStats; // statistics currently being collected

    OnOneSecondElapsed(object source, ElapsedEventArgs args)
    {
        foreach (conId in connIds)
        {
            stats[conId] = curStats[conId];
            curStats[conId] = 0;
        }
    }

    myMethod
    {
        ......
        ExecTrans(conId);
        ++curStats[conId];
        ......
    }
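
For the thread-safety part, a minimal sketch of the same idea using Interlocked could look like this (the CallStats name, the MaxConnections bound, and the timer wiring are assumptions for illustration, not part of the original code):

    using System.Threading;
    using System.Timers;

    static class CallStats
    {
        const int MaxConnections = 16;                              // hypothetical upper bound on connectionId
        static readonly int[] _curStats = new int[MaxConnections];  // counters for the current second
        static readonly int[] _stats = new int[MaxConnections];     // last full second, shown to the user

        // Hook this to a System.Timers.Timer with a 1000 ms interval.
        public static void OnOneSecondElapsed(object source, ElapsedEventArgs args)
        {
            for (int conId = 0; conId < MaxConnections; conId++)
            {
                // Atomically read and reset the live counter for this connection.
                _stats[conId] = Interlocked.Exchange(ref _curStats[conId], 0);
            }
        }

        // Call right after ExecTrans(conId).
        public static void Count(int conId)
        {
            Interlocked.Increment(ref _curStats[conId]);
        }
    }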

As for the second approach... is it possible to create a collection whose items live for exactly one second and disappear afterward? Then I would add an object to the collection on every call, but only if the collection contains fewer than 100 objects.

What do you think? I am not very familiar with the C# class library, so maybe I am missing a useful class, or maybe you can suggest a different approach.

+7
4 answers

First approach:

  • Use ConcurrentQueue<DateTime>
  • Before each request, check the queue size. If it already holds 100 or more entries, cancel the request.
  • If it holds fewer than 100, enqueue the current DateTime and execute the request.
  • In a background thread, every 0.1 seconds, dequeue records older than 1 second.

It should be quite efficient, but:

  • Since there is no lock between checking the queue count and enqueuing the timestamp, you may occasionally get slightly more than 100 requests per second.
  • Since the background thread runs every 0.1 seconds, if 100 requests arrive at the same moment, the queue can stay blocked for up to 1.1 seconds. Adjust the sleep time if necessary.

Maybe I'm wrong, but I do not think that there is an ideal solution. In principle, the more accurate the system, the greater the overhead. You must adjust the settings according to your needs.
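
A minimal sketch of this approach, with one gate per connectionId (the RateGate class and its shape are illustrative, not a known library type):

    using System;
    using System.Collections.Concurrent;
    using System.Threading;

    class RateGate
    {
        private readonly ConcurrentQueue<DateTime> _queue = new ConcurrentQueue<DateTime>();
        private readonly int _limit;

        public RateGate(int limit)
        {
            _limit = limit;
            // Background cleaner: every 0.1 s, drop timestamps older than one second.
            var cleaner = new Thread(() =>
            {
                while (true)
                {
                    DateTime ts;
                    while (_queue.TryPeek(out ts) && ts < DateTime.UtcNow.AddSeconds(-1))
                        _queue.TryDequeue(out ts);
                    Thread.Sleep(100);
                }
            }) { IsBackground = true };
            cleaner.Start();
        }

        // Returns true if the call is allowed under the limit.
        public bool TryEnter()
        {
            if (_queue.Count >= _limit)
                return false;               // over the limit, cancel the request
            _queue.Enqueue(DateTime.UtcNow);
            return true;
        }
    }

One gate per connectionId (for example in a Dictionary<int, RateGate>), and the call site becomes: if (gates[conId].TryEnter()) ExecTrans(conId);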

+5

There is a tool called a profiler that does exactly what you are looking for. You run your code under it, and it tells you exactly how much time was spent in each method and how many times each method was called. Here is an old thread about C# profilers. If you are a professional developer, your company probably already has a profiler license.

+3

In some cases you will receive calls more than n times per second; I assume that you simply do not want to do any actual processing for the extra calls.

You could use a synchronized queue to store the transactions for each connection. Calling your method would simply enqueue the information about what to do. A separate processing thread (either one for the whole system or one per connection) can then dequeue the operations and process them at a rate of one per 0.01 seconds. Just truncate the queue to a size of 100 before adding each unit of work for a given connection and, voilà, the extra work items are dropped.
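
A rough sketch of that per-connection queue, using a plain ConcurrentQueue and naive sleep-based pacing (the ConnectionWorker name and where the 100-item cap sits are illustrative assumptions):

    using System;
    using System.Collections.Concurrent;
    using System.Threading;

    class ConnectionWorker
    {
        private readonly ConcurrentQueue<Action> _work = new ConcurrentQueue<Action>();

        public ConnectionWorker()
        {
            var t = new Thread(ProcessLoop) { IsBackground = true };
            t.Start();
        }

        public void Enqueue(Action item)
        {
            // Truncate: if the backlog already holds 100 items, drop the new one.
            if (_work.Count < 100)
                _work.Enqueue(item);
        }

        private void ProcessLoop()
        {
            while (true)
            {
                Action item;
                if (_work.TryDequeue(out item))
                    item();           // e.g. the actual ExecTrans call
                Thread.Sleep(10);     // roughly one item per 0.01 s
            }
        }
    }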

Note: to guarantee one transaction per 0.01 seconds you will need an accurate timing function, e.g.:

    Stopwatch watch = Stopwatch.StartNew();
    long nextPause = watch.ElapsedMilliseconds + 10;
    while (true)
    {
        // do work (dequeue one item and process it)

        long now = watch.ElapsedMilliseconds;
        if (now < nextPause)
        {
            Thread.Sleep((int)(nextPause - now));
        }
        nextPause = watch.ElapsedMilliseconds + 10;
    }

Note: if a transaction takes more than 10 milliseconds (1/100th of a second), you may have to drop additional work items...

If you want the worker thread to be more "bursty", you could process several work items per loop and use a longer sleep time, which would require a partial sleep when a partial batch of items is left. (It would also be better to use Monitor.Pulse and Monitor.Wait instead of sleeping...)

+1

In case someone wants to measure rather than throttle... here is a naive approach that gives a rough estimate:

    using System.Diagnostics;

    class A
    {
        private int _calls;
        private Stopwatch _sw;

        public A()
        {
            _calls = 0;
            _sw = new Stopwatch();
            _sw.Start();
        }

        public void MethodToMeasure()
        {
            // Do stuff
            _calls++;
            if (_sw.ElapsedMilliseconds > 1000)
            {
                _sw.Stop();
                // Save or print _calls here before it is zeroed
                _calls = 0;
                _sw.Restart();
            }
        }
    }
0
