How to calculate latency

I created some Java code that measures latency, packet size, and throughput.

What is the equation needed to calculate latency?

This is what I am currently using, but I'm not sure if it is correct:

    // latency = packetsize / delay + bandwidth
    System.out.println("latency is " + (len * 2) / (duration + transferRateMb));

EDIT: the length is multiplied by 2 to give the correct value in bytes.

duration is the time it takes to complete the ping.

transfer rate is determined by:

    // amount of data in megabytes transferred in 1 second
    double transferRateMb = (len * 524288.0) / (duration / 1000000000.0);
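For reference, here is a minimal sketch of converting a byte count and a `System.nanoTime()` duration into a throughput figure. The helper name and the example values are mine, not from the question's code; it assumes 1 MB = 1,048,576 bytes (use 8x that denominator if you want megabits):

```java
public class ThroughputSketch {
    // Convert bytes transferred and elapsed nanoseconds into MB/s.
    static double throughputMBps(long bytes, long durationNanos) {
        double seconds = durationNanos / 1_000_000_000.0;
        return (bytes / 1_048_576.0) / seconds;
    }

    public static void main(String[] args) {
        // Example: 2,097,152 bytes (2 MB) in half a second -> 4.0 MB/s
        System.out.println(throughputMBps(2_097_152L, 500_000_000L));
    }
}
```

Keeping the unit conversion in one small method makes it easy to check the constant (1,048,576) in isolation instead of embedding it in a print statement.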

I have read various manuals and do not understand them; this is just a requirement for the project.

It will also be executed 20 times and the average value taken.
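Running the measurement 20 times and averaging could be sketched like this. The `measureOnce()` method here is a hypothetical placeholder for whatever a single ping/transfer measurement returns in nanoseconds:

```java
public class AverageLatencySketch {
    static final int RUNS = 20;

    // Placeholder: replace the body with one real ping/transfer round trip.
    static long measureOnce() {
        long start = System.nanoTime();
        // ... perform one round trip here ...
        return System.nanoTime() - start;
    }

    // Average an array of nanosecond samples.
    static double averageNanos(long[] samples) {
        long total = 0;
        for (long s : samples) {
            total += s;
        }
        return (double) total / samples.length;
    }

    public static void main(String[] args) {
        long[] samples = new long[RUNS];
        for (int i = 0; i < RUNS; i++) {
            samples[i] = measureOnce();
        }
        System.out.println("average latency: "
                + averageNanos(samples) / 1_000_000.0 + " ms");
    }
}
```

Collecting the samples into an array first (rather than only summing) also leaves them available for the max/percentile analysis suggested in the answer below.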

any ideas?

1 answer

I always measure latency. The average latency is usually not very interesting; what you really need to know is how bad latency can get, e.g. the worst 1 in 100, 1 in 1000, or worse. To do this you need to record each individual delay and sort the samples (or keep the maximum). The tail latency will be much higher than your calculated figure; it can easily be 10x.
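One way to act on this advice (a sketch of mine, not the answerer's code): keep every sample, sort them, and read off the worst cases instead of the mean:

```java
import java.util.Arrays;

public class LatencyPercentiles {
    // Returns the sample at the given percentile (0 < percentile <= 100)
    // using the nearest-rank method on a sorted copy.
    static long percentile(long[] samples, double percentile) {
        long[] sorted = samples.clone();
        Arrays.sort(sorted);
        int index = (int) Math.ceil(percentile / 100.0 * sorted.length) - 1;
        return sorted[Math.max(index, 0)];
    }

    public static void main(String[] args) {
        // Example nanosecond samples: one slow outlier dominates the tail.
        long[] nanos = {120, 110, 95, 3000, 130, 105, 115, 100, 125, 140};
        System.out.println("median: " + percentile(nanos, 50.0));
        System.out.println("99th percentile: " + percentile(nanos, 99.0));
        System.out.println("max: " + percentile(nanos, 100.0));
    }
}
```

Note how in the example data the 99th percentile (3000 ns) is far above the median, which is exactly the "easily 10x" gap the answer describes.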

In short, you cannot calculate the latency you care about; you have to measure it.

