I am working on a multi-user project in Java and am trying to settle on how I collect latency measurements.
My current setup sends a batch of UDP packets at regular intervals; the server receives them, timestamps them, and sends them back, and the client then calculates and records the round-trip delay. I take a number of samples and average them to estimate the latency.
Does this sound like a sensible approach for measuring client-side latency?
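To make the setup concrete, here is a minimal sketch of the idea as I understand it. It is an assumption-laden illustration, not your actual code: it runs an in-process UDP echo server on loopback (standing in for your real server), sends timestamped probes at a fixed interval, and averages the round-trip times client-side. The class and method names (`LatencyProbe`, `averageRttMillis`) are hypothetical.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.net.SocketTimeoutException;

public class LatencyProbe {
    // Sends `samples` UDP probes to an in-process echo server and returns
    // the average round-trip time in milliseconds, or -1 if no reply arrived.
    static double averageRttMillis(int samples) throws Exception {
        DatagramSocket server = new DatagramSocket(0); // echo server on a free port
        int port = server.getLocalPort();
        Thread echo = new Thread(() -> {
            byte[] buf = new byte[64];
            try {
                while (!server.isClosed()) {
                    DatagramPacket p = new DatagramPacket(buf, buf.length);
                    server.receive(p);
                    server.send(p); // echo the probe back unchanged
                }
            } catch (Exception ignored) { /* socket closed */ }
        });
        echo.setDaemon(true);
        echo.start();

        try (DatagramSocket client = new DatagramSocket()) {
            client.setSoTimeout(1000); // don't hang forever on a lost probe
            InetAddress local = InetAddress.getLoopbackAddress();
            byte[] payload = new byte[8];
            byte[] reply = new byte[64];
            long totalNanos = 0;
            int received = 0;
            for (int i = 0; i < samples; i++) {
                long start = System.nanoTime(); // client-side timestamp
                client.send(new DatagramPacket(payload, payload.length, local, port));
                DatagramPacket resp = new DatagramPacket(reply, reply.length);
                try {
                    client.receive(resp);
                    totalNanos += System.nanoTime() - start;
                    received++;
                } catch (SocketTimeoutException e) {
                    // lost probe: skip it rather than poisoning the average
                }
                Thread.sleep(20); // regular interval between probes
            }
            return received > 0 ? (totalNanos / 1e6) / received : -1;
        } finally {
            server.close();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.printf("avg RTT over 10 probes: %.3f ms%n", averageRttMillis(10));
    }
}
```

One design note worth flagging: a plain average hides jitter and tail latency, and a few lost or delayed packets can skew it, so it is common to also record min/max or percentiles alongside the mean.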