Why does TimeSpan.TotalMilliseconds have the same digits in its integer and decimal parts?

When calculating the difference in milliseconds between two DateTime objects, the number I get back always has a decimal part that matches its integer part. For example: 1235.1235.

Why is this happening? Am I doing something wrong? Is this a quirk of the language, a limitation of DateTime's granularity, or something else?

This can be demonstrated with the following code:

  DateTime then = DateTime.Now;
  Thread.Sleep(1234);
  DateTime now = DateTime.Now;
  TimeSpan taken = now - then;
  string result = taken.TotalMilliseconds.ToString(CultureInfo.InvariantCulture);
  // result = "1235.1235"

As CodesInChaos commented:

`DateTime` does not have this level of precision: see C# DateTime.Now precision

However, this does not completely explain the behavior.

c# datetime timespan
1 answer

There is a technical explanation for this; I cannot otherwise prove that it is what explains your observation. It certainly does not reproduce on my machine.

To start with, you are looking at noisy digits: the operating system clock is not accurate enough to give you millisecond precision. So make sure you never rely on this value for anything important. If you want to measure an interval with high resolution, you should use a Stopwatch instead.
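As a minimal sketch of that suggestion (this snippet is illustrative and not part of the original question; the class name is arbitrary), the same measurement with Stopwatch looks like this:

  using System;
  using System.Diagnostics;
  using System.Threading;

  class StopwatchDemo
  {
      static void Main()
      {
          // Stopwatch uses the high-resolution performance counter when one is
          // available, so it is not affected by system clock adjustments.
          Stopwatch sw = Stopwatch.StartNew();
          Thread.Sleep(1234);
          sw.Stop();
          Console.WriteLine(sw.Elapsed.TotalMilliseconds);
      }
  }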

The operating system clock is affected by updates from a time server. Most machines are configured to periodically contact time.windows.com to recalibrate the clock. This eliminates clock drift; the hardware is usually not good enough to keep time to better than about a second per month. Low-drift crystals are expensive and are never completely drift-free due to temperature and aging effects. And a leap second is inserted once in a while to keep the clock synchronized with the slowing rotation of the planet. The last one crashed a lot of Linux machines; google "Linux leap second bug" for some fun.

What matters here is what happens when your machine receives an update that requires the clock to be adjusted. Windows does not make the clock value jump suddenly; that would cause serious problems for programs that watch the clock and expect it to increase consistently by predictable amounts.

Instead, it adjusts the clock by adding a little extra time with each clock tick. In effect, the clock runs slightly slower or faster, so it gradually makes up the difference and becomes accurate again. Perhaps you can see where this is going: the extra microseconds added are proportional to the length of the interval. So seeing the interval repeated in the noise digits is plausible.
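To make that proportionality concrete (an illustrative assumption, not a figure from the answer): if the clock happened to be slewed by 100 parts per million, an interval of 1235 ms would accumulate 1235 × 0.0001 = 0.1235 ms of extra time, producing exactly the 1235.1235 pattern shown in the question.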

The only real way to prove this theory is to pinvoke GetSystemTimeAdjustment(). It returns non-zero values while the system time is being adjusted. You could then pinvoke SetSystemTimeAdjustment() to turn the adjustment off and watch whether it makes a difference in the value you see. Or just wait long enough for the clock to catch up so it no longer adjusts.
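A rough sketch of that check (the P/Invoke declaration follows the documented Win32 signature; the class and parameter names are just for illustration):

  using System;
  using System.Runtime.InteropServices;

  class ClockAdjustmentCheck
  {
      // Win32 API: reports how the system clock is currently being adjusted.
      [DllImport("kernel32.dll", SetLastError = true)]
      static extern bool GetSystemTimeAdjustment(
          out uint timeAdjustment,       // 100-ns units added per clock interrupt
          out uint timeIncrement,        // length of one clock interrupt, in 100-ns units
          out bool adjustmentDisabled);  // true = default increment, no adjustment applied

      static void Main()
      {
          uint adjustment, increment;
          bool disabled;
          if (GetSystemTimeAdjustment(out adjustment, out increment, out disabled))
          {
              // If 'disabled' is false and 'adjustment' differs from 'increment',
              // Windows is gradually correcting the clock after a time-server update.
              Console.WriteLine("Adjustment: {0}, Increment: {1}, Disabled: {2}",
                  adjustment, increment, disabled);
          }
          else
          {
              Console.WriteLine("Call failed, error " + Marshal.GetLastWin32Error());
          }
      }
  }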

