Mysterious calculation error when multiplying by 100

The following script exhibits a very strange error. I want to check whether a value is a positive integer after shifting the decimal point, so I multiply it by 100. If I test 0.07, the script does not produce the value 7, but the value 7.000000000000001. I could round the result, but I would like to know why it is calculated this way.

<script type="text/javascript">
var isPositiveInt = function (i) {
    i = i * 100;                      // scale so e.g. 0.07 should become 7
    return ((i % 1) == 0 && i >= 0);  // integer and non-negative?
};
</script>
<a href="#" onclick="alert(isPositiveInt('0.07')); return false;">Try it out!</a>

0.05, 0.06 and 0.08 work great. But what happens with 0.07? I would be happy if someone could explain this to me.

2 answers

This is because JavaScript represents every number internally as a double-precision floating-point value. As a result, calculations can pick up tiny errors from floating-point rounding: Examples of floating point inaccuracies

One way to fix this problem is simply to round to the nearest integer after all the intermediate calculations.
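Applied to the original function, that advice could look like the sketch below. Note that simply rounding `i * 100` would make every input pass the integer check, so this version instead compares the scaled value against its rounded counterpart within a small tolerance; the `1e-9` tolerance and the `Number()` coercion are my additions, not part of the original post.

```javascript
var isPositiveInt = function (value) {
    var scaled = Number(value) * 100;  // 0.07 -> 7.000000000000001
    var rounded = Math.round(scaled);  // 7.000000000000001 -> 7
    // Accept the value if scaling left only negligible float noise.
    return Math.abs(scaled - rounded) < 1e-9 && scaled >= 0;
};

console.log(isPositiveInt('0.07'));  // true
console.log(isPositiveInt('0.071')); // false: a real third decimal place
console.log(isPositiveInt('-0.05')); // false: negative
```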


This is a matter of numerical mathematics. Computers do not work with real numbers, but with approximations of them, and those approximations lead to rounding errors such as this one.

For 0.05, 0.06 and 0.08 you are lucky: the rounding errors happen to cancel out, so the product comes out exact on your PC. That is not the case for 0.07. Note that on another PC/browser the result may be different.
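A quick console check illustrates the difference between the "lucky" and "unlucky" inputs:

```javascript
// For some inputs the scaling error cancels out exactly...
console.log(0.05 * 100);        // 5
// ...while for others a tiny residue survives:
console.log(0.07 * 100);        // 7.000000000000001
console.log(0.07 * 100 === 7);  // false
// The stored value of 0.07 itself is only a binary approximation:
console.log((0.07).toPrecision(20));
```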

See the IEEE 754 floating-point format specification for details.
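If you want to see the IEEE 754 representation directly, you can dump the raw bit pattern of a double; this helper (`doubleBits` is a hypothetical name, not a standard API) shows the 1 sign bit, 11 exponent bits and 52 mantissa bits:

```javascript
// Sketch: dump the raw IEEE 754 bit pattern of a double.
function doubleBits(x) {
    var view = new DataView(new ArrayBuffer(8));
    view.setFloat64(0, x); // big-endian by default
    var bits = '';
    for (var i = 0; i < 8; i++) {
        bits += view.getUint8(i).toString(2).padStart(8, '0');
    }
    return bits; // 1 sign bit + 11 exponent bits + 52 mantissa bits
}

console.log(doubleBits(1.0));  // exponent field 01111111111 (bias 1023)
console.log(doubleBits(0.07)); // 0.07's infinite binary fraction, truncated
```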

