The following script produces a very strange error. I want to check whether a value is a positive integer, so I multiply it by 100 to shift the decimal point away. But when I test 0.07, the script does not compute the value 7 — it computes 7.000000000000001. I could round the value, but I would like to know why it is calculated this way.
<script type="text/javascript">
var isPositiveInt = function (i) {
    i = i * 100;
    return (i % 1) === 0 && i >= 0;
};
</script>
<a href="#" onclick="alert(isPositiveInt('0.07')); return false;">Try it out!</a>
0.05, 0.06 and 0.08 work fine. So what happens with 0.07? I would be grateful if someone could explain this to me.
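The cause is that JavaScript numbers are IEEE 754 binary doubles, and 0.07 (like most decimal fractions) has no exact binary representation: the stored value is very slightly above 0.07, and multiplying by 100 magnifies that error. A sketch of the effect, plus one possible workaround (a tolerance-based check, with `1e-9` as an assumed tolerance, not part of the original code):

```javascript
// 0.07 cannot be stored exactly as a binary double; the nearest
// representable value is slightly larger than 0.07.
console.log((0.07).toFixed(20));  // shows digits beyond 0.07
console.log(0.07 * 100);          // 7.000000000000001, not 7

// One possible fix: tolerate a tiny rounding error instead of
// demanding an exact integer after scaling.
var isPositiveInt = function (v) {
    var scaled = v * 100;
    return scaled >= 0 && Math.abs(scaled - Math.round(scaled)) < 1e-9;
};

console.log(isPositiveInt('0.07'));   // true  (rounding error tolerated)
console.log(isPositiveInt('0.075')); // false (genuinely not an integer)
```

By contrast, 0.05 * 100, 0.06 * 100 and 0.08 * 100 happen to round to exactly 5, 6 and 8 after the multiplication, which is why those inputs appear to work: the representation error exists for them too, it just cancels out in this particular operation.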