What's happening?! Subtraction works fine until I subtract 0.1 from 0.1. I'm using Visual C# 2008 with the nonoba.com API.
Console.WriteLine("hit! " + Users[targetNum].character.health + " : " + player.character.profile.attackPower);
Users[targetNum].character.health -= player.character.profile.attackPower;
Console.WriteLine("health! " + Users[targetNum].character.health);
Output:

hit! 0.1 : 0.1
health! 1.490116E-08
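For anyone hitting the same thing: the residue comes from repeated binary rounding across the earlier hits, not from the final subtraction alone. A minimal sketch that reproduces a residue of the same magnitude - the 0.5 starting health and 0.1 attack power here are assumed values for illustration, not the game's real ones:

```csharp
using System;

class FloatDriftDemo
{
    static void Main()
    {
        // Assumed values: health starts at 0.5, each hit removes 0.1.
        float health = 0.5f;
        const float attackPower = 0.1f;

        // 0.1 has no exact binary representation, so every subtraction
        // rounds slightly; after five hits the total has drifted off zero.
        for (int i = 0; i < 5; i++)
        {
            health -= attackPower;
        }
        Console.WriteLine(health);   // tiny nonzero residue (about 1.49E-08), not 0

        // decimal stores base-10 digits exactly for values like 0.1,
        // so the same arithmetic lands on zero.
        decimal dHealth = 0.5m;
        for (int i = 0; i < 5; i++)
        {
            dHealth -= 0.1m;
        }
        Console.WriteLine(dHealth);  // exactly 0.0
    }
}
```

The same drift happens with double; decimal avoids it for base-10 amounts at the cost of slower arithmetic, which is why it is the usual pick for money-like quantities.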
Thanks, everyone - I could switch to the decimal type, since I usually add/subtract nice round numbers. For now, I've just changed the check to:
if (Users[targetNum].character.health <= 0.00001)
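A slightly more general form of that guard is a tolerance comparison pulled into a helper. This is a sketch, not part of the nonoba.com API - `IsDead` and the 1e-5f tolerance are illustrative choices (pick a tolerance larger than your worst expected rounding drift but smaller than any real health value), and it is written in C# 3.0 syntax to match Visual C# 2008:

```csharp
using System;

static class FloatCompare
{
    // Treat health as depleted once it is within a small tolerance of zero,
    // absorbing the rounding drift left by repeated float subtraction.
    public static bool IsDead(float health, float tolerance)
    {
        return health <= tolerance;
    }

    static void Main()
    {
        Console.WriteLine(IsDead(1.490116e-08f, 1e-5f)); // residue counts as dead
        Console.WriteLine(IsDead(0.1f, 1e-5f));          // real health does not
    }
}
```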
By the way, I knew this wouldn't be a bug in C# - I assumed it was either a bug in my own code or some misunderstanding on my part.
Having read all the recommended material, I'll conclude that my confusion comes from being used to ActionScript's Number type, which may be decimal rather than binary floating point - in any case, it never gave me this result.
Tags: floating-point, c#
Iain