Your mistake is that you think the result is 0.123, whereas it is actually 0.
C# (and many other C-like languages) specifies that operations on two integers always return an integer, so 1 + 1 returns 2, not 2.0, and 3 / 2 returns 1, not 1.5. The fractional part is simply discarded, so the result is always rounded toward zero (i.e., rounded down for positive results and rounded up for negative results).
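For instance, here is a minimal C# sketch of that truncation (the variable names and values are purely illustrative):

// Integer division discards the fractional part, rounding toward zero
int positive = 7 / 2;    // 3, not 3.5
int negative = -7 / 2;   // -3, not -4 (toward zero, not downward)
Console.WriteLine(positive);   // prints 3
Console.WriteLine(negative);   // prints -3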
Although this is perhaps a little counterintuitive, the main reasons for it are the simplicity of the language/compiler, the speed of execution (there is no need to determine the type of the result at run time), the ability to use operators such as /= (which would not work if the result were of a different type), and historical heritage and inertia.
To solve this problem, you need to make at least one of your operands a floating-point number (the other will automatically be converted to match, and so will the result):
// either
if (123.0 / 1000 > 0)
// or
if (123 / 1000.0 > 0)
// or
if (123.0 / 1000.0 > 0)
If you have variables, you may need a typecast (since you can't just append .0 :-)):
if ((double)a / b > 0)
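Putting it together, a hedged sketch with variables (a and b are illustrative names matching the cast above):

int a = 123;
int b = 1000;
Console.WriteLine(a / b);           // 0     — integer division, fraction discarded
Console.WriteLine((double)a / b);   // 0.123 — one operand cast to double, so the result is double
if ((double)a / b > 0)
{
    Console.WriteLine("greater than zero");
}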
And the usual recommendation applies here: when programming, rarely trust your intuition, because computers are strange machines, and so are programming languages, even seemingly familiar ones. Printing the result somewhere, or assigning it to a variable and checking it in the debugger, would have shown that your expectation was not met :-)