Variable acting weird (e.g. floating point)

I have this code:

    static int test = 100;
    static int Test { get { return (int)(test * 0.01f); } }

Output: 0

But this code returns something different:

    static int test = 100;
    static int Test { get { var y = (test * 0.01f); return (int)y; } }

Output: 1

I also have this code:

    static int test = 100;
    static int Test { get { return (int)(100 * 0.01f); } }

Output: 1

I am looking at the IL output, and I do not understand why C# performs this mathematical operation at compile time and produces a different result.

What is the difference between these two pieces of code? Why does storing the result in a variable first change the outcome?

1 answer

Because the compiler is tricking you. The compiler is smart enough to do some of the math itself so that it doesn't have to be done at run time, where it would be pointless work. The constant expression 100 * 0.01f is evaluated by the compiler without the float precision loss that bites you at run time.
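
Here is a minimal, self-contained sketch (not from the original post) of what this means. 0.01f is not exactly representable as a float, and the runtime is allowed to evaluate the intermediate product at higher-than-float precision, which is consistent with the 0 / 1 / 1 the asker reported; the exact output of the first line can vary by JIT and platform.

    using System;

    class PrecisionDemo
    {
        static int test = 100;

        static void Main()
        {
            // Runtime multiplication: 0.01f is really about 0.0099999998, and the
            // JIT may keep the intermediate product at higher precision, so the
            // result can be just below 1.0 and the cast truncates it to 0
            // (as the asker observed).
            Console.WriteLine((int)(test * 0.01f));

            // Storing into a float local first rounds the product to float
            // precision, which lands on exactly 1.0f, so the cast yields 1.
            float y = test * 0.01f;
            Console.WriteLine((int)y);

            // Constant operands: the compiler folds 100 * 0.01f (and the cast)
            // at compile time, so this always prints 1.
            Console.WriteLine((int)(100 * 0.01f));
        }
    }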

To prove this, try making static int test a const. You will see that the compiler then does the math for you at compile time as well. It has nothing to do with storing the result in a variable first, as in your example; the difference is run time versus compile time.
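
A sketch of that const variant, mirroring the asker's snippet: with test declared const, test * 0.01f becomes a constant expression, so the compiler folds it at compile time and the property behaves like the third snippet in the question.

    const int test = 100;
    static int Test
    {
        // test is now a compile-time constant, so test * 0.01f is folded by the
        // compiler just like the literal 100 * 0.01f, and this reliably returns 1.
        get { return (int)(test * 0.01f); }
    }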

