I have this code:
static int test = 100; static int Test { get { return (int)(test * 0.01f); } }
Output: 0
But this code returns a different result:
static int test = 100; static int Test { get { var y = (test * 0.01f); return (int)y; } }
Output: 1
I also have this code:
static int test = 100; static int Test { get { return (int)(100 * 0.01f); } }
Output: 1
I looked at the generated IL, and I do not understand why C# performs this math operation at compile time and why the result comes out different.
What is the difference between these two pieces of code? Why does storing the result in a local variable change the output?
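For reference, here is a self-contained console repro of all three cases (the names TestDirect / TestViaLocal / TestConstant are just mine for illustration); the commented outputs are what I see on my machine:

using System;

class Program
{
    static int test = 100;

    // Case 1: cast the product directly -- prints 0 for me.
    static int TestDirect { get { return (int)(test * 0.01f); } }

    // Case 2: store the product in a float local first, then cast -- prints 1.
    static int TestViaLocal { get { var y = (test * 0.01f); return (int)y; } }

    // Case 3: both operands are literals, so the compiler can evaluate the
    // expression at compile time -- prints 1.
    static int TestConstant { get { return (int)(100 * 0.01f); } }

    static void Main()
    {
        Console.WriteLine(TestDirect);    // 0
        Console.WriteLine(TestViaLocal);  // 1
        Console.WriteLine(TestConstant);  // 1
    }
}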