I'm seeing something strange when storing doubles in a Dictionary, and I don't understand why.
Here is the code:
    Dictionary<string, double> a = new Dictionary<string, double>();
    a.Add("a", 1e-3);

    if (1.0 < a["a"] * 1e3)
        Console.WriteLine("Wrong");

    if (1.0 < 1e-3 * 1e3)
        Console.WriteLine("Wrong");
The second if statement behaves as expected: 1.0 is not less than 1.0, so nothing is printed. The first if statement, however, evaluates to true. The strange part is that when I hover over the condition while debugging, IntelliSense tells me it is false, yet execution still steps into the Console.WriteLine.
This is C# on .NET 3.5 in Visual Studio 2008.
Is this a floating-point precision issue? If so, why does the second if statement work as expected? I feel like I'm missing something very fundamental here.
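In case the Dictionary is a red herring, here is a smaller sketch of the same contrast. I'm assuming any double that isn't a compile-time constant behaves like the dictionary lookup, but I haven't verified that:

    double x = 1e-3;                      // value not known to the compiler as a constant
    Console.WriteLine(1.0 < x * 1e3);     // same shape as the first if statement
    Console.WriteLine(1.0 < 1e-3 * 1e3);  // pure literals, like the second if statement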
Any insight is appreciated.
Edit2 (slightly changed the question):
I can accept the mathematical accuracy problem, but my question now is: why does the debugger hover evaluate it correctly? The same is true of the Immediate Window: if I paste the condition from the first if statement into the Immediate Window, it evaluates to false.
Update
First of all, many thanks for all the great answers.
I am also having trouble reproducing this in another project on the same machine. Looking at the project settings, I see no differences. Looking at the IL of the two projects, I see no differences. Looking at the disassembly, I see no visible differences (other than memory addresses). However, when I debug the original project, I see this:
The debugger window reports that the if condition is false, yet the code falls into the conditional block anyway.
In any case, I think the best answer is to allow for floating-point imprecision in situations like this. The reason I couldn't let this go at first was that the debugger's calculations differed from what happened at run time. Many thanks to Brian Gideon and Stephentiron for some very insightful comments.
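For anyone landing here later, this is the kind of tolerance-based comparison I ended up leaning towards. The helper name and the 1e-9 epsilon are just placeholders I picked for the example, not universal constants, so tune them to your own data:

    // Sketch of a tolerance-based comparison instead of a raw < on doubles.
    static bool DefinitelyLessThan(double left, double right, double epsilon)
    {
        // Treat values within epsilon of each other as "not less than".
        return left < right - epsilon;
    }

    // Usage with the original dictionary:
    // if (DefinitelyLessThan(1.0, a["a"] * 1e3, 1e-9)) Console.WriteLine("Wrong");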