Many threads have been started out of confusion about how Math.Round works. For the most part, they are answered by pointing people to the MidpointRounding parameter and noting that most people actually expect MidpointRounding.AwayFromZero. I have one more question, though, about the actual algorithm implemented by AwayFromZero.
Given the following number (the result of a series of calculations): 13.398749999999999999999999999M
our users expect to see the same result Excel gives them: 13.39875. Since we currently round this number to 4 decimal places using Math.Round(num, 4, MidpointRounding.AwayFromZero), the result is off by .0001 from what they expect. Presumably, this is because the algorithm only looks at the fifth decimal digit (the 4) and rounds accordingly. If you instead start rounding from the last 9 and work inward, the "real math" answer actually ends up matching Excel's.
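To make the discrepancy concrete, here is a minimal sketch (the 13.3988 figure is what our users expect, based on the Excel comparison above):

    decimal num = 13.398749999999999999999999999M;

    // AwayFromZero looks only at the first dropped digit (the 4 in the
    // fifth decimal place), so the value rounds down:
    Console.WriteLine(Math.Round(num, 4, MidpointRounding.AwayFromZero)); // 13.3987

    // Excel, working from 13.39875, would instead round this to 13.3988.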
So the question is: is there a way to imitate this behavior rather than the current one?
I wrote a recursive function that we could use in the meantime, but before we put it into production, I wanted to see what SO thought of the problem :-)
    // Round num to the given precision by first rounding at full decimal
    // precision (28 places) and then re-rounding one place at a time,
    // so that trailing 9s cascade inward before the final round.
    private decimal Round(decimal num, int precision)
    {
        return Round(num, precision, 28);
    }

    private decimal Round(decimal num, int precision, int fullPrecision)
    {
        if (precision >= fullPrecision)
            return Math.Round(num, precision);
        return Round(Math.Round(num, fullPrecision), precision, fullPrecision - 1);
    }
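For reference, here's how I'd expect it to behave on the value above (a quick sketch, not production code):

    decimal num = 13.398749999999999999999999999M;

    // Rounding at 26 places turns the trailing 9s into 13.39875000...,
    // and the final round to 4 places then gives 13.3988, matching Excel.
    Console.WriteLine(Round(num, 4)); // 13.3988

One thing worth double-checking before production: the final Math.Round call uses the default MidpointRounding.ToEven. For this value the midpoint ...7|5 happens to round up to the even digit 8 anyway, but for true away-from-zero behavior the overload taking MidpointRounding.AwayFromZero would be safer.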
Edit: for clarity, I should have been clearer in my original post. The rounding methodology proposed here comes up when I talk to business analysts and users who report a rounding error. Despite being told repeatedly that the current result isn't wrong, just different from what they expect ... these reports keep coming in. So right now I'm just gathering as much information as possible on this topic to educate users.
In this case, it seems that each of the other systems that generate these average prices (which we must reconcile against) uses a different level of precision (10 in the database, and Excel doesn't go beyond 15 significant digits by default, or something like that). Given that everyone has a different level of precision, I'm stuck in the middle deciding between dropping to a lower precision, some strange rounding rules (as described above), or simply producing results different from what users expect.
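Incidentally, the Excel side of the mismatch is easy to reproduce, since Excel stores numbers as IEEE-754 doubles (roughly 15 significant digits). A quick sketch:

    decimal exact = 13.398749999999999999999999999M;
    double asDouble = (double)exact;

    // A double can't hold all 27 fractional digits; the nearest
    // representable value round-trips as 13.39875, which Excel then
    // rounds to 13.3988.
    Console.WriteLine(asDouble); // 13.39875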