Math.Round Methodology: Starting from the Least Significant Decimal

Many threads have been started out of confusion about how Math.Round works. For the most part, they are answered by pointing people to the MidpointRounding parameter and noting that most people expect MidpointRounding.AwayFromZero. I have one more question, though, about the actual algorithm AwayFromZero implements.

Given the following number (result of a series of calculations): 13.398749999999999999999999999M

Our users expect to see the same result that Excel gives them: 13.39875. Since we currently round this number to 4 decimal places using Math.Round(num, 4, MidpointRounding.AwayFromZero), the result is off by .0001 from what they expect. Presumably this is because the algorithm looks only at the fifth decimal digit (a 4) and rounds accordingly. If you instead start rounding at the last 9 and work inward, the cascading rounds actually give you the same number as Excel.
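To make the difference concrete, here is a minimal sketch (the loop is mine, for illustration):

    decimal num = 13.398749999999999999999999999M;

    // Math.Round inspects only the digit after the target place (a 4), so:
    Console.WriteLine(Math.Round(num, 4, MidpointRounding.AwayFromZero)); // 13.3987

    // Rounding one decimal place at a time lets the trailing 9s cascade up:
    decimal step = num;
    for (int p = 27; p >= 4; p--)
        step = Math.Round(step, p, MidpointRounding.AwayFromZero);
    Console.WriteLine(step); // 13.3988, matching what Excel gives the users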

So the question is: is there a way to mimic this behavior instead of the current one?

In the meantime I wrote a recursive function that we could use, but before we put it into production I wanted to see what SO thought of the problem :-)

    // Rounds num to the given precision by first rounding at every
    // intermediate decimal place, from the 28th digit inward, so that
    // trailing 9s can cascade up before the final round.
    private decimal Round(decimal num, int precision)
    {
        return Round(num, precision, 28);
    }

    private decimal Round(decimal num, int precision, int fullPrecision)
    {
        if (precision >= fullPrecision)
            return Math.Round(num, precision);

        // Arguments evaluate left to right, so the inner round uses the
        // current fullPrecision before the decrement takes effect.
        return Round(Math.Round(num, fullPrecision), precision, --fullPrecision);
    }
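For the number in the question, this returns what the users expect (note that the helper's calls use Math.Round's default MidpointRounding.ToEven, which still lands on 13.3988 for this value):

    decimal num = 13.398749999999999999999999999M;
    Console.WriteLine(Round(num, 4)); // 13.3988, versus 13.3987 from a single Math.Round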

Edit: just for clarity's sake, I should have been clearer in my original post. The rounding methodology described here is what the business analysts and users who report a rounding "error" expect. Despite being told repeatedly that the result isn't wrong, just different from what they expect, the report keeps coming in. So I'm just gathering as much information as possible on this topic to take back to the users.

In this case, it seems that every other system used to generate these average prices (against which we must compare) uses a different level of precision (10 in the database, and Excel doesn't go beyond 15 by default, or something along those lines). Given that everyone works at a different level of precision, I'm stuck in the middle, choosing between switching to a lower precision, some strange rounding rules (as described above), or simply producing results different from what users expect.
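For what it's worth, the Excel side of the mismatch can be imitated by first collapsing the value to 15 significant digits, Excel's documented default precision (13 decimal places here, since two digits sit before the decimal point). This is only an illustration, not how Excel works internally:

    decimal num = 13.398749999999999999999999999M;

    // At 15 significant digits the trailing 9s collapse to an exact midpoint,
    // which then rounds up at 4 places:
    decimal excelView = Math.Round(num, 13, MidpointRounding.AwayFromZero);     // numerically 13.39875
    Console.WriteLine(Math.Round(excelView, 4, MidpointRounding.AwayFromZero)); // 13.3988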

c# math rounding
2 answers

If I understand correctly, people expect 13.3988 because they first round to 13.39875 and then to 13.3988, and they need you to be bug-compatible with that.

If that's the case, there's no need to repeat every rounding step, since the flaw in their method only comes into play at the last rounding stage (by its nature, rounding makes anything before the last two steps insignificant).

    private static decimal InaccurateRound(decimal num, int precision)
    {
        return Math.Round(
            Math.Round(num, precision + 1, MidpointRounding.AwayFromZero),
            precision, MidpointRounding.AwayFromZero);
    }
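For the value in the question this reproduces the expected result:

    decimal num = 13.398749999999999999999999999M;
    Console.WriteLine(InaccurateRound(num, 4)); // 13.3988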

So, if you round this number to 5 places and then round that result to 4 places, you get a different result than if you round the original number to 4 places? That is to be expected, and I think it explains what you need to do. Another option is to have Excel display full precision, so that "they" match your code. Rounding twice seems wrong.

