Math.Round methodology: rounding starting from the smallest decimal place


There have been many threads started over the confusion in the way that Math.Round works. For the most part, those are answered by cluing people in to the MidpointRounding parameter and the fact that most people are expecting MidpointRounding.AwayFromZero. I have a further question, though, about the actual algorithm implemented by AwayFromZero.

Given the following number (the result of a series of calculations): 13.398749999999999999999999999M

Our users are expecting to see the same result that Excel would give them: 13.39875. Since they are currently rounding that number to 4 decimal places using Math.Round(num, 4, MidpointRounding.AwayFromZero), the result is off by .0001 from what they expect. Presumably, the reason for this is that the algorithm only looks at the fifth digit (4) and rounds accordingly. If you were instead to start rounding at the last 9 and work leftwards, you would in fact end up with the same number as Excel.
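To illustrate the one-pass behavior, here is a minimal sketch; the 15-decimal intermediate is only an assumption about Excel-like precision, not Excel's actual algorithm:

```csharp
using System;

class OnePassRoundingDemo
{
    static void Main()
    {
        // The value from the question, just below the 13.39875 midpoint.
        decimal num = 13.398749999999999999999999999M;

        // Math.Round considers the full value in one pass: everything after
        // the 4th place is .49999..., which is below the midpoint, so
        // even AwayFromZero rounds down.
        Console.WriteLine(Math.Round(num, 4, MidpointRounding.AwayFromZero)); // 13.3987

        // An Excel-like view at ~15 decimal places (assumed) sees 13.39875,
        // an exact midpoint, which then rounds up.
        decimal truncatedView = Math.Round(num, 15);
        Console.WriteLine(Math.Round(truncatedView, 4, MidpointRounding.AwayFromZero)); // 13.3988
    }
}
```

The difference is entirely in whether the long run of 9s gets a chance to cascade up before the final 4-place rounding.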

So the question is ... is there a way to emulate this behavior rather than the current one?

I've written a recursive function that we could use in the meantime. But before we put it in production I wanted to see what SO thought about the problem :-)

    private decimal Round(decimal num, int precision)
    {
        return Round(num, precision, 28);
    }

    private decimal Round(decimal num, int precision, int fullPrecision)
    {
        if (precision >= fullPrecision) return Math.Round(num, precision);

        return Round(Math.Round(num, fullPrecision), precision, --fullPrecision);
    }
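Run on the question's value, this walk-down produces the Excel-matching result: each intermediate Math.Round lets the trailing 9s cascade up into an exact midpoint before the final step. A self-contained sketch of the same idea (using fullPrecision - 1 rather than --fullPrecision, with identical effect):

```csharp
using System;

class RecursiveRoundDemo
{
    public static decimal Round(decimal num, int precision)
    {
        // Start from the maximum scale a decimal can hold (28 places).
        return Round(num, precision, 28);
    }

    public static decimal Round(decimal num, int precision, int fullPrecision)
    {
        // Base case: we've walked down to the requested precision.
        if (precision >= fullPrecision) return Math.Round(num, precision);

        // Round away one decimal place at a time, so a run of trailing 9s
        // can cascade up to an exact midpoint before the final rounding.
        return Round(Math.Round(num, fullPrecision), precision, fullPrecision - 1);
    }

    static void Main()
    {
        Console.WriteLine(Round(13.398749999999999999999999999M, 4)); // 13.3988
    }
}
```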

Edit: just for clarity, I should have been clearer in my original post. The rounding methodology being asked about here is what I'm being presented with by the business analysts and users who are reporting the "rounding error". Despite being told numerous times that it's not incorrect, just different from what they are expecting ... this report keeps coming in. So I am just on a data-gathering stint, collecting as much information as I can on this topic to report back to the users.

In this case, it seems that the other systems used to generate these average prices (which we must match) are using a different level of precision (10 in the database, and Excel seems to default to 15 or so). Given that everyone has a different level of precision, I'm stuck in the middle with the choice of moving to a lower precision, adopting some weird rounding rules (as described above), or just having different results than the users expect.

If I get you right, people are expecting 13.3988 because they are first rounding to 13.39875 and then to 13.3988 and they need you to be bug-compatible with that.

If so, there's no need to repeat any more than one extra step of rounding, as the flaw in their method only comes in at the last step (by its nature, rounding removes the significance of any digit two or more places below the target).

    private static decimal InaccurateRound(decimal num, int precision)
    {
        return Math.Round(
            Math.Round(num, precision + 1, MidpointRounding.AwayFromZero),
            precision, MidpointRounding.AwayFromZero);
    }
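A quick check of this against the value from the question (a sketch wrapping the method above in a runnable class):

```csharp
using System;

class InaccurateRoundDemo
{
    // Round twice: first to one extra decimal place, which turns the
    // trailing 9s into an exact midpoint, then to the target precision.
    public static decimal InaccurateRound(decimal num, int precision)
    {
        return Math.Round(
            Math.Round(num, precision + 1, MidpointRounding.AwayFromZero),
            precision, MidpointRounding.AwayFromZero);
    }

    static void Main()
    {
        decimal num = 13.398749999999999999999999999M;
        Console.WriteLine(InaccurateRound(num, 4));                           // 13.3988
        Console.WriteLine(Math.Round(num, 4, MidpointRounding.AwayFromZero)); // 13.3987
    }
}
```

The inner call produces 13.39875, and the outer call then resolves that midpoint away from zero, matching what the users see in Excel without the full recursive walk.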