I'm looking at code that does the following:
startRoundedValue = Round(startValue, 2); // rounds startValue to two decimal places
endRoundedValue = Round(endValue, 2); // rounds endValue to two decimal places
intermediateRoundingResult = (endRoundedValue - startRoundedValue) / startRoundedValue;
startValue and endValue are decimal values (e.g. 123456.123456789012) with up to 12 decimal places. Suppose I change the code to round only once, at the end:
unRoundedResult = (endValue - startValue) / startValue;
roundingAtTheEndResult = Round(unRoundedResult, 2);
What would be the largest difference I could expect between intermediateRoundingResult and roundingAtTheEndResult?
We can ignore the case where startValue rounds to zero.
Update
For example, let's suppose startValue=0.014999999999 and endValue=0.05. I get roundingAtTheEndResult=2.33 and intermediateRoundingResult=4.00, for a difference of 1.67.
If instead startValue=18.019112 and endValue=4002.0549, I get roundingAtTheEndResult=221.10 and intermediateRoundingResult=221.09, for a difference of 0.01.
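For concreteness, here is a minimal Python sketch of the two orders of operations, using the decimal module so the inputs are exact; it assumes Round(v, 2) means round-half-up to two decimal places, and it reproduces the 2.33 vs 4.00 split of the first example:

```python
from decimal import Decimal, ROUND_HALF_UP

def round2(x: Decimal) -> Decimal:
    # Assumption: Round(v, 2) means round-half-up to two decimal places.
    return x.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

def intermediate_rounding(start: Decimal, end: Decimal) -> Decimal:
    # Round the inputs first, then take the relative change.
    return (round2(end) - round2(start)) / round2(start)

def rounding_at_the_end(start: Decimal, end: Decimal) -> Decimal:
    # Relative change at full precision, rounded once at the end.
    return round2((end - start) / start)

start, end = Decimal("0.014999999999"), Decimal("0.05")
print(intermediate_rounding(start, end))  # → 4
print(rounding_at_the_end(start, end))    # → 2.33
```

The large gap comes from round2 mapping 0.014999999999 down to 0.01, which shrinks the denominator by a third before the division happens.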
I'm trying to determine the maximum difference that is possible for arbitrary startValue and endValue.
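To probe how large that gap can get, here is a small exploration sketch (my addition, with the same round-half-up assumption as above, comparing both results at two decimal places the way the examples do). Pushing startValue just above 0.005, the smallest value that doesn't round to zero, makes the rounded denominator nearly double the true one:

```python
from decimal import Decimal, ROUND_HALF_UP

def round2(x: Decimal) -> Decimal:
    # Assumption: Round(v, 2) is round-half-up to two decimal places.
    return x.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

def difference(start: Decimal, end: Decimal) -> Decimal:
    # Compare both results at two decimal places, as in the examples.
    intermediate = round2((round2(end) - round2(start)) / round2(start))
    at_the_end = round2((end - start) / start)
    return abs(intermediate - at_the_end)

# The second example from the update: a gap of one cent.
print(difference(Decimal("18.019112"), Decimal("4002.0549")))  # → 0.01
# startValue just above 0.005 rounds up to 0.01, halving the true ratio,
# so the gap grows without bound as endValue grows.
print(difference(Decimal("0.005000000001"), Decimal("9.99")))  # → 999.00
```

This suggests the difference is not bounded by a small constant: for startValue near the 0.005 rounding boundary the two results differ by roughly half the true ratio, which can be made arbitrarily large by increasing endValue.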