How would one calculate the error from the division of two ratios?
I am given the decimal representations of 1/y and x/y (call them a and b).
I then compute b/a, trying to find the value of x.
Both representations carry errors e[1] and e[2]:
so 1/y = a + e[1] and x/y = b + e[2],
and the computed quotient satisfies b/a = x + e[3].
How can I control the size of e[3] in terms of e[1] and e[2]?
One usually uses first-order differentials for this: $e[3]\approx \frac{\partial}{\partial b}\left(\frac ba\right) e[2]+\frac{\partial}{\partial a}\left(\frac ba\right) e[1]=\frac 1a e[2]-\frac b{a^2} e[1]$. To get a bound you want to add, not subtract, the magnitudes of the two terms (I didn't include absolute value signs).
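A quick numerical sanity check of this first-order estimate (a minimal sketch; the values of a, b, e[1], e[2] below are made-up illustrations, not taken from the question):

```python
a, b = 0.25, 0.75          # stored decimal representations (illustrative)
e1, e2 = 1e-4, -2e-4       # hypothetical representation errors

x = (b + e2) / (a + e1)    # the exact ratio x = (x/y) / (1/y)
actual_err = x - b / a     # true value minus computed quotient

# First-order estimate (1/a) e2 - (b/a^2) e1, and the bound
# obtained by adding the magnitudes of the two terms.
estimate = e2 / a - b * e1 / a**2
bound = abs(e2) / a + b * abs(e1) / a**2

print(actual_err, estimate, bound)
```

Here the actual error (about -0.002) agrees with the first-order estimate to several digits and stays within the absolute-value bound.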
Added: sometimes it is easier to work with relative error instead of absolute. In that case we get $\frac{e[3]}{b/a} \approx \frac{e[2]}{b}-\frac{e[1]}{a}$, which shows that, in magnitude, the relative error of your division is at most the sum of the relative errors of the input quantities. Unfortunately, in financial calculations you are generally working in fixed point, so the errors are absolute, not relative. You can certainly use the first-order formula above to bound the absolute error, but the bound will depend on the input quantities as shown.
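The relative-error version can be checked the same way (again a sketch with made-up inputs; with these signs the two relative errors reinforce each other, so their magnitudes add):

```python
a, b = 0.25, 0.75          # stored values (illustrative)
e1, e2 = 1e-4, -2e-4       # hypothetical representation errors

rel_in = abs(e2) / b + abs(e1) / a   # sum of input relative errors
x = (b + e2) / (a + e1)              # exact ratio
rel_out = abs(x - b / a) / (b / a)   # relative error of the quotient

print(rel_out, rel_in)               # nearly equal, to first order
```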