Consider an algorithm with a two-sided relative error, i.e. the algorithm outputs an estimator $\hat Y$ for some value $Y$ such that $$ (1 - \epsilon) Y \le \hat Y \le (1 + \epsilon) Y $$ which is equivalent (dividing through by $1-\epsilon$) to the one-sided bound $$ Y \le \frac {1} {1 - \epsilon} \hat Y \le \frac {1 + \epsilon} {1 - \epsilon} Y $$ Is it possible to show that, for a small constant $\epsilon$, the following holds? $$ \frac {1 + \epsilon} {1 - \epsilon} \le 1 + 2 \epsilon $$
Best regards
To first order it is correct, but not in general. For $|\epsilon| \lt 1$ you have $\frac 1{1-\epsilon}=1+\epsilon+\epsilon^2+\epsilon^3+\ldots$ from the sum of a geometric series, and multiplying by $1+\epsilon$ gives $\frac {1+\epsilon}{1-\epsilon}=1+2\epsilon+2\epsilon^2+2\epsilon^3+\ldots$ If you drop the terms of order $\epsilon^2$ and higher you are left with $1+2\epsilon$, but those higher-order terms spoil the inequality. As a numerical example, take $\epsilon=0.1$. Then $\frac {1+0.1}{1-0.1}=\frac{1.1}{0.9}=1.2222\overline2$. The first $2$ after the decimal point is the one you want in your inequality; the rest come from the higher-order terms. So for $0 \lt \epsilon \lt 1$ the inequality actually goes the other way, $\frac {1+\epsilon}{1-\epsilon} \ge 1+2\epsilon$, but for small $\epsilon$ it may be close enough to equality for what you are doing.
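To see the gap concretely, here is a quick numerical sketch (plain Python, values chosen just for illustration) comparing $\frac{1+\epsilon}{1-\epsilon}$ against $1+2\epsilon$; the difference is exactly $\frac{2\epsilon^2}{1-\epsilon}$, the tail of the series above:

```python
# Compare (1 + eps)/(1 - eps) with the first-order approximation 1 + 2*eps.
# The gap equals 2*eps**2 / (1 - eps), i.e. the sum of the dropped
# higher-order terms 2*eps^2 + 2*eps^3 + ...
for eps in [0.001, 0.01, 0.1, 0.3]:
    exact = (1 + eps) / (1 - eps)
    approx = 1 + 2 * eps
    gap = 2 * eps**2 / (1 - eps)
    print(f"eps={eps}: exact={exact:.6f}  1+2eps={approx:.6f}  gap={gap:.6f}")
    # exact always exceeds approx for 0 < eps < 1
    assert exact > approx
    assert abs((exact - approx) - gap) < 1e-12
```

For $\epsilon = 0.001$ the gap is about $2\times 10^{-6}$, which shows why treating $1+2\epsilon$ as the bound is usually harmless for small constant $\epsilon$ (e.g. after rescaling $\epsilon$ by a constant factor).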