Approximation error upper bound for variable derived through geometric mean


This question concerns finding an upper bound on the error of an approximation derived using the geometric mean.

I have two sequences of latent variables:

$$ r_1, r_2, ..., r_n$$ $$ s_1, s_2, ..., s_n$$

These latent variables combine to produce the observed variables, $R$ and $S$ (where $k$ is a scalar):

$$ R = (1+kr_1)(1+kr_2)\cdots(1+kr_n)$$ $$ S = (1+ks_1)(1+ks_2)\cdots(1+ks_n)$$

The true but unobservable quantity that I am interested in is:

$$ T = (1+k(r_1+s_1))(1+k(r_2+s_2))\cdots(1+k(r_n+s_n)) $$

To approximate $T$, I define geometric-mean values $r$ and $s$ of the $r_i$ and $s_i$ by:

$$ (1+kr)^n = R \implies r = \frac{R^{1/n}-1}{k} $$ $$ (1+ks)^n = S \implies s = \frac{S^{1/n}-1}{k} $$

My approximation for $T$, $\bar{T}$, is:

$$ \bar{T} = (1+k(r+s))^n = (R^{1/n} + S^{1/n} - 1)^n$$
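As a numerical sketch of the setup (the values of $k$, $n$, and the latent sequences below are arbitrary, chosen only for illustration), one can compute $T$ and $\bar{T}$ directly and observe the error they differ by:

```python
import math
import random

def true_T(r, s, k):
    # T = prod_i (1 + k*(r_i + s_i))  -- the unobservable target quantity
    return math.prod(1 + k * (ri + si) for ri, si in zip(r, s))

def approx_T(r, s, k):
    # T_bar = (R^{1/n} + S^{1/n} - 1)^n, built from the observables R and S
    n = len(r)
    R = math.prod(1 + k * ri for ri in r)
    S = math.prod(1 + k * si for si in s)
    return (R ** (1 / n) + S ** (1 / n) - 1) ** n

# Example latent sequences (arbitrary; chosen so every factor 1 + k*r_i > 0)
random.seed(0)
n, k = 5, 0.01
r = [random.uniform(-1, 1) for _ in range(n)]
s = [random.uniform(-1, 1) for _ in range(n)]

print(abs(true_T(r, s, k) - approx_T(r, s, k)))
```

For small $k$ the two quantities agree to first order in $k$, so the printed error is small; the question is how to bound it analytically.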

I would like to derive an upper bound $U_b$ on the error, so that:

$$ | T - \bar{T} | \leq U_b $$

What can be said about $U_b$?