Take a look at the question below
For $f:[0,1]\to \Bbb{R}$ continuous,
show that $$ \lim_{n\to\infty}\idotsint_{0}^{1} f\left((x_{1} \cdots x_{n})^{\frac{1}{n}}\right) \,dx_1 \cdots dx_n = f\left(\frac{1}{e}\right)$$
Solution:
Let $X_{1},\dots,X_{n}$ be i.i.d. $\mathrm{Uniform}(0,1)$ random variables.
Before proceeding, one should verify that:
- $f$ is bounded (it is continuous on the compact interval $[0,1]$)
- $E(\log X_{i})=-1$
- the WLLN (recalling the definition of weak convergence) gives $$ \lim_{n\to \infty}E\left[f\left( \frac{S_{n}}{n}\right)\right] = f\left(E(X_{1})\right) $$ for $S_{n}=X_{1}+\cdots +X_{n}$, where the $X_{i}$ are i.i.d. random variables with finite mean and $f$ is any continuous bounded function.
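For the record, the second bullet is a direct computation (using $\lim_{x\to 0^{+}} x\log x = 0$ at the lower endpoint):

$$ E(\log X_{i}) = \int_{0}^{1} \log x \, dx = \Big[\, x\log x - x \,\Big]_{0}^{1} = -1. $$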
Now we have
$$ \lim_{n\to\infty}\idotsint_{0}^{1} f\left((x_{1} \cdots x_{n})^{\frac{1}{n}}\right) \,dx_1 \cdots dx_n $$
(by the definition of the $X_{i}$ and the law of the unconscious statistician)
$$ = \lim_{n\to \infty} E\left[f\left((X_{1} \cdots X_{n})^{\frac{1}{n}}\right)\right] $$
$$ = \lim_{n\to \infty} E\left[f\left(e^{\frac{1}{n}(\log X_{1}+\cdots + \log X_{n})}\right)\right] $$
$$ = \lim_{n\to \infty} E\left[f\left(e^{\frac{S_{n}}{n}}\right)\right] $$ (where $S_{n} = \sum_{i=1}^{n} \log X_{i}$)
$$ = f\left(e^{E(\log X_{1})}\right) $$ (by the WLLN applied to $g(t)=f(e^{t})$, noting that $g$ is still continuous and bounded, since $e^{t}\in(0,1]$ for $t\le 0$)
$$ = f\left(\frac{1}{e}\right) $$ (since $E(\log X_{i})=-1$)
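As a sanity check (not part of the proof), here is a quick Monte Carlo sketch in Python, using the hypothetical choice $f(t)=t$; the geometric mean is computed via logs to avoid floating-point underflow for large $n$:

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_geometric_mean(f, n, samples=200_000):
    # Estimate E[f((X_1 ... X_n)^(1/n))] for X_i ~ Uniform(0,1).
    x = rng.uniform(size=(samples, n))
    g = np.exp(np.log(x).mean(axis=1))  # (x_1 ... x_n)^(1/n) via logs
    return f(g).mean()

est = mc_geometric_mean(lambda t: t, n=100)
print(est, 1 / np.e)  # the two values should be close for large n
```

With $n=100$ and $2\times 10^{5}$ draws the estimate lands within about $10^{-2}$ of $1/e \approx 0.3679$; the residual gap is the $O(1/n)$ bias, not Monte Carlo noise.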
Does anyone know whether I could use the same strategy to solve the following?
Show that $$ \lim_{n\to\infty}\idotsint_{0}^{1} f\left(\frac{x_1^{2}+ \dots +x_n^{2}}{x_1+ \dots +x_n}\right) \,dx_1 \cdots dx_n = f\left(\frac{2}{3}\right)$$
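For what it's worth, the same heuristic predicts the constant: by the WLLN, $\frac{1}{n}\sum x_i^{2} \to E(X_{i}^{2})=\frac{1}{3}$ and $\frac{1}{n}\sum x_i \to \frac{1}{2}$, so the ratio tends to $\frac{2}{3}$ in probability. A rough Monte Carlo sketch (again taking $f$ to be the identity, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

def mc_ratio(n, samples=200_000):
    # Average of (x_1^2 + ... + x_n^2) / (x_1 + ... + x_n)
    # over uniform draws on (0,1)^n, i.e. the integral with f = identity.
    x = rng.uniform(size=(samples, n))
    return ((x ** 2).sum(axis=1) / x.sum(axis=1)).mean()

val = mc_ratio(100)
print(val, 2 / 3)
```

This only checks the numerical claim; turning the heuristic into a proof still requires an argument (e.g. handling the ratio near the origin) along the lines of the first problem.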