Proving convergence with probability 1


We have a sequence $X_1, X_2, \dots$ of i.i.d. random variables, and we are given that $E[\log(X_i)]$ exists. We define a new sequence of random variables in terms of the $X_i$'s:

  • $Y_n = (X_1X_2\cdots X_n)^{\frac{1}{n}}$.

We want to show that the sequence $Y_1,Y_2,\cdots$ converges almost surely.

I have no idea how to start this problem.

What I do recognize, although I'm not sure if it will help, is that since the $X_i$'s are IID, we could just consider $\lim_{n \to \infty} Y_n = \lim_{n\to \infty} (X_1\cdots X_1)^{\frac{1}{n}} = \lim_{n\to \infty} (X_1^n)^{\frac{1}{n}}$.

Basically, I was thinking that as $n$ runs to $\infty$, since the $X_i$'s are IID they will all look identical in the long run and it would simplify into a simple term. I'm not sure if my thinking is correct though.

But this approach wouldn't use the given fact that the expectation of the log exists so I am certain I am wrong.

Best answer:

The sequence $\ln X_1, \ln X_2, \dots$ is i.i.d., and its common mean $\mathbb{E}[\ln X_1]$ exists by hypothesis, so the strong law of large numbers applies to the sample averages of the logs:
$$\ln Y_n = \frac{1}{n}\sum_{i=1}^n \ln X_i \xrightarrow{\text{a.s.}} \mathbb{E}[\ln X_1].$$
Since $x \mapsto e^x$ is continuous, it follows that $Y_n \xrightarrow{\text{a.s.}} e^{\mathbb{E}[\ln X_1]}$.

(Note that your substitution of $X_1$ for every $X_i$ is not valid: "identically distributed" means the $X_i$ share a distribution, not that they are equal as random variables.)
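As a numerical sanity check (not part of the proof), here is a small simulation. The lognormal distribution is an assumed example choice, convenient because $X_i = e^{Z_i}$ with $Z_i \sim N(\mu, \sigma^2)$ gives $\mathbb{E}[\ln X_i] = \mu$ exactly, so $Y_n$ should approach $e^{\mu}$:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n = 0.5, 1.0, 200_000

# X_i = exp(Z_i) with Z_i ~ N(mu, sigma^2), so E[ln X_i] = mu
x = np.exp(rng.normal(mu, sigma, size=n))

# Geometric mean Y_n = (X_1 ... X_n)^(1/n), computed via logs
# to avoid overflow in the raw product for large n.
y_n = np.exp(np.mean(np.log(x)))

print(y_n)          # close to the a.s. limit e^mu
print(np.exp(mu))   # the limit itself, about 1.6487
```

Rerunning with larger $n$ shrinks the gap between `y_n` and `np.exp(mu)`, consistent with the almost sure convergence above.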