Proving a limit exists with probability $1$ and computing its value


Let $X_{1}, X_{2}, \ldots$ be a sequence of i.i.d. random variables with the uniform distribution on $[0, 1]$.

Prove that the limit

$$\lim_{n\to\infty} \sqrt[n]{X_{1} X_{2} \cdots X_{n}} $$

exists with probability one and find its value.

I'm studying for an exam and have been stuck on this problem for quite some time now. I've really got no clue how to prove the limit exists with probability one. I tried to use lots of inequalities like Markov's, Chebyshev's, etc., with no luck.

So, assuming the limit exists for now, I looked at various ways to compute products of uniform random variables. There is some discussion of this on the site, but all of it seems to involve a lot of computation. I'm not sure whether there's one nice way to solve this problem, so I would appreciate your help.

Thanks


Taking the log of the expression inside the limit, we get

$$\lim_{n\to\infty} \frac{1}{n}\sum_{k=1}^{n} \log(X_{k})$$

By the law of large numbers, this goes to

$$E(\log(X_1)) = \int_{0}^{1} \log x \mathop{dx} = -1$$

So the answer is $1/e?$

Does this prove that the limit exists w.p. $1?$ I'm not so sure how to do that.
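As a numerical sanity check on the integral above (a sketch using only the Python standard library; the value of $\int_0^1 \log x\,dx$ is approximated with a midpoint rule since $\log x$ blows up at $0$):

```python
import math

# Midpoint-rule approximation of the integral of log(x) over [0, 1].
# The midpoint rule avoids evaluating log at the singular endpoint x = 0.
N = 1_000_000
approx = sum(math.log((i + 0.5) / N) for i in range(N)) / N
print(approx)  # should be very close to -1
```

The approximation agrees with the exact value $-1$, which follows from the antiderivative $x\log x - x$.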

Your edit is almost correct; note that $$\log\left((X_1X_2\cdots X_n)^{\frac{1}{n}}\right)= \frac{1}{n}\sum_{i=1}^n \log(X_i)$$

Since the variables $Y_i=\log(X_i)$ are independent and identically distributed, the Strong Law of Large Numbers guarantees that $$\frac{1}{n}\sum_{i=1}^n Y_i=\frac{1}{n}\sum_{i=1}^n \log(X_i)\longrightarrow E(\log(X_1))=-1$$

and the convergence is almost sure. So $$(X_1X_2\dots X_n)^{\frac{1}{n}}\longrightarrow\frac{1}{e}$$

almost surely.
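A quick simulation illustrates the convergence (a sketch, assuming Python's standard library; the geometric mean is computed through the average of logs, since the raw product of a million numbers in $(0,1)$ would underflow to zero):

```python
import math
import random

# Simulate the geometric mean of n i.i.d. Uniform(0, 1) samples.
# Working with the average of logs is the same quantity as
# (X_1 * X_2 * ... * X_n)^(1/n), but numerically stable.
random.seed(0)
n = 1_000_000
mean_log = sum(math.log(random.random()) for _ in range(n)) / n
geo_mean = math.exp(mean_log)
print(geo_mean, 1 / math.e)  # the two values should be very close
```

For large `n` the simulated geometric mean sits right next to $1/e \approx 0.3679$, exactly as the SLLN argument predicts.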