$X_n$ i.i.d. uniform on $(0,1)$, $|p| < 1$. Show $Y_n = \frac{1}{n} \left( \frac{X_1^p}{X_2^p} + \cdots + \frac{X_n^p}{X_{n+1}^p} \right)$ converges a.s.


I've been working through exercises on convergence of sequences of random variables and came across the following problem.

$X_1, X_2, X_3, \ldots$ are independent random variables with uniform distribution on $(0,1)$. Show that for $|p| < 1$ the sequence of random variables defined as:

$$Y_n = \frac{1}{n} \left( \frac{X_1^p}{X_2^p} + \frac{X_2^p}{X_3^p} + \cdots + \frac{X_n^p}{X_{n+1}^p} \right) $$

is convergent almost surely, and find $ \displaystyle x = \lim_{n \to \infty} Y_n $.


I will use Kolmogorov's strong law of large numbers for sequences of i.i.d. random variables.

For that, I group the ratios so that within each group the summands are independent: consecutive ratios share a variable, but ratios built from disjoint pairs of the $X_i$ do not. I have that:

$$Y_n = \frac{1}{n} \left( \frac{X_1^p}{X_2^p} + \frac{X_2^p}{X_3^p} + \cdots + \frac{X_n^p}{X_{n+1}^p} \right) = $$ $$ =\frac{1}{n} \left( \frac{X_1^p}{X_2^p} + \frac{X_3^p}{X_4^p} + \cdots \right) + \frac{1}{n} \left( \frac{X_2^p}{X_3^p} + \frac{X_4^p}{X_5^p} + \cdots \right)$$
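As a quick numerical sanity check (not part of the proof), splitting the sum of ratios into odd- and even-indexed groups should leave its value unchanged; the values of `n`, `p`, and the seed below are arbitrary illustrative choices:

```python
import random

# Sanity check: splitting the sum of ratios into odd- and even-indexed
# groups does not change the value of Y_n. Here n = 10, p = 0.5, and the
# seed are arbitrary illustrative choices.
rng = random.Random(7)
n, p = 10, 0.5
xs = [rng.random() for _ in range(n + 1)]

ratios = [(xs[i] / xs[i + 1]) ** p for i in range(n)]
full = sum(ratios) / n                                  # Y_n as one sum
split = sum(ratios[0::2]) / n + sum(ratios[1::2]) / n   # Y_n split in two

print(abs(full - split))  # tiny, up to floating-point rounding
```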

Additionally, I know that for independent random variables $A$ and $B$: $\mathbb{E}(AB) = \mathbb{E}(A) \mathbb{E}(B)$. I first compute the limit of the expectations:

$$\lim_{n \to \infty} \mathbb{E}(|Y_n|) = \lim_{n \to \infty} \mathbb{E} \left( \Big| \frac{1}{n} \left( \frac{X_1^p}{X_2^p} + \frac{X_2^p}{X_3^p} + \cdots + \frac{X_n^p}{X_{n+1}^p} \right) \Big| \right) = $$

$\mathbb{E}$ is linear and all $X_n$ take values in $(0,1)$, so every summand is positive, the absolute value can be dropped, and the expectation splits:

$$ = \lim_{n \to \infty} \mathbb{E} \left( \frac{1}{n} \left( \frac{X_1^p}{X_2^p} + \frac{X_3^p}{X_4^p} + \cdots \right) \right) + \lim_{n \to \infty} \mathbb{E} \left( \frac{1}{n} \left( \frac{X_2^p}{X_3^p} + \frac{X_4^p}{X_5^p} + \cdots \right) \right) =$$

For independent random variables $A$, $B$: $ \ \mathbb{E}\left[ \left( \frac{A}{B} \right)^p \right] = \mathbb{E}[A^p] \cdot \mathbb{E}[B^{-p}]$ (since $A^p$ and $B^{-p}$ are again independent), therefore:

$$= \lim_{n \to \infty} \left( \frac{1}{n} \left( \mathbb{E} (X_1^p) \cdot \mathbb{E} (X_2^{-p}) + \cdots \right) \right) + \lim_{n \to \infty} \left( \frac{1}{n} \left( \mathbb{E} (X_2^p) \cdot \mathbb{E} (X_3^{-p}) + \cdots \right) \right) = $$

From the law of the unconscious statistician, we have that for $|p| < 1$ and every $i$:

  • $\mathbb{E} (X_i^p) = \int_0^1 x^p \ dx = \frac{1}{p+1}$ (finite because $p > -1$)
  • $\mathbb{E} (X_i^{-p}) = \int_0^1 x^{-p} \ dx = \frac{1}{1-p}$ (finite because $p < 1$)

(because the PDF of the uniform distribution on $(0,1)$ equals $1$ there).
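These two moment formulas can also be estimated by simulation; the sample size and the particular exponent below are arbitrary choices:

```python
import random

# Monte Carlo estimates of E(X^p) and E(X^-p) for X ~ Uniform(0, 1),
# compared against 1/(p+1) and 1/(1-p). Here p = 0.4 is an arbitrary
# exponent with |p| < 1, and 100_000 samples is an arbitrary size.
p = 0.4
rng = random.Random(42)
xs = [rng.random() for _ in range(100_000)]

mean_pos = sum(x ** p for x in xs) / len(xs)   # should approach 1/(p+1)
mean_neg = sum(x ** -p for x in xs) / len(xs)  # should approach 1/(1-p)

print(mean_pos, 1 / (p + 1))
print(mean_neg, 1 / (1 - p))
```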

$$= \lim_{n \to \infty} \left( \frac{1}{n} \left( \frac{1}{p+1} \cdot \frac{1}{1-p} + \cdots \right) \right) + \lim_{n \to \infty} \left( \frac{1}{n} \left( \frac{1}{p+1} \cdot \frac{1}{1-p} + \cdots \right) \right) =$$

$$= \lim_{n \to \infty} \left( \frac{1}{n} \left( \frac{1}{1-p^2} + \cdots \right) \right) + \lim_{n \to \infty} \left( \frac{1}{n} \left( \frac{1}{1-p^2} + \cdots \right) \right) =$$

Each of the two parts has $\frac{n}{2}$ terms (up to a rounding that does not affect the limit), therefore:

$$= \lim_{n \to \infty} \left( \frac{1}{n} \cdot \frac{n}{2} \cdot \frac{1}{1-p^2} \right) + \lim_{n \to \infty} \left( \frac{1}{n} \cdot \frac{n}{2} \cdot \frac{1}{1-p^2} \right) =$$

$$= \lim_{n \to \infty} \left( \frac{1}{2(1-p^2)} \right) + \lim_{n \to \infty} \left( \frac{1}{2(1-p^2)} \right) =$$ $$= \frac{1}{2(1-p^2)} + \frac{1}{2(1-p^2)} =$$ $$= \frac{2}{2(1-p^2)} =$$ $$= \frac{1}{1-p^2}$$

This computation identifies the common mean $\frac{1}{1-p^2}$ of the i.i.d. ratios in each group. By Kolmogorov's strong law of large numbers, each of the two group averages converges almost surely to $\frac{1}{2(1-p^2)}$, so their sum $Y_n$ converges almost surely and $x = \frac{1}{1-p^2}$.
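A quick Monte Carlo simulation agrees with this limit; this is only a sketch, with `n`, `p`, and the seed chosen arbitrarily:

```python
import random

def simulate_Y(n: int, p: float, seed: int = 0) -> float:
    """One sample path of Y_n = (1/n) * sum((X_i / X_{i+1})**p) for uniform X_i."""
    rng = random.Random(seed)
    xs = [rng.random() for _ in range(n + 1)]
    return sum((xs[i] / xs[i + 1]) ** p for i in range(n)) / n

# For p = 0.3 the claimed limit is 1 / (1 - 0.09) ~ 1.0989.
p = 0.3
print(simulate_Y(200_000, p), 1 / (1 - p * p))  # the two values should be close
```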