Convergence in mean-square of independent random variables


I have a question concerning the convergence in mean square of a sequence of independent random variables. Let $\left(X_{n} \right)_{n \geq 1}$ be a sequence of non-negative independent random variables satisfying $E{\left[X_{n} \right] } = 1/n^{2}$. Set $$ Y_{n} = \sum_{k=1}^{n}X_{k} \quad \text{and} \quad Y = \sum_{k=1}^{+\infty}X_{k}.$$ By the monotone convergence theorem, one can easily show that $$ E{\left[ Y - Y_{n} \right] } = E {\left[\sum_{k=n+1}^{\infty}X_{k}\right] } = \sum_{k=n+1}^{\infty}E{\left[ X_{k}\right]} = \sum_{k=n+1}^{\infty}{\frac{1}{k^{2}}} \to 0 \text{ as } n\to +\infty, $$ and therefore $Y_{n}$ converges to $Y$ in mean. Observe that we do not need the independence of the $X_{n}$'s in this argument.
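As a quick numerical sanity check (Python used purely for illustration), the tail sums $\sum_{k=n+1}^{\infty} 1/k^{2}$ can be approximated by truncating the series at a large cutoff, and they do shrink to $0$:

```python
# Approximate E[Y - Y_n] = sum_{k=n+1}^infty 1/k^2 by truncating at K.
def tail(n, K=10**6):
    return sum(1.0 / k**2 for k in range(n + 1, K + 1))

for n in (1, 10, 100, 1000):
    print(n, tail(n))  # decays roughly like 1/n for large n
```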

Here is my question: does $Y_{n}$ converge to $Y$ in mean square (assuming that the sequence $\left( X_{n} \right)$ is independent)? I suspect the answer is negative, and I am trying to find an example showing that $Y$ does not belong to $L^{2}$ in general. My idea so far is to find a sequence such that the sum of the variances of the $X_{n}$'s goes to $+\infty$.
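To probe this idea numerically, here is a sketch in Python of one candidate counterexample (my own construction, an assumption not taken from the post): take $X_{k} = k$ with probability $1/k^{3}$ and $X_{k} = 0$ otherwise, so that $E[X_{k}] = 1/k^{2}$ as required, while $\operatorname{Var}(X_{k}) = 1/k - 1/k^{4}$, whose partial sums diverge like $\log n$.

```python
import numpy as np

# Candidate counterexample (my construction, not from the post):
# X_k = k with probability 1/k^3, and X_k = 0 otherwise.
# Then E[X_k] = k * (1/k^3) = 1/k^2, matching the hypothesis,
# while E[X_k^2] = k^2 * (1/k^3) = 1/k, so Var(X_k) = 1/k - 1/k^4.
# By independence, Var(Y_n) = sum_{k<=n} Var(X_k), which diverges.

def var_Yn(n):
    """Exact variance of Y_n = X_1 + ... + X_n for this construction."""
    k = np.arange(1, n + 1, dtype=float)
    return float(np.sum(1.0 / k - 1.0 / k**4))

for n in (10, 100, 1000, 10000):
    print(n, var_Yn(n))  # grows roughly like log(n)
```

If this construction is right, then since $\|Y_{m} - Y_{n}\|_{2}^{2} \ge \operatorname{Var}(Y_{m} - Y_{n}) = \sum_{k=n+1}^{m} \operatorname{Var}(X_{k})$ by independence, a divergent variance series means $(Y_{n})$ is not Cauchy in $L^{2}$ and hence cannot converge in mean square.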

Any help would be appreciated. Thank you in advance.


There is 1 solution below.


A related question: does $Y_n$ converge almost surely to $Y$? By Markov's inequality, can we say that $$ P\left(\sum_{i=n+1}^{\infty} X_i > \epsilon\right) \le \frac{E\left[\sum_{i=n+1}^{\infty} X_i\right]}{\epsilon} = \frac{1}{\epsilon}\sum_{i=n+1}^{\infty}\frac{1}{i^2} \to 0, $$ and then that $\sum_{n=1}^{\infty} P\left(\sum_{i=n+1}^{\infty} X_i > \epsilon\right) < \infty$, so that $Y_n$ converges to $Y$ almost surely by the Borel–Cantelli lemma?