Convergence in probability of $Y_n = \frac{2}{n(n+1)} \sum_{j=1}^{n} jX_j$, given $X_1,X_2,...$ iid


I'm trying to understand my teacher's solution to this problem:

Let $X_1,X_2,...$ be a sequence of iid random variables with expected value $\mu$. Define $Y_n = \frac{2}{n(n+1)} \sum_{j=1}^{n} jX_j$. Prove that $Y_n \to_\mathbb{P} \mu$.

His solution:

Let $\sigma^2 = \rm{Var}(X_i)$.

$E(Y_n) = \frac{2}{n(n+1)} \sum_{j=1}^{n} jE(X_j) = \frac{2}{n(n+1)} \cdot \frac{n(n+1)}{2}\mu = \mu$.

$\displaystyle\rm{Var}(Y_n) = \frac{4}{n^2(n+1)^2} \sum_{j=1}^{n} j^2 \rm{Var}(X_j) = \frac{4\sigma^2}{n^2(n+1)^2}\bigg(\frac{n(n+1)(2n+1)}{6}\bigg) = \frac{2\sigma^2}{3}\bigg(\frac{2n+1}{n(n+1)}\bigg) \leq \frac{2\sigma^2}{3}\bigg(\frac{2}{3}\bigg) = \frac{4\sigma^2}{9} ,\forall n$

Then it follows by Tchebychev's Weak Law of Large Numbers.
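As a quick numerical sanity check of the claimed convergence (a simulation sketch, not a proof; the choice $X_j \sim \text{Exponential}(1)$, so $\mu = 1$, is an assumption for illustration):

```python
import numpy as np

# Simulation sketch (not a proof): draw iid X_j ~ Exponential(1), so mu = 1,
# form Y_n = 2/(n(n+1)) * sum_j j*X_j, and watch Y_n concentrate around mu.
rng = np.random.default_rng(0)

def sample_Y(n, reps=2000):
    # each row of X is one independent realization of (X_1, ..., X_n)
    X = rng.exponential(scale=1.0, size=(reps, n))
    j = np.arange(1, n + 1)
    return (2.0 / (n * (n + 1))) * (X * j).sum(axis=1)

for n in (10, 100, 1000):
    Y = sample_Y(n)
    print(f"n={n}: mean={Y.mean():.4f}, var={Y.var():.5f}")
```

The sample mean stays near $\mu = 1$ while the sample variance of $Y_n$ shrinks as $n$ grows, which is exactly what convergence in probability via a vanishing variance looks like.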

My doubt is in the last step. Isn't Tchebychev's WLLN only true for uncorrelated random variables? In this case, $\mathbb{E}(Y_n Y_m) \neq \mu^2$, since they aren't independent, so $\rm{Cov}(Y_n,Y_m) \neq 0$, right?

Thanks.

Edit: As u/clarinetist noted, there's a problem in this exercise: the WLLN states the convergence of the arithmetic mean of a sequence of random variables, and here the summands $\frac{2}{n+1}jX_j$ don't have expected value $\mu$.


There are 2 answers below.

Best answer:

Note that the WLLN is based on an arithmetic mean of random variables.

Thus, what we should be really focusing on is

$$Y^{\prime}_j=\dfrac{2}{n+1}jX_j$$ from which $\bar{Y}_n = Y_n = \dfrac{1}{n} \sum_{j=1}^{n}Y^{\prime}_j$.

Observe $$\begin{align} \text{Cov}(Y_j^{\prime}, Y_k^{\prime}) &= \text{Cov}\left(\dfrac{2}{n+1}jX_j, \dfrac{2}{n+1}kX_k \right) \\ &= \dfrac{4}{(n+1)^2}jk\,\text{Cov}(X_j, X_k) \\ &= 0 \end{align}$$ for $j \neq k$. I question whether or not it's valid to use the WLLN here, since obviously $\mathbb{E}[Y^{\prime}_j] \neq \mu$; in fact, the $Y^{\prime}_j$ aren't even iid.
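A quick numeric check of this covariance claim (a sketch; fixing $n = 50$ and assuming, say, $X_j \sim N(2,1)$ for illustration):

```python
import numpy as np

# Sanity-check sketch: for fixed n, the columns of Yp below are realizations of
# Y'_j = (2/(n+1)) * j * X_j. Independence of the X_j should make the sample
# covariance between different columns near 0. (X_j ~ N(2, 1) is an assumption.)
rng = np.random.default_rng(1)
n, reps = 50, 200_000
X = rng.normal(loc=2.0, scale=1.0, size=(reps, n))
Yp = (2.0 / (n + 1)) * np.arange(1, n + 1) * X  # column j holds draws of Y'_{j+1}

cov_35 = np.cov(Yp[:, 2], Yp[:, 4])[0, 1]  # sample Cov(Y'_3, Y'_5), j != k
var_3 = Yp[:, 2].var(ddof=1)               # sample Var(Y'_3) = (2*3/(n+1))^2 * 1
print(cov_35, var_3)
```

The off-diagonal sample covariance is near 0, while the variance of a single $Y^{\prime}_j$ matches $\left(\frac{2j}{n+1}\right)^2\sigma^2$, confirming the terms are uncorrelated but not identically distributed.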

Second answer:

Your teacher's solution was almost correct. You do not need any WLLN: their proof shows $Y_n\to \mu$ in probability directly.

In order to show $Y_n\to \mu$ in probability, it suffices to show the stronger statement that $\Bbb E(Y_n-\mu)^2\to 0$: by Chebyshev's inequality, $\Bbb P(|Y_n-\mu|\ge \varepsilon)\le \Bbb E(Y_n-\mu)^2/\varepsilon^2$ for every $\varepsilon>0$. Since $\mu=\Bbb EY_n$, this is equivalent to showing $\text{Var }Y_n\to 0$. Now, look where they wrote $$ \text{Var }Y_n=\dots=\frac23\sigma^2\frac{2n+1}{n(n+1)}\color{red}\le \frac23\sigma^2\left(\frac23\right) $$ The $\color{red}\le$ part was too sloppy a bound; a constant bound on the variance is not enough. But once they had $\text{Var }Y_n=\frac23\sigma^2\frac{2n+1}{n(n+1)}$, the fact that $\frac{2n+1}{n(n+1)}\to 0$ as $n\to\infty$ proves that $\text{Var }Y_n\to 0$, clinching the proof.
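To see this corrected last step numerically, one can evaluate the exact variance formula from the derivation and compare it with a Monte Carlo estimate (a sketch; the choice $X_j \sim N(0,1)$, so $\mu = 0$ and $\sigma^2 = 1$, is an assumption):

```python
import numpy as np

# Exact formula from the derivation: Var(Y_n) = (2/3) * sigma^2 * (2n+1)/(n(n+1)).
def var_formula(n, sigma2=1.0):
    return (2.0 / 3.0) * sigma2 * (2 * n + 1) / (n * (n + 1))

# Chebyshev then gives P(|Y_n - mu| > eps) <= var_formula(n) / eps^2 -> 0.
rng = np.random.default_rng(2)
n, reps = 200, 100_000
X = rng.normal(size=(reps, n))  # mu = 0, sigma^2 = 1 (assumed distribution)
Y = (2.0 / (n * (n + 1))) * (X * np.arange(1, n + 1)).sum(axis=1)
print(var_formula(n), Y.var())  # exact vs empirical variance, should roughly agree
print([var_formula(m) for m in (10, 100, 1000)])  # decreasing toward 0
```

The formula tends to 0 like $\frac{4}{3n}$, so the Chebyshev bound on $\Bbb P(|Y_n-\mu|>\varepsilon)$ vanishes for every fixed $\varepsilon$, which is the convergence in probability being claimed.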