I'm trying to understand my teacher's solution to this problem:
Let $X_1,X_2,\dots$ be a sequence of iid random variables with expected value $\mu$. Define $Y_n = \frac{2}{n(n+1)} \sum_{j=1}^{n} jX_j$. Prove that $Y_n \to_\mathbb{P} \mu$.
His solution:
Let $\sigma^2 = \operatorname{Var}(X_i)$ (implicitly assumed finite).
$E(Y_n) = \frac{2}{n(n+1)} \sum_{j=1}^{n} jE(X_j) = \frac{2}{n(n+1)} \cdot \frac{n(n+1)}{2}\,\mu = \mu$.
$\displaystyle\operatorname{Var}(Y_n) = \frac{4}{n^2(n+1)^2} \sum_{j=1}^{n} j^2 \operatorname{Var}(X_j) = \frac{4\sigma^2}{n^2(n+1)^2}\bigg(\frac{n(n+1)(2n+1)}{6}\bigg) = \frac{2\sigma^2}{3}\bigg(\frac{2n+1}{n(n+1)}\bigg) \leq \frac{2\sigma^2}{3}\bigg(\frac{3}{2}\bigg) = \sigma^2, \quad \forall n$

(the factor $\frac{2n+1}{n(n+1)}$ is decreasing in $n$ and takes its largest value, $\frac{3}{2}$, at $n=1$).
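As a sanity check on the variance formula (a sketch only; the choice of standard normal $X_j$, so $\mu = 0$ and $\sigma^2 = 1$, is mine, not from the problem), a Monte Carlo estimate of $\operatorname{Var}(Y_n)$ matches $\frac{2\sigma^2}{3}\cdot\frac{2n+1}{n(n+1)}$:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 50
sigma2 = 1.0        # Var(X_j) for standard normal X_j (illustrative choice)
trials = 200_000

# Simulate Y_n = 2/(n(n+1)) * sum_{j=1}^n j*X_j over many independent runs.
j = np.arange(1, n + 1)
X = rng.standard_normal((trials, n))
Y = (2.0 / (n * (n + 1))) * (X * j).sum(axis=1)

empirical_var = Y.var()
formula_var = (2 * sigma2 / 3) * (2 * n + 1) / (n * (n + 1))
print(empirical_var, formula_var)  # the two should be close
```

Note how small $\operatorname{Var}(Y_n)$ already is at $n=50$ (about $0.026$ by the formula); it decays like $\frac{4\sigma^2}{3n}$.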
Then it follows by Tchebychev's Weak Law of Large Numbers.
My doubt is in the last step. Isn't Tchebychev's WLLN only valid for uncorrelated random variables? In this case $\mathbb{E}(Y_n Y_m) \neq \mu^2$, since the $Y_n$ aren't independent, so $\operatorname{Cov}(Y_n,Y_m) \neq 0$, right?
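For what it's worth, the $Y_n$ are indeed positively correlated: for $n < m$, $\operatorname{Cov}(Y_n, Y_m) = \frac{4\sigma^2}{n(n+1)m(m+1)}\sum_{j=1}^{n} j^2$, which is $\frac{1}{15}$ for $n=5$, $m=10$, $\sigma^2=1$. A quick simulation (again with hypothetical standard normal $X_j$) agrees:

```python
import numpy as np

rng = np.random.default_rng(1)
trials = 200_000
n, m = 5, 10

# Y_n and Y_m are built from the SAME underlying sequence X_1, ..., X_m.
X = rng.standard_normal((trials, m))
j = np.arange(1, m + 1)

Y_n = (2.0 / (n * (n + 1))) * (X[:, :n] * j[:n]).sum(axis=1)
Y_m = (2.0 / (m * (m + 1))) * (X * j).sum(axis=1)

emp_cov = np.cov(Y_n, Y_m)[0, 1]
# Exact value: 4/(n(n+1)m(m+1)) * sum_{j<=n} j^2 = 4*55/3300 = 1/15 here
print(emp_cov)
```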
Thanks.
Edit: As u/clarinetist noted, there's a problem with this exercise: the WLLN states the convergence of the arithmetic mean of a sequence of random variables, but here the summands $jX_j$ don't have expected value $\mu$.
Note that the WLLN is based on an arithmetic mean of random variables.
Thus, what we should be really focusing on is
$$Y^{\prime}_j=\dfrac{2}{n+1}jX_j$$ from which $\bar{Y}_n = Y_n = \dfrac{1}{n} \sum_{j=1}^{n}Y^{\prime}_j$.
Observe $$\begin{align} \operatorname{Cov}(Y_j^{\prime}, Y_k^{\prime}) &= \operatorname{Cov}\left(\dfrac{2}{n+1}jX_j, \dfrac{2}{n+1}kX_k \right) \\ &= \dfrac{4jk}{(n+1)^2}\operatorname{Cov}(X_j, X_k) \\ &= 0 \quad (j \neq k). \end{align}$$ I question whether or not it's valid to use the WLLN here, since obviously $\mathbb{E}[Y^{\prime}_j] \neq \mu$ in general; in fact, the $Y^{\prime}_j$ aren't even identically distributed (though they are independent).
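One way to sidestep the question of which form of the WLLN applies (assuming $\sigma^2 < \infty$, as the solution already does) is to apply Chebyshev's inequality directly to $Y_n$: since $E(Y_n) = \mu$ and $\operatorname{Var}(Y_n) \to 0$,

$$P(|Y_n - \mu| \geq \varepsilon) \leq \frac{\operatorname{Var}(Y_n)}{\varepsilon^2} = \frac{2\sigma^2}{3\varepsilon^2}\cdot\frac{2n+1}{n(n+1)} \xrightarrow{n\to\infty} 0$$

for every $\varepsilon > 0$, which is exactly $Y_n \to_\mathbb{P} \mu$, with no appeal to uncorrelatedness of the $Y_n$ (or of the $Y^{\prime}_j$) needed.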