I'm stuck on a problem and don't know how to continue. The problem is as follows:
"Let $X_1, X_2, \ldots, X_n$ be a sample with mean $\mu$ and variance $\sigma^2$ and consider the linear estimator $\displaystyle L_n = \sum_{k=1}^{n} a_k X_k$
where the $a_k$ are nonnegative and sum to 1. (a) Show that $L_n$ is unbiased, (b) show that $L_n$ is consistent."
So, part (a) was pretty straightforward. It is part (b) I'm struggling with.
I have a proposition in my textbook that says that if $Var(L) \longrightarrow 0$ as $n \longrightarrow \infty$, then $L$ is consistent. The proof makes use of the Chebyshev inequality.
So, my attempt goes like this:
$Var(L) = Var(\sum_{k=1}^{n}a_kX_k) = \sum_{k=1}^{n}Var(a_kX_k)$ because the individual observations are independent. Moreover,
$Var(L) = \sum_{k=1}^{n}Var(a_kX_k) = \sum_{k=1}^{n}a_k^2Var(X_k) = \sigma^2\sum_{k=1}^{n}a_k^2$
and now I don't know how to continue. I don't know how $\sum_{k=1}^{n}a_k^2$ behaves as $n \longrightarrow \infty$. All I know is that $\sum_{k=1}^{n}a_k = 1$, and presumably this should be enough to solve the problem. However, I don't see how I can use this fact in my attempted solution above.
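To make sure the variance formula I derived above is at least correct, I checked it with a quick simulation (the weights, sample size, and normal distribution here are arbitrary choices just for the check):

```python
import numpy as np

rng = np.random.default_rng(0)

n = 5
sigma2 = 4.0
# arbitrary nonnegative weights summing to 1 (just for this check)
a = np.array([0.4, 0.3, 0.15, 0.1, 0.05])

# draw many independent samples X_1, ..., X_n and form L = sum a_k X_k
X = rng.normal(loc=1.0, scale=np.sqrt(sigma2), size=(200_000, n))
L = X @ a

print(L.var())                 # empirical Var(L)
print(sigma2 * (a**2).sum())   # sigma^2 * sum a_k^2
```

The two printed values agree closely, so the formula $Var(L) = \sigma^2\sum_{k=1}^{n}a_k^2$ seems right; the issue is purely what happens to $\sum_{k=1}^{n}a_k^2$ as $n$ grows.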
I was hoping someone could give me a hint on how to continue from here.
Thanks!