I need to solve the following problem.
Let $X_1,X_2,\dots$ be independent random variables, each with expectation $0$ and variance bounded by $M$. Prove that $\frac{1}{n}\sum\limits_{k=1}^{n} X_{k} \buildrel P \over \to 0$.
$\underline{\text{SOLUTION}}$
Since $E(X_k) = 0$ and $E(X_k^2) \le M < \infty$, the second-order Taylor expansion of the characteristic function gives, as $t \to 0$,
$\phi_{X_{k}}(t) = 1 - \frac{E(X_{k}^2)}{2}\cdot t^2 + o(t^2)$
By independence of the $X_i$'s, we get
$\phi_{\sum\limits_{k=1}^{n} X_{k}}(t) = \prod_{k=1}^{n} \bigg[1 - \frac{E(X_{k}^2)}{2}\cdot t^2 + o(t^2)\bigg]$
and hence
$\phi_{\frac{1}{n}\sum\limits_{k=1}^{n} X_{k}}(t) = \phi_{\sum\limits_{k=1}^{n} X_{k}}\big(\tfrac{t}{n}\big) = \prod_{k=1}^{n} \bigg[1 - \frac{E(X_{k}^2)}{2}\cdot \big(\tfrac{t}{n}\big)^2 + o\big(\tfrac{t^2}{n^2}\big)\bigg] \tag{1}$
Now I have to show that $(1) \to \phi_0(t) = E(e^{it\cdot 0}) = 1$ as $n \to \infty$ for every fixed $t$.
But how would I do that?
Thanks in advance.
This is the weak law of large numbers for independent (not necessarily identically distributed) variables with uniformly bounded variances, and it follows directly from Chebyshev's inequality; no characteristic functions are needed. By independence,
$$\operatorname{Var}\Big(\frac{1}{n}\sum_{k=1}^{n} X_k\Big) = \frac{1}{n^2}\sum_{k=1}^{n}\operatorname{Var}(X_k) \le \frac{M}{n},$$
so for every $\varepsilon > 0$,
$$P\Big(\Big|\frac{1}{n}\sum_{k=1}^{n} X_k\Big| \ge \varepsilon\Big) \le \frac{M}{n\varepsilon^2} \to 0 \quad (n \to \infty).$$
If you prefer to finish your computation in $(1)$: write $\sigma_k^2 = E(X_k^2) \le M$, so each factor is $1 + z_k$ with $z_k = -\frac{\sigma_k^2 t^2}{2n^2} + o\big(\tfrac{t^2}{n^2}\big)$. For $n$ large enough that $|z_k| \le \tfrac12$, using $|\log(1+z)| \le 2|z|$,
$$\Big|\log \prod_{k=1}^{n}(1+z_k)\Big| \le 2\sum_{k=1}^{n}|z_k| \le \frac{M t^2}{n} + o\big(\tfrac{t^2}{n}\big) \to 0$$
(one needs a uniform-in-$k$ bound on the $o$-remainders here, which takes a bit more care). Hence $(1) \to 1 = \phi_0(t)$ for every $t$, and Lévy's continuity theorem gives convergence in distribution to $0$; since the limit is a constant, this is equivalent to convergence in probability.
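Not a proof, but you can see the Chebyshev bound at work numerically. Below is a small Monte Carlo sketch (all parameter choices here are my own illustration, not part of the problem): independent $X_k \sim \mathrm{Uniform}(-a_k, a_k)$ with half-widths $a_k$ chosen so that $\operatorname{Var}(X_k) = a_k^2/3 \le M = 1$, estimating $P\big(|\frac1n\sum X_k| \ge \varepsilon\big)$ for growing $n$.

```python
import numpy as np

# Illustrative sketch: independent, non-identically distributed X_k with
# mean 0 and Var(X_k) <= M = 1. Choices of distribution, eps, and trial
# count are assumptions for the demo, not taken from the problem.
rng = np.random.default_rng(0)
M = 1.0
eps = 0.1
trials = 5_000

def tail_prob(n):
    """Monte Carlo estimate of P(|mean of X_1..X_n| >= eps)."""
    # Half-widths a_k with Var(X_k) = a_k^2 / 3 <= M (hypothetical choice).
    a = np.sqrt(3 * M) * rng.uniform(0.2, 1.0, size=n)
    X = rng.uniform(-a, a, size=(trials, n))  # each row is one trial
    means = X.mean(axis=1)
    return np.mean(np.abs(means) >= eps)

for n in (10, 100, 1000):
    # Chebyshev guarantees tail_prob(n) <= M / (n * eps^2).
    print(n, tail_prob(n), M / (n * eps**2))
```

The empirical tail probability drops toward $0$ as $n$ grows, and stays below the Chebyshev bound $M/(n\varepsilon^2)$ wherever that bound is nontrivial.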