Let $X_{1},X_{2},\dots$ be a sequence of independent random variables with $E(X_i)=0$ and $Var(X_i)=\sigma_{i}^{2}$, where $\lim_{n \to \infty} \sum_{i=1}^{n} \frac{\sigma_{i}^{2}}{n^{2}} = 0$.
Prove that for all $\epsilon > 0$, $ P \left(\frac{ |X_{1}+\dots+X_{n}|}{n} > \epsilon \right) \rightarrow 0$.
My answer is:
Let $Y=\sum_{i=1}^{n} X_{i}$. Then $E(Y)=\sum_{i=1}^{n}E(X_i)=0$ and, by independence, $Var(Y)=\sum_{i=1}^{n}Var(X_i)=\sum_{i=1}^{n}\sigma_{i}^{2}$.
By Chebyshev's inequality, for any $t>0$:
$P(|Y-0| \geq t ) \leq \frac{\sum_{i=1}^{n}\sigma_{i}^{2}}{t^{2}}$, and taking $t=n\epsilon$:
$P\left( \frac{|Y|}{n} \geq \epsilon \right) \leq \frac{1}{\epsilon^{2}} \sum_{i=1}^{n}\frac{\sigma_{i}^{2}}{n^{2}}$
The right-hand side tends to $0$ by the summation hypothesis, so
$\implies \ P \left(\frac{ |X_{1}+\dots+X_{n}|}{n} > \epsilon \right) \rightarrow 0 $
This is exactly where the summation hypothesis is used: without it, $\sum_{i=1}^{n}\sigma_{i}^{2}$ need not be $o(n^{2})$ and the Chebyshev bound would not vanish.
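As a quick numerical sanity check (an illustration, not part of the proof), here is a small simulation. For concreteness I take i.i.d. standard normals ($\sigma_i = 1$ for all $i$, my own choice), which satisfy the hypothesis since $\sum_{i=1}^{n} \sigma_i^2/n^2 = 1/n \to 0$:

```python
import random

# Estimate P(|X_1 + ... + X_n|/n > eps) empirically for i.i.d. N(0,1)
# variables. The frequency should shrink as n grows.
random.seed(0)

def tail_freq(n, eps, trials=500):
    """Empirical frequency of the event |S_n|/n > eps over many trials."""
    hits = 0
    for _ in range(trials):
        s = sum(random.gauss(0.0, 1.0) for _ in range(n))
        if abs(s) / n > eps:
            hits += 1
    return hits / trials

eps = 0.1
p_small_n = tail_freq(100, eps)    # S_n/n has sd 0.1 here, so this is sizable
p_large_n = tail_freq(2500, eps)   # sd 0.02: the event is ~5 sigma, very rare

print(p_small_n, p_large_n)
```

The frequency for $n=2500$ comes out far smaller than for $n=100$, consistent with the bound $\frac{1}{n\epsilon^{2}}$ in this i.i.d. case.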
I need to use this result to prove that if $X_{i} \sim Ber(p_{i})$, then for all $\epsilon > 0$, $P\left( \left| \frac{X_1+\dots+X_{n}}{n} - p(n) \right| \leq \epsilon \right) \to 1$ as $n \to \infty$, where $p(n)= \frac{1}{n}\sum_{i=1}^{n}p_i$.
With the hint of @Alan I think that the answer is:
Let $Y_{i}=X_{i}-p_{i}$; then $E(Y_{i})=E(X_{i})-p_{i}=p_i - p_i = 0$.
Also $Var(Y_i)=Var(X_i)=p_i(1-p_i)=\sigma_{i}^{2}$. Since $p_i(1-p_i) \leq \frac{1}{4}$, we get $\sum_{i=1}^{n}\frac{\sigma_{i}^{2}}{n^{2}} \leq \frac{n/4}{n^{2}} = \frac{1}{4n} \to 0$, so the $Y_i$ satisfy the hypothesis of the first result.
By the first result, $P\left( \frac{|Y_1+\dots+Y_n|}{n} > \epsilon \right) \to 0$
$\implies P\left( \frac{|X_1+\dots+X_n - (p_1 + \dots + p_n)|}{n} > \epsilon \right) \to 0$, and since $n>0$ we can move the $n$ inside the absolute value: $P\left( \left|\frac{X_1+\dots+X_n - (p_1 + \dots + p_n)}{n}\right| > \epsilon \right) \to 0 $
Finally, taking the complement:
$\implies P\left( \left|\frac{X_1+\dots+X_n }{n} - p(n)\right| \leq \epsilon \right) \to 1 $
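To see the Bernoulli statement numerically as well (purely illustrative; the alternating $p_i$ pattern below is my own choice, not from the problem), one can estimate the coverage frequency $P\left(\left|\frac{X_1+\dots+X_n}{n} - p(n)\right| \leq \epsilon\right)$ and watch it approach $1$:

```python
import random

# Bernoulli case: p_i alternates between 0.2 and 0.8, so p(n) = 0.5 and
# Var(X_i) = p_i(1 - p_i) = 0.16 <= 1/4, hence the hypothesis holds.
random.seed(1)

def coverage_freq(n, eps, trials=1000):
    """Empirical frequency of the event |(X_1+...+X_n)/n - p(n)| <= eps."""
    ps = [0.2 if i % 2 == 0 else 0.8 for i in range(n)]
    p_bar = sum(ps) / n
    hits = 0
    for _ in range(trials):
        mean = sum(1 if random.random() < p else 0 for p in ps) / n
        if abs(mean - p_bar) <= eps:
            hits += 1
    return hits / trials

eps = 0.05
f_small_n = coverage_freq(100, eps)    # sd of the sample mean is 0.04 here
f_large_n = coverage_freq(2500, eps)   # sd 0.008: coverage is near certain

print(f_small_n, f_large_n)
```

The frequency for $n=2500$ comes out essentially $1$, while for $n=100$ it is noticeably below $1$, matching the claimed limit.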