Show that $\frac{S_n - \mathbb{E}S_n}{\sigma \sqrt{n}} \to \mathcal{N}(0,1)$ in distribution.


Let $\left(X_{n}\right)$ be a sequence of independent random variables such that $\operatorname{Var}(X_n) = \sigma^2$ for every $n$. Assume there exists $c>0$ such that for all $1 \leq k \leq n$, $X_{k}-\mathbb{E}\left[X_{k}\right] \leq c$ almost surely. Define

$$S_{n}=\sum_{k=1}^{n} X_{k} \quad \text { and } \quad v_{n}=\sum_{k=1}^{n} \operatorname{Var}\left(X_{k}\right) = n\sigma^2.$$

Show that if $\mathbb{P}\left(S_{n}-\mathbb{E}\left[S_{n}\right] \geq x\right) \leq \exp \left(-\frac{x^{2}}{2\left(v_{n}+x c / 3\right)}\right)$ then $\frac{S_n - \mathbb{E}S_n}{\sigma \sqrt{n}}$ converges to a standard normal variable.

Assume also that $x = o(n)$.
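Before trying a proof, it may help to see the hypothesis numerically. The sketch below is a quick sanity check of the stated Bernstein-type bound $\mathbb{P}(S_n - \mathbb{E}[S_n] \geq x) \leq \exp\left(-\frac{x^2}{2(v_n + xc/3)}\right)$ for one concrete bounded family: $X_k \sim \operatorname{Bernoulli}(p)$, so that $X_k - \mathbb{E}[X_k] \leq 1-p =: c$ almost surely and $\sigma^2 = p(1-p)$. The choice of Bernoulli variables and all numerical values are illustrative assumptions, not part of the problem statement.

```python
import numpy as np

# Empirically compare P(S_n - E[S_n] >= x) against the Bernstein-type
# bound exp(-x^2 / (2 (v_n + x c / 3))) for X_k ~ Bernoulli(p).
# Here X_k - E[X_k] <= 1 - p =: c a.s. and Var(X_k) = p(1-p) = sigma^2.
rng = np.random.default_rng(0)
n, p, trials = 200, 0.3, 100_000
c = 1 - p
v_n = n * p * (1 - p)          # v_n = n * sigma^2

S = rng.binomial(n, p, size=trials)   # one sample of S_n per trial
dev = S - n * p                       # S_n - E[S_n]

for x in (5.0, 10.0, 15.0):
    empirical = np.mean(dev >= x)
    bound = np.exp(-x**2 / (2 * (v_n + x * c / 3)))
    print(f"x={x:5.1f}  empirical={empirical:.4f}  bound={bound:.4f}")
    assert empirical <= bound         # the bound should dominate the tail
```

In every run the empirical tail sits well below the bound, consistent with the inequality holding for bounded centered increments.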

Normally I would just apply the central limit theorem directly, but the fact that I wouldn't be using all the assumptions makes me think it can't be done in such a straightforward fashion.

Any help will be greatly appreciated.