Application of CLT for bounded random variables with slowly decreasing variance


I am trying to solve the following exercise:

Let $(X_k)_{k\in \mathbb N}$ be a sequence of independent random variables and $M> 0$ such that $$\forall n\in \mathbb N: |X_n| \leq M$$ and $\sum_{k\in \mathbb N}\mathrm{Var}(X_k) = \infty$. Define $S_n = \sum_{k=1}^n X_k$. Prove that $$\frac{S_n-\mathbb E[S_n]}{\sqrt{\mathrm{Var}(S_n)}} \longrightarrow \mathcal N(0,1)$$ in distribution.
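As a numerical sanity check (not part of the proof), here is a small simulation with one hypothetical choice of $X_k$: scaled Rademacher variables $X_k = k^{-1/4}R_k$ with $R_k = \pm 1$, so that $|X_k| \leq 1 = M$, $\mathrm{Var}(X_k) = k^{-1/2}$ decreases slowly, and $\sum_k \mathrm{Var}(X_k) = \infty$. The standardized sums should then look approximately standard normal.

```python
import random
import math

random.seed(0)

n = 1000        # number of summands per sum
trials = 2000   # independent replications of S_n

# X_k = k^{-1/4} * R_k with R_k = +/-1 (Rademacher):
# |X_k| <= 1, Var(X_k) = k^{-1/2}, and sum_k Var(X_k) diverges.
scales = [k ** -0.25 for k in range((1), n + 1)]
var_Sn = sum(s * s for s in scales)  # Var(S_n) = sum of k^{-1/2}

samples = []
for _ in range(trials):
    s = sum(sc * random.choice((-1.0, 1.0)) for sc in scales)
    samples.append(s / math.sqrt(var_Sn))  # E[S_n] = 0 for this choice

mean = sum(samples) / trials
var = sum(x * x for x in samples) / trials
# Empirical CDF at 0 and 1 should be near Phi(0) = 0.5 and Phi(1) ~ 0.8413
p0 = sum(x <= 0 for x in samples) / trials
p1 = sum(x <= 1 for x in samples) / trials
print(mean, var, p0, p1)
```

The printed empirical mean and variance should be close to $0$ and $1$, and the empirical CDF values close to the standard normal CDF, consistent with the claimed convergence in distribution.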

Attempt:
I don't think I can solve this using the "standard" CLT for i.i.d. random variables, since the $X_n$ need not be identically distributed.
So I try to use the Lindeberg CLT. To that end, define the triangular array $\left(\left(X_{n,k}\right)_{k=1}^n \right)_{n\in \mathbb N}$ by $$X_{n,k} := \frac{X_k - \mathbb E[X_k]}{\sqrt{\mathrm{Var}(S_n)}}.$$ One easily sees that this defines an independent, standardized triangular array. So if the Lindeberg condition holds, we may conclude that $$\sum_{l=1}^n X_{n,l} \longrightarrow \mathcal N(0,1)$$ in distribution. However, I have been unable to prove the Lindeberg condition so far. Is this even going in the right direction?
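For concreteness, taking $X_{n,k} = \frac{X_k - \mathbb E[X_k]}{\sqrt{\operatorname{Var}(S_n)}}$ (the centered individual summands in the numerator), the two properties of the array can be checked directly:

```latex
% Row sums recover the standardized partial sum, and each row is standardized:
\[
  \sum_{k=1}^{n} X_{n,k}
  = \frac{\sum_{k=1}^{n}\bigl(X_k - \mathbb E[X_k]\bigr)}{\sqrt{\operatorname{Var}(S_n)}}
  = \frac{S_n - \mathbb E[S_n]}{\sqrt{\operatorname{Var}(S_n)}},
  \qquad
  \sum_{k=1}^{n} \operatorname{Var}(X_{n,k})
  = \frac{\sum_{k=1}^{n} \operatorname{Var}(X_k)}{\operatorname{Var}(S_n)} = 1,
\]
% where Var(S_n) = sum_{k<=n} Var(X_k) by independence of the X_k.
```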


Best answer:

Hint: in order to check Lindeberg's condition, we have to treat terms of the form $\mathbb E\left[Y_j^2\mathbf 1\left\{ Y_j^2\gt \varepsilon \operatorname{Var}\left(S_n\right)\right\}\right]$, where $Y_j:=X_j-\mathbb E\left[X_j\right]$. Note that $\left\lvert Y_j\right\rvert$ is bounded by $2M$; hence, what can you say about $\mathbf 1\left\{ Y_j^2\gt \varepsilon \operatorname{Var}\left(S_n\right)\right\}$ when $n$ is sufficiently large?
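Spelling the hint out (one way to finish the argument, using only the stated hypotheses):

```latex
% |Y_j| <= |X_j| + |E[X_j]| <= 2M, so Y_j^2 <= 4M^2 almost surely.
% By independence, Var(S_n) = sum_{k<=n} Var(X_k) -> infinity, so for any
% fixed eps > 0 there is an N with eps * Var(S_n) > 4M^2 for all n >= N.
% For such n the indicator is identically zero, and the Lindeberg sum vanishes:
\[
  \frac{1}{\operatorname{Var}(S_n)}\sum_{j=1}^{n}
  \mathbb E\!\left[Y_j^2\,\mathbf 1\left\{Y_j^2 > \varepsilon\operatorname{Var}(S_n)\right\}\right]
  = 0 \qquad \text{for all } n \ge N,
\]
% so Lindeberg's condition holds and the Lindeberg CLT gives the claim.
```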