Proving that the Lindeberg condition Fails


I am having trouble proving the following result; thanks in advance for any help:

Let $\{X_{n}, n\geq 1\}$ be a sequence of independent random variables with $P(X_{n} = -n) = 1-1/n^{2}$ and $P(X_{n} = n^{3} - n) = 1/n^{2}$, $n \geq 1$. Prove that $\{X_{n}, n\geq1\}$ does not obey the Lindeberg condition.
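For concreteness, here is the moment computation I am working with (my own calculation, so please check it):

$$EX_{n} = -n\left(1-\frac{1}{n^{2}}\right) + \frac{n^{3}-n}{n^{2}} = -n + \frac{1}{n} + n - \frac{1}{n} = 0,$$

$$EX_{n}^{2} = n^{2}\left(1-\frac{1}{n^{2}}\right) + \frac{(n^{3}-n)^{2}}{n^{2}} = (n^{2}-1) + (n^{2}-1)^{2} = n^{2}(n^{2}-1),$$

so $\sum_{j=1}^{n} EX_{j}^{2} = \sum_{j=1}^{n} j^{2}(j^{2}-1) \sim n^{5}/5$. In particular each $X_{n}$ has mean zero, so no centering is needed in the Lindeberg–Feller normalization.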

There are a few things that I have noticed:

1. $\sum_{j = 1}^{n} X_{j} / n \rightarrow -\infty$ almost surely.

2. $EX_{n}^{2} /\sum_{j = 1}^{n}EX_{j}^{2}\rightarrow 0$ and $\sum_{j = 1}^{n}EX_{j}^{2} \rightarrow \infty$, so if I assume the Lindeberg condition holds, then by the Lindeberg–Feller CLT, $\sum_{j = 1}^{n} X_{j}/\sqrt{\sum_{j = 1}^{n}EX_{j}^{2}}$ converges in distribution to a $\mathcal{N}(0,1)$ random variable, and some contradiction should follow.

3. I imagine that I am trying to derive a contradiction with observation 1.
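In case it helps, here is how I would try to close the loop (my own sketch, so please check it). Since $\sum_{n} P(X_{n} = n^{3} - n) = \sum_{n} 1/n^{2} < \infty$, the Borel–Cantelli lemma gives $X_{n} = -n$ for all but finitely many $n$ almost surely, which yields observation 1. Writing $S_{n} = \sum_{j=1}^{n} X_{j}$, it follows that $S_{n} \rightarrow -\infty$ almost surely, hence

$$P(S_{n} \leq 0) \rightarrow 1.$$

But if the Lindeberg condition held, then by the Lindeberg–Feller CLT, $S_{n}/\sqrt{\sum_{j=1}^{n}EX_{j}^{2}} \Rightarrow \mathcal{N}(0,1)$, and since $0$ is a continuity point of the normal distribution,

$$P(S_{n} > 0) = P\left(S_{n}\Big/\sqrt{\textstyle\sum_{j=1}^{n}EX_{j}^{2}} > 0\right) \rightarrow \frac{1}{2},$$

contradicting the previous display. So the Lindeberg condition must fail.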