Does Lindeberg's condition imply $s_n \to \infty$?


Lindeberg's theorem states that if we have a sequence of independent random variables $X_j: \Omega \to \mathbb{R}$ with zero mean, variance $\sigma_j^2$, and distribution $\alpha_j$, and we define $s_n^2 = \sigma_1^2 + \dots + \sigma_n^2$, then the distribution of $\frac{X_1+\dots +X_n}{s_n}$ converges weakly to the standard normal ${\bf if}$ for all $\varepsilon>0$, $$ \lim_{n\to \infty} \frac{1}{s_n^2}\sum\limits_{j=1}^n \int\limits_{|x| > \varepsilon s_n} x^2 \, d\alpha_j(x) = 0. \qquad (\star) $$ If $\phi_j$ is the characteristic function of $X_j$, the proofs I have seen use some order of the Taylor expansion of $\log\phi_j\left(\frac{t}{s_n}\right)$ for $|t|<T$, which seems to tacitly assume that $s_n$ diverges. However, I could not verify that $(\star)$ implies this. Is this easy to see?
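As a sanity check that $(\star)$ is computable in concrete cases, here is a minimal numerical sketch for i.i.d. Rademacher variables $X_j = \pm 1$ (my own illustrative choice, not from the question), where $\sigma_j^2 = 1$ and $s_n^2 = n \to \infty$. Each $\alpha_j$ puts mass $\frac{1}{2}$ on $\pm 1$, so the $j$-th integral equals $1$ exactly when $1 > \varepsilon\sqrt{n}$, and vanishes once $n \geq 1/\varepsilon^2$; hence the Lindeberg sum is eventually $0$ for every fixed $\varepsilon > 0$.

```python
import math

def lindeberg_sum(n, eps):
    """(1/s_n^2) * sum_j int_{|x| > eps*s_n} x^2 dalpha_j
    for i.i.d. Rademacher laws (mass 1/2 on each of +1, -1).

    The j-th integral is 1 when the atoms |x| = 1 lie outside
    the truncation, i.e. when 1 > eps*sqrt(n), and 0 otherwise.
    """
    s_n = math.sqrt(n)                      # s_n^2 = n
    per_term = 1.0 if 1.0 > eps * s_n else 0.0
    return (n * per_term) / n               # divide the sum by s_n^2
```

For small $n$ the sum is $1$; as soon as $n \geq 1/\varepsilon^2$ it drops to $0$, which is $(\star)$ in its simplest form.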

Accepted answer:

Assume $s_n \not\to \infty$. Since $s_n$ is nondecreasing, it then converges to some finite limit $s \geq 0$; suppose first that $s \in (0,\infty)$.

To start, suppose $\sigma_j>0$ for all $j$, and take $\varepsilon=\frac{1}{2} \frac{\sigma_1}{s}$. Since $s_n \leq s$, we have $\varepsilon s_n \leq \frac{1}{2}\sigma_1$, so the first integral in the sum satisfies $$ \int\limits_{|x| > \varepsilon s_n} x^2 \, d\alpha_1(x) \geq \int\limits_{|x| > \sigma_1/2} x^2 \, d\alpha_1(x) \geq \sigma_1^2 - \left(\frac{\sigma_1}{2}\right)^2 = \frac{3}{4}\sigma_1^2 > 0, $$ where the second inequality holds because the integral of $x^2$ over $\{|x| \leq \sigma_1/2\}$ is at most $(\sigma_1/2)^2$. Moreover the prefactor $\frac{1}{s_n^2} \geq \frac{1}{s^2}$ does not tend to zero, so the whole expression stays bounded below by $\frac{3\sigma_1^2}{4s^2} > 0$ and the Lindeberg condition $(\star)$ fails for this $\varepsilon$.
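The failure above can be seen numerically. The following sketch uses a hypothetical concrete family (my choice, not from the answer): $X_j = \pm 2^{-j/2}$ with equal probability, so $\sigma_j^2 = 2^{-j}$, $s_n^2 = 1 - 2^{-n}$, and $s = 1$. With $\varepsilon = \frac{1}{2}\frac{\sigma_1}{s} = \frac{\sqrt{2}}{4}$ as in the answer, the Lindeberg sum stays bounded away from zero.

```python
import math

def lindeberg_sum_bounded(n, eps):
    """(1/s_n^2) * sum_j int_{|x| > eps*s_n} x^2 dalpha_j for the
    two-point laws X_j = +-2^(-j/2), where sigma_j^2 = 2^(-j) and
    s_n^2 = 1 - 2^(-n) stays bounded (s = 1).

    Each alpha_j is supported on the two atoms |x| = 2^(-j/2), so the
    j-th integral is the full variance 2^(-j) when that atom exceeds
    eps*s_n, and 0 otherwise.
    """
    s_n2 = sum(2.0**(-j) for j in range(1, n + 1))
    s_n = math.sqrt(s_n2)
    total = 0.0
    for j in range(1, n + 1):
        atom = 2.0**(-j / 2)        # the only points where |X_j| lives
        if atom > eps * s_n:
            total += 2.0**(-j)      # contributes its full variance
    return total / s_n2

eps = 0.25 * math.sqrt(2)  # eps = sigma_1 / (2*s), with sigma_1 = 2^(-1/2)
```

For every $n$ the sum is at least $\sigma_1^2 / s^2 = \frac{1}{2}$, so the limit in $(\star)$ cannot be $0$.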

If $\sigma_1=0$, run the same argument with any positive $\sigma_k$ in place of $\sigma_1$; some $\sigma_k$ must be positive, since $s>0$.

If instead $s_n \to 0$, then since $s_n$ is nondecreasing we must have $\sigma_j \equiv 0$, so every $X_j = 0$ almost surely and the normalized sum $\frac{X_1+\dots+X_n}{s_n}$ is not even defined; the conclusion of the CLT cannot hold (but this case is trivial, of course).