Lyapunov CLT for dependent random variables


Suppose $X_{1}, X_{2}, \ldots$ is a sequence of independent random variables, each with finite expected value $E[X_{i}]$ and finite variance $\operatorname{Var}[X_{i}]$. Define $$s_{d}^2 = \sum_{i=1}^{d} \operatorname{Var}[X_{i}].$$ If for some $\delta >0$ Lyapunov's condition holds, \begin{align*} &\lim_{d \to \infty} \frac{1}{s_{d}^{2+\delta}} \sum_{i=1}^{d} E\left[ |X_{i} - E[X_i]|^{2+\delta}\right] = 0, \qquad \text{then}\\ &\frac{1}{s_{d}} \sum_{i=1}^{d} (X_{i} - E[X_{i}]) \overset{\mathcal{D}}{\to} \mathcal{N}(0,1), \end{align*} as $d$ tends to infinity. Here $\overset{\mathcal{D}}{\to}$ denotes convergence in distribution.
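To make the statement concrete, here is a minimal numerical sketch (not part of the original question) of the independent case only: it takes hypothetical $X_i$ uniform on $[-a_i, a_i]$ with $a_i = i^{1/4}$, checks that the Lyapunov ratio with $\delta = 1$ is small, and compares the standardized sum against $\mathcal{N}(0,1)$. The choices of $a_i$, $d$, $\delta$, and the number of replications are illustrative assumptions.

```python
# Sketch: Lyapunov CLT for independent X_i ~ Uniform(-a_i, a_i), a_i = i^{1/4}.
# All parameter choices below are hypothetical, chosen so the condition holds.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
d = 2000        # number of summands (assumed)
delta = 1.0     # exponent in the Lyapunov condition (assumed)

# Half-widths a_i; for Uniform(-a, a): E[X] = 0, Var[X] = a^2 / 3,
# and E|X|^{2+delta} = a^{2+delta} / (3 + delta).
a = np.arange(1, d + 1) ** 0.25
var = a ** 2 / 3
s_d = np.sqrt(var.sum())

# Lyapunov ratio: should tend to 0 as d grows (here it scales like d^{-1/2}).
lyapunov_ratio = (a ** (2 + delta) / (3 + delta)).sum() / s_d ** (2 + delta)
print(f"Lyapunov ratio: {lyapunov_ratio:.4f}")

# Simulate the standardized sum and compare with a standard normal.
n_rep = 10_000
samples = rng.uniform(-a, a, size=(n_rep, d)).sum(axis=1) / s_d
print(f"sample mean = {samples.mean():.3f}, sample std = {samples.std():.3f}")
print("KS test vs N(0,1):", stats.kstest(samples, "norm"))
```

The sample mean and standard deviation should come out near 0 and 1, and the Kolmogorov-Smirnov test should not reject normality; this only illustrates the independent case stated above, not an answer to the dependent-variables question.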

Does a similar Lyapunov-type result exist for dependent random variables $X_1, \ldots, X_d$? Any help or lead would be highly appreciated.