Convergence to normal distribution without i.i.d.


Suppose that $X_n$ are independent with $\mathbb{P}(X_m = m) = \mathbb{P}(X_m=-m) = \frac{1}{2m^2}$ and, for $m\geq 2$, \begin{equation*} \mathbb{P}(X_m = 1) = \mathbb{P}(X_m=-1) = \frac{1-m^{-2}}{2}. \end{equation*} It is easy to see that $\frac{\mathrm{Var}(S_n)}{n} \longrightarrow 2$, but I cannot prove that $\frac{S_n}{\sqrt{n}} \Longrightarrow \mathcal{N}(0,1)$. Clearly the assumptions of the Lindeberg-Feller theorem do not hold.
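For what it's worth, a quick simulation is consistent with the claimed limit (a sketch in NumPy; the sample sizes `n` and `trials` are arbitrary choices): despite $\mathrm{Var}(S_n)/n \to 2$, the empirical coverage of $[-1.96, 1.96]$ by $S_n/\sqrt{n}$ comes out close to the standard normal value $0.95$.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 5_000       # summands per trial (arbitrary choice)
trials = 1_000  # independent copies of S_n (arbitrary choice)

m = np.arange(1, n + 1)

# Event {|X_m| = m} has probability 1/m^2; otherwise X_m = +-1.
# (For m = 1 both cases coincide: X_1 = +-1 with probability 1/2.)
u = rng.random((trials, n))
sign = rng.choice([-1.0, 1.0], size=(trials, n))
big = u < 1.0 / m**2
x = np.where(big, m * sign, sign)

s = x.sum(axis=1) / np.sqrt(n)  # samples of S_n / sqrt(n)

# If S_n/sqrt(n) => N(0,1), roughly 95% of samples land in [-1.96, 1.96].
frac = np.mean(np.abs(s) <= 1.96)
print(f"mean = {s.mean():.3f}, coverage of [-1.96, 1.96] = {frac:.3f}")
```

Note that the empirical second moment of these samples is a poor diagnostic here: it is dominated by the rare large jumps $|X_m| = m$, which is exactly why $\mathrm{Var}(S_n)/n \to 2$ does not show up in the distributional limit.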



Best answer

Set $Z_m = X_{m} 1_{\{|X_m| \le 1\}} + 1_{\{X_m>1\}}-1_{\{X_m<-1\}}$. Since $\mathbb{P}(|X_m| = m) = m^{-2}$, each $Z_m$ takes the values $\pm 1$ with probability $\frac{1}{2}$, so $(Z_i)_{i \in \mathbb{N}}$ is a sequence of i.i.d. random variables with $\mathbb{E}(Z_i) = 0$ and $\mathrm{Var}(Z_i) = 1$. Thus, by the central limit theorem, $$\frac{1}{\sqrt{n}}\sum_{k=1}^n Z_k \Rightarrow \mathcal{N}(0,1).$$

Let $S_n^{(1)} := \sum_{k=1}^n Z_k$ and $S_n^{(2)} := \sum_{k=1}^n R_k$, where $$R_k = X_k - Z_k = X_k 1_{\{|X_k|>1\}}+1_{\{X_k<-1\}}-1_{\{X_k>1\}}.$$ If we can show that $(\sqrt{n})^{-1} S_n^{(2)} \rightarrow 0$, e.g. in $L^1$ (and hence in probability), then Slutsky's theorem gives $$\frac{1}{\sqrt{n}} \sum_{k=1}^n X_k = \frac{1}{\sqrt{n}} S_n^{(1)} + \frac{1}{\sqrt{n}} S_n^{(2)} \Rightarrow \mathcal{N}(0,1)+0 =\mathcal{N}(0,1).$$

It remains to show that $(\sqrt{n})^{-1}\mathbb{E}|S_n^{(2)}| \rightarrow 0$. Note that $|R_k| \le (1+|X_k|) 1_{\{|X_k|>1\}} \le 2|X_k| 1_{\{|X_k|>1\}}$ and that $\mathbb{E}\big[|X_k| 1_{\{|X_k|>1\}}\big] = k \cdot k^{-2} = \frac{1}{k}$ for $k \ge 2$ (and $=0$ for $k=1$). Hence \begin{align*} \frac{1}{\sqrt{n}} \mathbb{E}| S_n^{(2)} | &\le \frac{2}{\sqrt{n}} \sum_{k=1}^n \mathbb{E}\big[|X_k| 1_{\{|X_k|>1\}}\big] = \frac{2}{\sqrt{n}} \sum_{k=2}^n \frac{1}{k} \le \frac{2\log(n)}{\sqrt{n}} \longrightarrow 0, \end{align*} where the last inequality follows by comparing the sum with $\int_1^{n} \frac{1}{x} \, dx = \log(n)$.
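To see how fast the remainder term actually vanishes, here is a small numeric check (a sketch; it uses the exact value $\mathbb{E}|R_k| = (k-1)/k^2$, which follows from the distribution above since $R_k = \pm(k-1)$ with probability $k^{-2}$ and $R_k = 0$ otherwise):

```python
import math

def e_abs_r(k: int) -> float:
    # R_k = X_k - Z_k equals +-(k-1) with probability 1/k^2, else 0,
    # so E|R_k| = (k-1)/k^2 exactly.
    return (k - 1) / k**2

ratios = []
for n in (10**2, 10**4, 10**6):
    # Triangle inequality: E|S_n^(2)| <= sum of E|R_k|.
    total = sum(e_abs_r(k) for k in range(1, n + 1))
    ratios.append(total / math.sqrt(n))
    print(f"n = {n:>7}:  E|S_n^(2)|/sqrt(n) <= {ratios[-1]:.4f}")
```

The printed bound decays like $\log(n)/\sqrt{n}$, in line with the estimate above.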