The converse of the Central Limit Theorem


The problem is the following: let $X_n$ be i.i.d. random variables and $\{b_n\}$ a sequence of positive real numbers such that $b_n \longrightarrow \infty$ and $$ (I) \qquad \frac{\sum_{k=1}^n X_k }{b_n} \Rightarrow\mathcal{N}(0,1), $$ where $\Rightarrow$ denotes convergence in distribution.

Does this imply that $X_n$ has a finite second moment? My intuition says yes, but I am not sure.

I will sketch my idea informally, since I am struggling to prove a few of the steps; at the end I list exactly which steps are giving me trouble.

First, for $m \in \mathbb{N}$, we split the sum into $m$ independent blocks that all converge to the same distribution: $$ (II) \qquad \frac{\sum_{k=1}^{mn} X_k }{b_{mn}} = \frac{b_{n}}{b_{mn}} \frac{\sum_{k=1}^{n} X_k }{b_{n}} + \cdots + \frac{b_{n}}{b_{mn}}\frac{\sum_{k=(m-1)n+1}^{mn} X_k }{b_{n}}. $$

As those blocks are independent, I could write the limit of the RHS as $$ \beta_m \xi_1+\cdots+ \beta_m \xi_m, $$ where $\beta_m := \lim_{n \to \infty}\frac{b_{n}}{b_{mn}}$ (assuming this limit exists) and the $\xi_i \sim \mathcal{N}(0,1)$ are independent. But evaluating the limit of the LHS of $(II)$, I have that $ \beta_m \xi_1+\cdots+ \beta_m \xi_m \sim \mathcal{N}(0,1) $, hence $\beta_m = m^{-1/2}$. Therefore I would have that

$$ (III)\qquad\lim_{n \to \infty}\frac{b_{n}}{b_{mn}} = \frac{1}{\sqrt{m}} $$
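Spelling out the variance matching behind $(III)$ (a short derivation of my own, not spelled out above):

```latex
% The m blocks are independent and each converges to N(0,1), so the limit
% of the RHS of (II) is a centered Gaussian with variance
\operatorname{Var}\!\left(\beta_m \xi_1 + \cdots + \beta_m \xi_m\right)
  = m\,\beta_m^{2}.
% The LHS of (II) converges to N(0,1), so the variances must match:
m\,\beta_m^{2} = 1
  \quad\Longrightarrow\quad
\beta_m = \frac{1}{\sqrt{m}}.
```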

I believe $(III)$ in turn implies $$ (IV)\qquad b_n = \sigma\sqrt{n}+o(\sqrt{n}) $$ for some $\sigma > 0$, i.e. $b_n/\sqrt{n} \to \sigma$.

So finally, we are basically reduced to the usual form $$ \frac{\sum_{k=1}^n X_k }{\sqrt{n}} \Rightarrow\mathcal{N}(0,\sigma^2). $$

Writing $S_n := \sum_{k=1}^n X_k$, we have that $$ E\Big[\frac{S^2_n}{n}\wedge M \Big] \longrightarrow E[\sigma^2 \xi^2 \wedge M] \text{ as } n \longrightarrow \infty $$ for all $M>0$, since $x \mapsto x \wedge M$ is bounded and continuous; in particular the RHS is bounded by $\sigma^2$ for every $M$. On the other hand, if $E[X_k^2]=\infty$, then by monotone convergence $$ E\Big[\frac{S^2_n}{n}\wedge M \Big] \longrightarrow E\Big[\frac{S^2_n}{n}\Big] = E[X_k^2] = \infty \text{ as } M \longrightarrow \infty $$ (assuming the $X_k$ are centered, so that $E[S_n^2] = n\,E[X_k^2]$).
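To see the first convergence numerically, here is a small Monte Carlo sketch of my own with Rademacher steps (so $\sigma = 1$); the function name and all parameters are illustrative choices, not part of the argument:

```python
import random

random.seed(1)

def truncated_second_moment(n, M, reps):
    """Monte Carlo estimate of E[ min(S_n^2 / n, M) ] for Rademacher steps X_k = +-1."""
    total = 0.0
    for _ in range(reps):
        s = sum(random.choice((-1, 1)) for _ in range(n))
        total += min(s * s / n, M)
    return total / reps

# For sigma = 1 the limit is E[min(xi^2, M)] with xi ~ N(0,1); for M = 4
# this is roughly 0.92 -- in particular it stays below sigma^2 = 1.
est = truncated_second_moment(n=256, M=4.0, reps=5000)
print(est)
```

Here the second moment is finite, so the truncated expectations stay bounded by $\sigma^2 = 1$ uniformly in $M$; the contradiction above arises precisely because this bound fails when $E[X_k^2]=\infty$.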

This, with a bit of care, can be shown to yield a contradiction; hence $E[X_k^2]<\infty$.

So to the parts that I am finding difficult:

  • Do the limits $\beta_m := \lim_{n \to \infty} b_n/b_{mn}$ appearing in $(II)$ actually exist, given $(I)$?
  • If not, does it exist in the case that $b_n e^{cn} \longrightarrow $ for all $c>0$?
  • Does $(III)$ imply $(IV)$?
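Regarding the last bullet, a quick numerical check (entirely my own illustration; the helper `ratio` is hypothetical) suggests the answer is no: $b_n = \sqrt{n \log n}$ also satisfies $(III)$, yet $b_n/\sqrt{n} \to \infty$, so $(III)$ alone cannot force $(IV)$. (Up to constants, $\sqrt{n\log n}$ is in fact the natural normalization for the $x^{-3}$ example in the answer below.)

```python
import math

def ratio(b, n, m):
    """Compute b_n / b_{mn} for a normalizing sequence given as a function b."""
    return b(n) / b(m * n)

b1 = lambda n: math.sqrt(n)                # b_n = sqrt(n)
b2 = lambda n: math.sqrt(n * math.log(n))  # b_n = sqrt(n log n)

m = 4  # target limit in (III) is 1/sqrt(m) = 0.5
for n in (10**4, 10**8, 10**12):
    print(n, ratio(b1, n, m), ratio(b2, n, m))  # both approach 0.5

# Yet b2(n)/sqrt(n) = sqrt(log n) -> infinity, so b2 satisfies (III)
# without being sigma*sqrt(n) + o(sqrt(n)) for any sigma.
print(b2(10**12) / math.sqrt(10**12))
```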


Best answer:

No, in general $X_n$ does not need to have a finite second moment. You can find the following statement in Kallenberg's book *Foundations of Modern Probability* (Theorem 4.17 or Theorem 5.17, depending on the edition).

Theorem. Let $(X_n)_{n \in \mathbb{N}}$ be i.i.d. non-degenerate real-valued random variables. Then $$a_n \sum_{k=1}^n (X_k-m_n) \xrightarrow[d]{n \to \infty} N(0,1) \tag{1}$$ for some constants $m_n$ and $a_n$ if and only if the function $$L(x) := \mathbb{E}(X_1^2 1_{|X_1| \leq x})$$ is slowly varying at infinity.

Note that a slowly varying function need not be bounded, and therefore $(1)$ does not, in general, imply finiteness of the second moment. Consider, for instance, i.i.d. random variables with distribution

$$\mu(dx) = c 1_{(1,\infty)}(x) \frac{1}{x^3} \, dx$$

where the normalizing constant $c>0$ is chosen in such a way that $\mu(\mathbb{R})=1$. Then

$$L(x) = \mathbb{E}(X_1^2 1_{|X_1| \leq x}) = c \int_1^x \frac{1}{x} \, dx = c \log x$$

for all $x \geq 1$, and therefore $L$ is slowly varying at infinity. Applying the above statement, we find that the central limit theorem $(1)$ holds for suitable sequences $a_n$ and $m_n$ (although $\mathbb{E}(X_1^2)=\infty$).
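As a sanity check on this example (my own illustration, not part of the answer): the normalizing constant works out to $c = 2$, since $\int_1^\infty x^{-3}\,dx = 1/2$; the CDF is then $F(x) = 1 - x^{-2}$, so inverse-transform sampling reproduces $L(x) = 2\log x$ empirically:

```python
import random, math

random.seed(0)

# mu(dx) = 2 x^{-3} dx on (1, inf) has CDF F(x) = 1 - x^{-2},
# so X = (1 - U)^{-1/2} with U uniform on [0, 1) has law mu.
def sample():
    return (1.0 - random.random()) ** -0.5

N = 200_000
xs = [sample() for _ in range(N)]

# Empirical truncated second moment L(x) = E[X^2 1{X <= x}] vs c*log(x) = 2*log(x).
for x in (10.0, 100.0):
    emp = sum(v * v for v in xs if v <= x) / N
    print(x, emp, 2 * math.log(x))
```

The empirical $L(x)$ keeps growing with the truncation level $x$, matching $2\log x$, even though the (untruncated) sample second moment diverges as $N$ grows.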

If, however, you know that $(1)$ holds with $a_n = 1/\sqrt{n}$ and $m_n=0$, then $\mathbb{E}(X_1^2) = 1 < \infty$; this is also shown in Kallenberg's book.