Counterexamples concerning the central limit theorem


The central limit theorem states that if $X$ is a random variable with finite variance $\sigma^2$ and expected value $\mu$, and if $(X_n)$ is a sequence of independent random variables each with the same distribution as $X$, then

\begin{equation} Z_n = {\frac{{\overline{X}}_n-\mu}{\sqrt{\sigma^2/n}}}\ \rightsquigarrow N(0,1), \end{equation}

where ${\overline{X}}_{n} = \frac{1}{n} \sum_{i=1}^{n} X_i$ and $\rightsquigarrow$ means convergence in distribution, which in this case is equivalent to the pointwise convergence of the cdf of $Z_n$ to the cdf of a $N(0,1)$.
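As a sanity check, this convergence is easy to observe numerically. A minimal sketch in Python (the choice of exponential summands, for which $\mu=\sigma^2=1$, and all sample sizes are my own):

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

# Exponential(1) summands: mu = 1, sigma^2 = 1 (an arbitrary concrete choice).
n, reps = 200, 20_000
x = rng.exponential(1.0, size=(reps, n))
z = (x.mean(axis=1) - 1.0) / sqrt(1.0 / n)  # one draw of Z_n per row

# Compare the empirical CDF of Z_n with the N(0,1) CDF on a grid.
grid = np.linspace(-3.0, 3.0, 121)
ecdf = (z[:, None] <= grid).mean(axis=0)
Phi = np.array([0.5 * (1.0 + erf(g / sqrt(2.0))) for g in grid])
sup_dist = float(np.abs(ecdf - Phi).max())  # shrinks as n grows
```

The sup-distance between the empirical CDF of $Z_n$ and $\Phi$ is already small at $n=200$.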

Suppose moreover that $X_n$ is absolutely continuous for every $n$. Then $Z_n$ is absolutely continuous for every $n$. Are there examples of such sequences $(X_n)$ for which the pdf of $Z_n$ does not converge pointwise to the pdf of an $N(0,1)$, not even almost everywhere (with respect to the Lebesgue measure on $\mathbb R$)?

There are 3 answers below.

Accepted answer:

This counterexample is from the book Limit Distributions for Sums of Independent Random Variables by Gnedenko and Kolmogorov.

Let $X$ have density $f(x)=\begin{cases} \dfrac{1}{2|x|\log^2(|x|)} &\text{if } |x|< \frac 1e, \\ 0 &\text{if } |x|\geq \frac 1e. \end{cases}$

The authors argue that the density $f_n$ of $\sum_{i=1}^n X_i$ satisfies $\displaystyle f_n(x) > \frac{c_n}{|x \log^{n+1}(|x|)|}$ for some positive constant $c_n$ in a neighborhood of $0$. So the density of $Z_n$ (a rescaling of $f_n$) blows up at $0$ for every $n$, and pointwise convergence to the normal density fails there.
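As an aside, this density can be sampled by inverting its CDF: for $0<x\le \frac1e$, $F(x)=\frac12-\frac{1}{2\log x}$, so $F^{-1}(u)=e^{-1/(2u-1)}$ for $u\in(\frac12,1)$, with the negative branch by symmetry. A minimal sketch in Python (function and variable names are mine):

```python
import numpy as np

def sample_gk(n, rng):
    """Inverse-CDF samples from f(x) = 1/(2|x| log^2|x|) on |x| < 1/e.

    For 0 < x <= 1/e the CDF is F(x) = 1/2 - 1/(2 log x), hence
    F^{-1}(u) = exp(-1/(2u - 1)) for u in (1/2, 1); the negative
    branch follows by symmetry of the density.
    """
    u = rng.uniform(size=n)
    d = 2.0 * u - 1.0
    with np.errstate(divide="ignore"):  # d == 0 has probability zero
        return np.sign(d) * np.exp(-1.0 / np.abs(d))

rng = np.random.default_rng(0)
x = sample_gk(100_000, rng)
# Samples lie in (-1/e, 1/e) and are symmetric about 0; X has finite
# variance, so the CLT applies -- yet the densities of the normalized
# sums are unbounded near the origin.
```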


They prove the following theorem:

Theorem: Suppose $X$ has density $f$. If

  • for some $m\geq 1$, $f_m$ (the density of $\sum_{i=1}^m X_i$) is in $L^r(\mathbb R)$ for some $r\in (1,2]$,
  • $\int x^2 f(x) dx <\infty$ (i.e. $X$ has a second moment)

Then $\displaystyle \sup_{x\in \mathbb R} \left|\sigma \sqrt n f_n(\sigma \sqrt n x) - \frac{1}{\sqrt{2\pi}} e^{-x^2/2} \right| \xrightarrow[n\to \infty]{}0$

In Petrov's Sums of Independent Random Variables, the following theorem is stated:

Theorem: Let $(X_n)$ be a sequence of i.i.d. random variables with mean zero and variance $\sigma^2$, and let $f_n$ denote the density of $Z_n$ (when it exists).

Then $\displaystyle \sup_{x\in \mathbb R} \left| f_n(x) - \frac{1}{\sqrt{2\pi}} e^{-x^2/2} \right| \xrightarrow[n\to \infty]{}0$ if and only if $f_n$ is bounded for some $n$.
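As an illustration, $X\sim\operatorname{Uniform}(0,1)$ has a bounded density, so Petrov's criterion applies already with $n=1$, and the uniform convergence can be checked against the exact Irwin-Hall density of a sum of uniforms. A minimal sketch (helper names are mine):

```python
import numpy as np
from math import comb, factorial, sqrt, pi

def irwin_hall_pdf(x, n):
    """Exact density of the sum of n iid Uniform(0,1) variables (n >= 2)."""
    x = np.asarray(x, dtype=float)
    total = np.zeros_like(x)
    for k in range(n + 1):
        total += (-1) ** k * comb(n, k) * np.maximum(x - k, 0.0) ** (n - 1)
    return total / factorial(n - 1)

def zn_pdf(z, n):
    """Density of Z_n = (S_n - n/2) / sqrt(n/12) for uniform summands."""
    s = sqrt(n / 12.0)
    return s * irwin_hall_pdf(n / 2.0 + s * z, n)

z = np.linspace(-4.0, 4.0, 2001)
phi = np.exp(-z ** 2 / 2.0) / sqrt(2.0 * pi)
# sup |f_n - phi| over the grid shrinks as n grows, as the theorem predicts.
sup_err = {n: float(np.max(np.abs(zn_pdf(z, n) - phi))) for n in (2, 4, 12)}
```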

In Shiryaev's Probability 2, the following Local Central Limit Theorem is stated:

Theorem: Let $(X_n)$ be a sequence of i.i.d. random variables with mean zero and variance $\sigma^2$. If for some $r\geq 1$, $\int |\phi_{X_1}(t)|^r dt <\infty$, then $Z_n$ has a density $f_n$ such that $\displaystyle \sup_{x\in \mathbb R} \left| f_n(x) - \frac{1}{\sqrt{2\pi}} e^{-x^2/2} \right| \xrightarrow[n\to \infty]{}0$
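For example, $X_1\sim\operatorname{Uniform}(0,1)$ satisfies the hypothesis with $r=2$: here $|\phi_{X_1}(t)|=|\sin(t/2)/(t/2)|$, and by Plancherel $\int |\phi_{X_1}(t)|^2\,dt = 2\pi\int f(x)^2\,dx = 2\pi$. A minimal numerical check (grid bounds and step are my own choices):

```python
import numpy as np

# |phi(t)| for X ~ Uniform(0,1) is |sin(t/2) / (t/2)|.
t = np.linspace(-2000.0, 2000.0, 4_000_001)
# np.sinc(x) = sin(pi x) / (pi x), so sin(u)/u = np.sinc(u/pi) with u = t/2.
abs_phi = np.abs(np.sinc(t / (2.0 * np.pi)))

# Riemann sum of |phi|^2; Plancherel predicts 2*pi.
dt = t[1] - t[0]
integral = float(np.sum(abs_phi ** 2) * dt)
```

The truncation at $|t|=2000$ loses only about $4/2000$ of mass, so the sum lands close to $2\pi$.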


Regarding almost everywhere convergence of the densities, you should have a look at Rao's A Limit Theorem for Densities.

Another answer:

I can recommend

A Few Counter Examples Useful in Teaching Central Limit Theorems, www.jstor.org/stable/24590348,

As well as chapter 17 of the great book Counterexamples in Probability by Jordan Stoyanov.

Another answer:

Let $X\sim\mathcal N(\mu,\sigma^2)$, let $Y=1/X$, and let $(Y_k)$ be i.i.d. copies of $Y$. Then $$ \frac{\frac{1}{n}\sum_{k=1}^n Y_k-\mathsf E_\mathcal P Y}{\pi f_X(0)}\overset{d}{\to}\operatorname{Cauchy}(0,1), $$ where $\mathsf E_\mathcal P Y$ denotes the Cauchy principal value integral $$ \mathsf E_\mathcal P Y:=\mathcal P\int_{-\infty}^\infty y f_Y(y)\,\mathrm dy, $$ and $\operatorname{Cauchy}(0,1)$ denotes the standard Cauchy distribution. For this reason, we say that $Y$ lies in the domain of attraction of the Cauchy law, as opposed to the normal (Gaussian) law that appears in the CLT.
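The mechanism is the tail: $P(|Y|>y)=P(|X|<1/y)\approx 2f_X(0)/y$, a regularly varying tail of index $1$, which is exactly what the Cauchy domain of attraction requires. A minimal simulation check with $X\sim\mathcal N(0,1)$ (seed and threshold are arbitrary choices of mine):

```python
import numpy as np
from math import sqrt, pi

rng = np.random.default_rng(1)
x = rng.standard_normal(1_000_000)
y = 1.0 / x  # Y = 1/X is heavy-tailed even though X is Gaussian

# P(|Y| > y0) = P(|X| < 1/y0) ~ 2 f_X(0) / y0 for large y0,
# with f_X(0) = 1/sqrt(2 pi) for the standard normal.
y0 = 100.0
empirical = float(np.mean(np.abs(y) > y0))
predicted = 2.0 / sqrt(2.0 * pi) / y0
```

The empirical tail frequency matches the index-1 prediction to within Monte Carlo error.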