Convergence of the central limit theorem: pointwise vs uniform convergence


Let $X_1, X_2, \dots$ be a sequence of i.i.d. real random variables with finite moments (say mean $\mu$ and variance $\sigma^2$), and let $X=\frac{1}{N} \sum_{i=1}^{N} X_i$ denote their sample mean.

The central limit theorem tells us that $Z=\sqrt{N}(X-\mu)/\sigma$ converges in distribution to the normal law $\mathcal{N}(0,1)$. This means that the cumulative distribution function $F_N(z)=P_N(Z\leq z)$ converges to that of the normal, $\Phi(z)$: $$ \lim_{N\to \infty} F_N(z) = \Phi(z). $$ By the Berry–Esseen theorem, we know that this convergence is uniform in $z$.
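For intuition on this uniform rate, one can estimate $\sup_z |F_N(z)-\Phi(z)|$ numerically. Here is a minimal Monte Carlo sketch (my own, not from the post) for exponential summands with rate $1$, so that $\mu=\sigma=1$:

```python
import numpy as np
from scipy.stats import norm

# Monte Carlo estimate of sup_z |F_N(z) - Phi(z)| for X_i ~ Exp(1),
# where mu = sigma = 1, so Z = sqrt(N) * (Xbar - 1).
rng = np.random.default_rng(0)
z = np.linspace(-4.0, 4.0, 801)

sups = {}
for N in (10, 100, 1000):
    samples = rng.exponential(1.0, size=(200_000, N))
    Z = np.sqrt(N) * (samples.mean(axis=1) - 1.0)
    # Empirical CDF of Z evaluated on the grid z.
    F_N = np.searchsorted(np.sort(Z), z, side="right") / Z.size
    sups[N] = np.abs(F_N - norm.cdf(z)).max()
    print(N, sups[N])
```

The sup distance should shrink roughly like $1/\sqrt{N}$, consistent with the Berry–Esseen bound $C\rho/(\sigma^3\sqrt{N})$ (up to Monte Carlo noise of order a few times $10^{-3}$ at this sample size).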

My question concerns the convergence of the probability density itself, $P_N(Z=z)$ (and not of the cumulative distribution function). Assuming that $P_N$ converges, it converges to the Gaussian density. But how does it converge: only pointwise, or can it be uniform, and under what conditions?

For instance, if the $X_i$ are uniformly distributed on $[0,1]$, it is not hard to convince oneself that the convergence is uniform, since the difference between $P_N$ and the Gaussian is at most of order $1/N^{3/2}$. On the other hand, if the $X_i$ are exponential random variables, the convergence appears to be only pointwise, with a kind of Gibbs-like phenomenon: for every $N$ there is some $z$ at which the distance to the Gaussian remains of order $0.18$; see for example the figures here.
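The exponential case can be examined exactly rather than by simulation: the sum of $N$ i.i.d. $\mathrm{Exp}(1)$ variables is $\Gamma(N,1)$, so the density of $Z$ is available in closed form. A minimal sketch (my notation, not from the post) comparing it to the standard normal density over a fixed window:

```python
import numpy as np
from scipy.stats import gamma, norm

# For X_i ~ Exp(1): S = sum X_i ~ Gamma(N, 1) and Z = (S - N) / sqrt(N),
# so the exact density of Z is p_N(z) = sqrt(N) * f_Gamma(N + sqrt(N) * z; N).
z = np.linspace(-3.0, 8.0, 2201)

sup_abs = {}
for N in (4, 16, 64, 256):
    s = N + np.sqrt(N) * z          # gamma.pdf returns 0 for s < 0
    p_N = np.sqrt(N) * gamma.pdf(s, a=N)
    sup_abs[N] = np.abs(p_N - norm.pdf(z)).max()
    print(N, sup_abs[N])
```

Note that this measures the absolute distance $\sup_z |p_N(z)-\varphi(z)|$ on a fixed window; whether one sees a persistent discrepancy or a decaying one can depend on which distance (absolute vs. relative, fixed window vs. whole line) is plotted, since $p_N$ vanishes identically below $z=-\sqrt{N}$ where the Gaussian does not.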

What are the criteria that discriminate between these two modes of convergence here?