Convergence in Kolmogorov distance


A sequence of random variables $\left\{X_n\right\}_{n=1}^\infty$ converges to a random variable $X$ in Kolmogorov distance if $$\lim\limits_{n\to \infty}\left(\sup\limits_{x\in\mathbb{R}}|F_n(x)-G(x)|\right)=0,$$ where $F_n$ is the distribution function of $X_n$ and $G$ is the distribution function of $X$.

I know that convergence in Kolmogorov distance implies convergence in distribution (weak convergence, or convergence in law), and that convergence in distribution does not imply convergence in Kolmogorov distance. I have also read somewhere that convergence in distribution does imply convergence in Kolmogorov distance when the distribution function $G$ is absolutely continuous.

But I cannot find an example where convergence in distribution holds but convergence in Kolmogorov distance does not. Any help will be appreciated. Thanks.

This is not really about convergence of random variables, but rather about convergence of functions. Convergence in distribution is essentially pointwise convergence, whereas convergence in Kolmogorov distance is uniform convergence.

For the example, take $G(x)$ to be the Heaviside step function (the CDF of the constant random variable $X = 0$), and let $F_{n}(x)$ be the CDF of the normal distribution with mean zero and variance $1/n$, which is a smooth sigmoid curve that steepens as $n$ grows.

Then $F_{n}(x)$ converges pointwise to $G(x)$. This is clear for all $x \not = 0$. For $x = 0$, we can either define $G(0) = 1/2$ or just ignore it, since we don't actually need pointwise convergence for points where $G$ is discontinuous.
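The pointwise convergence is easy to check numerically. As a sketch (the helper `normal_cdf` below is my own, built from the standard identity $\Phi(z) = \tfrac{1}{2}(1 + \operatorname{erf}(z/\sqrt{2}))$):

```python
import math

def normal_cdf(x, var):
    """CDF of a normal distribution with mean 0 and variance `var`, at x."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0 * var)))

# For any fixed x != 0, F_n(x) approaches G(x) as n grows:
# at x = 0.1, G(x) = 1; at x = -0.1, G(x) = 0.
for x in (0.1, -0.1):
    for n in (1, 100, 10000):
        print(f"F_{n}({x}) = {normal_cdf(x, 1.0 / n):.6f}")
```

For $x = 0.1$ the values climb toward $1$, and for $x = -0.1$ they fall toward $0$, matching $G$ at every continuity point.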

In contrast, no matter how large we choose $n$,

$$ \sup_{x} |F_{n}(x) - G(x)| = 1/2$$

since for $x$ slightly greater than zero, $G(x) = 1$ yet $F_{n}(x) \approx 1/2$. Therefore $F_{n}(x)$ does not converge to $G(x)$ in Kolmogorov distance.
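This too can be sketched numerically: approximating the supremum on a fine grid around the jump at $0$ shows the Kolmogorov distance pinned at $1/2$ for every $n$ (the grid-based `kolmogorov_distance` below is an illustrative approximation of the sup, not an exact computation away from the grid points):

```python
import math

def normal_cdf(x, var):
    # CDF of a normal distribution with mean 0 and variance `var`
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0 * var)))

def heaviside(x):
    # CDF of the constant random variable X = 0: right-continuous step at 0
    return 1.0 if x >= 0 else 0.0

def kolmogorov_distance(n, half_width=1.0, points=20001):
    # Approximate sup_x |F_n(x) - G(x)| on a symmetric grid around 0;
    # the grid includes x = 0, where |F_n(0) - G(0)| = |1/2 - 1| = 1/2.
    xs = [-half_width + 2 * half_width * i / (points - 1) for i in range(points)]
    return max(abs(normal_cdf(x, 1.0 / n) - heaviside(x)) for x in xs)

for n in (1, 100, 10000):
    print(f"n = {n}: distance ~ {kolmogorov_distance(n):.6f}")  # stays at 0.5
```

Each $n$ prints a distance of $0.5$: the supremum never shrinks, even though every fixed $x \neq 0$ individually converges.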