Total variation distance of two centered normal distributions


I need to prove that the total variation distance between two normal random variables $X_t \sim \mathcal{N}(0,t)$ and $X_s \sim \mathcal{N}(0,s)$ converges to $0$ when $s \nearrow t$.

We know that $$\|X_t - X_s\|_{\text{TV}}= \sup_{\|f\|_\infty \leq 1} \mathbb{E} \left(f(X_t)-f(X_s)\right)$$ or, equivalently, that $$\|X_t - X_s\|_{\text{TV}} = \int_{-\infty}^\infty \bigg\vert\frac{e^{-\frac{x^2}{2t}}}{\sqrt{2\pi t}} - \frac{e^{-\frac{x^2}{2s}}}{\sqrt{2\pi s}} \bigg\vert\, dx.$$ I tried to work with the second identity but didn't get anything useful. For example, since $s < t$, the two densities cross at $\pm x(s,t)$ where $$x(s,t) = \sqrt{\frac{st}{t-s}\log\left(\frac{t}{s}\right)},$$ so $$\|X_t - X_s\|_{\text{TV}} = \int_{-\infty}^{-x(s,t)} \left(\frac{e^{-\frac{x^2}{2t}}}{\sqrt{2\pi t}} - \frac{e^{-\frac{x^2}{2s}}}{\sqrt{2\pi s}} \right)dx + \int_{-x(s,t)}^{x(s,t)}\left(\frac{e^{-\frac{x^2}{2s}}}{\sqrt{2\pi s}} - \frac{e^{-\frac{x^2}{2t}}}{\sqrt{2\pi t}} \right)dx + \int_{x(s,t)}^{\infty} \left(\frac{e^{-\frac{x^2}{2t}}}{\sqrt{2\pi t}} - \frac{e^{-\frac{x^2}{2s}}}{\sqrt{2\pi s}} \right)dx.$$ Then by symmetry we get $$\|X_t -X_s\|_{\text{TV}}= 2\left( \int_{x(s,t)}^{\infty} \left(\frac{e^{-\frac{x^2}{2t}}}{\sqrt{2\pi t}} - \frac{e^{-\frac{x^2}{2s}}}{\sqrt{2\pi s}} \right)dx + \int_0^{x(s,t)}\left(\frac{e^{-\frac{x^2}{2s}}}{\sqrt{2\pi s}} - \frac{e^{-\frac{x^2}{2t}}}{\sqrt{2\pi t}} \right)dx \right).$$ I want to use this and some convergence property of the integral to conclude, but I don't know how to do it.
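Not part of the proof, but a useful sanity check: since the $\mathcal{N}(0,s)$ density dominates on $|x|\le x(s,t)$, the two-integral expression collapses to $2\big(\operatorname{erf}\big(x^*/\sqrt{2s}\big)-\operatorname{erf}\big(x^*/\sqrt{2t}\big)\big)$ with $x^*=x(s,t)$. A minimal Python sketch (function names are mine) showing the distance vanish as $s\nearrow t$:

```python
import math

def tv_centered_normals(s, t):
    """TV distance (L1 norm of the density difference) between
    N(0, s) and N(0, t) for 0 < s < t, via the crossing point
    x* = sqrt(s*t/(t-s) * log(t/s))."""
    xstar = math.sqrt(s * t / (t - s) * math.log(t / s))
    # TV = 2 * (P_s(|X| <= x*) - P_t(|X| <= x*)), where
    # P_v(|X| <= a) = erf(a / sqrt(2 v)) for X ~ N(0, v)
    return 2 * (math.erf(xstar / math.sqrt(2 * s))
                - math.erf(xstar / math.sqrt(2 * t)))

for s in (0.5, 0.9, 0.99, 0.999):
    print(s, tv_centered_normals(s, 1.0))
```

The printed values shrink toward $0$ as $s \nearrow 1$, as the claim predicts.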



BEST ANSWER

Let $g_t(x)$ denote the PDF of $X_t$ at $x$; by homogeneity, $g_t(x)=\frac1{\sqrt t}\,g_1\!\left(\frac{x}{\sqrt t}\right)$, hence $\|X_t-X_s\|_{TV}=4d\left(\frac{s}t\right)$ where, for every $s$ in $(0,1)$, $$d(s)=\int_{x(s)}^\infty (g_1(x)-g_s(x))\,\mathrm dx,$$ and $x(s)$ is the point where the two densities coincide, that is, $x(s)$ is your $x(s,1)$. Equivalently, once again by homogeneity, $$d(s)=\int_{x(s)}^\infty g_1(x)\,\mathrm dx-\int_{x(s)/\sqrt{s}}^\infty g_1(x)\,\mathrm dx=\int_{x(s)}^{x(s)/\sqrt{s}}g_1(x)\,\mathrm dx.$$ The PDF $g_1$ is uniformly bounded by $g_1(0)$ and $x(s)\to1$ when $s\to1$, $s\lt1$, hence $$d(s)\leqslant g_1(0)\,x(s)\left(\frac1{\sqrt{s}}-1\right)\to0.$$
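A quick numerical check of this bound (a sketch; the helper names `Phi`, `d`, `bound` are mine, with the standard normal CDF written via `math.erf`):

```python
import math

def Phi(x):
    # standard normal CDF
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def x_cross(s):
    # crossing point x(s) = x(s, 1) for 0 < s < 1
    return math.sqrt(s / (1.0 - s) * math.log(1.0 / s))

def d(s):
    # d(s) = integral of g_1 over [x(s), x(s)/sqrt(s)]
    xs = x_cross(s)
    return Phi(xs / math.sqrt(s)) - Phi(xs)

def bound(s):
    # the upper bound g_1(0) * x(s) * (1/sqrt(s) - 1)
    g1_0 = 1.0 / math.sqrt(2.0 * math.pi)
    return g1_0 * x_cross(s) * (1.0 / math.sqrt(s) - 1.0)

for s in (0.5, 0.9, 0.99):
    print(s, d(s), bound(s))
```

For each $s$, `d(s)` stays below `bound(s)`, and both tend to $0$ as $s \to 1$.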


This can be obtained without any calculation by invoking Scheffé's Lemma: if a sequence of probability densities converges pointwise to a probability density, then it converges in $L^1$, hence in total variation. Indeed, when $t\to s$ the pdf of $X_t$ converges pointwise to that of $X_s$ (the normal pdf is continuous in the scale parameter), which yields the convergence in total variation.
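Scheffé's Lemma concerns the $L^1$ distance between the densities, which can be illustrated directly by crude numerical integration (a sketch; all function names are mine, and the trapezoidal rule over $[-20, 20]$ is an arbitrary choice sufficient for these variances):

```python
import math

def g(x, v):
    # centered normal pdf with variance v
    return math.exp(-x * x / (2.0 * v)) / math.sqrt(2.0 * math.pi * v)

def l1_distance(s, t, lo=-20.0, hi=20.0, n=20_000):
    # trapezoidal approximation of the L1 distance between the
    # N(0, s) and N(0, t) densities, i.e. the quantity that
    # Scheffe's Lemma asserts tends to 0
    h = (hi - lo) / n
    total = 0.0
    for i in range(n + 1):
        x = lo + i * h
        w = 0.5 if i in (0, n) else 1.0
        total += w * abs(g(x, t) - g(x, s))
    return total * h

for t in (2.0, 1.5, 1.1, 1.01):
    print(t, l1_distance(1.0, t))
```

The output decreases toward $0$ as $t \to 1$, matching the pointwise-convergence argument.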

Exact calculations are nice too, as they fit into the more challenging problem of determining the rate of this convergence, that is, of comparing normal distributions in total variation (a question also of importance in high dimension).