Gaussian distributions - a question about convergence


Let $\mu_n$ be the Gaussian distribution with mean $0$ and standard deviation $1/n$, and let $f$ be a function.

Is it true that if $\int_{\mathbb{R}} f \mu_n \, dx \rightarrow \int_{\mathbb{R}} f \delta_0 \, dx$, then $\mu_n \rightarrow \delta_0$? Why or why not? If so, under what conditions?

$\int_{\mathbb{R}} f \mu_n \, dx \rightarrow \int_{\mathbb{R}} f \delta_0 \, dx \;\Leftrightarrow\; \int_{\mathbb{R}} f (\mu_n - \delta_0) \, dx \rightarrow 0 \;\Leftrightarrow\; \mu_n \rightarrow \delta_0$?
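
A minimal numerical sketch of the convergence I am asking about (the test function $f(x) = \cos x$ and the grid parameters are my own illustrative choices, not part of the question): the integral $\int_{\mathbb{R}} f \mu_n \, dx$ should approach $f(0) = 1$ as $n$ grows.

    import numpy as np

    def gaussian_pdf(x, sigma):
        # density of the normal distribution with mean 0 and standard deviation sigma
        return np.exp(-x**2 / (2.0 * sigma**2)) / (sigma * np.sqrt(2.0 * np.pi))

    def f(x):
        # illustrative test function: bounded, continuous at 0, with f(0) = 1
        return np.cos(x)

    for n in (1, 2, 5, 10, 100):
        sigma = 1.0 / n
        # integrate f against mu_n by a Riemann sum on a grid covering +/- 10 sigma
        xs = np.linspace(-10.0 * sigma, 10.0 * sigma, 20001)
        dx = xs[1] - xs[0]
        value = np.sum(f(xs) * gaussian_pdf(xs, sigma)) * dx
        print(n, value)

For this choice of $f$ the printed values equal $e^{-1/(2n^2)}$ up to discretization error, and they tend to $f(0) = 1$, in line with the answer below.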

Thank you!


There is 1 best solution below.

BEST ANSWER

Let $\mu_n$ be the Gaussian distribution with mean $0$ and standard deviation $1/n$.

Then $\displaystyle\int_\mathbb R f(x) \mu_n(\mathrm dx) \to f(0)$ for every bounded measurable function $f$ continuous at $0$.
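
A sketch of why this holds (just the standard substitution and dominated convergence argument made explicit): writing out the density of $\mu_n$ and substituting $x = y/n$,
$$\int_{\mathbb{R}} f(x)\,\mu_n(\mathrm dx) = \int_{\mathbb{R}} f(x)\,\frac{n}{\sqrt{2\pi}}\,e^{-n^2x^2/2}\,\mathrm dx = \int_{\mathbb{R}} f\!\left(\tfrac{y}{n}\right)\frac{1}{\sqrt{2\pi}}\,e^{-y^2/2}\,\mathrm dy \longrightarrow f(0),$$
since $f(y/n) \to f(0)$ for every $y$ (continuity of $f$ at $0$) and $|f(y/n)| \le \sup|f|$ gives an integrable dominating bound against the standard Gaussian. In particular, this holds for every bounded continuous $f$, which is exactly the statement that $\mu_n \to \delta_0$ weakly.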