How to prove that white noise exists in $n$ dimensions


I am defining white noise $W$ over $\mathbb R^n$ as a probability measure on the space of tempered distributions with the following property:

$$D \sim W \implies \langle D, f \rangle \sim \mathcal N \left (0, \int f^2 \right)$$

for any test function $f : \mathbb R^n \rightarrow \mathbb R$.
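As a sanity check on this definition, here is my own discretized sketch (not part of the original question): on a grid with spacing $h$ in $n=2$ dimensions, white noise can be approximated by iid standard Gaussians scaled by $h^{-n/2}$, so the pairing with $f$ becomes $\sum_i g_i f(x_i) h^{n/2}$, whose variance is a Riemann sum for $\int f^2$.

```python
import math
import random

random.seed(0)

# Discretized white noise on a 2d grid with spacing h:
# <D, f> is approximated by sum_i g_i * f(x_i) * h^{n/2} with g_i iid N(0,1),
# whose variance is sum_i f(x_i)^2 * h^n, a Riemann sum for integral(f^2).
h = 0.15
grid = [(-3.0 + h * i, -3.0 + h * j) for i in range(41) for j in range(41)]
f = lambda x, y: math.exp(-(x * x + y * y))  # a concrete test function
fvals = [f(x, y) for (x, y) in grid]

trials = 2000
samples = [
    sum(random.gauss(0.0, 1.0) * fv * h for fv in fvals)  # h = h^{n/2}, n = 2
    for _ in range(trials)
]

var_est = sum(s * s for s in samples) / trials
print(var_est)  # close to integral(f^2) = pi/2 ~ 1.5708
```

The empirical variance of the pairing matches $\int f^2 = \int e^{-2(x^2+y^2)}\,dx\,dy = \pi/2$ up to Monte Carlo error.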

How do you prove that such a distribution exists for each $n$? For $n=1$, you can take the distributional derivative of the Wiener process, but I don't know what to do in higher dimensions.


Best answer:

The quickest construction is as follows.

With the convention $\mathbb{N}=\{0,1,2,\ldots\}$ and the notation $\langle \alpha\rangle=\sqrt{1+\alpha_1^2+\cdots+\alpha_n^2}$ for the Japanese bracket of a multiindex $\alpha$, let $s(\mathbb{N}^n)\subset l^2(\mathbb{N}^n)\subset\mathbb{R}^{\mathbb{N}^n}$ be the space of (multi)sequences $z=(z_\alpha)_{\alpha\in\mathbb{N}^n}$ of real numbers with fast decay, i.e., such that for all $k\in\mathbb{N}$, $$ \sup_{\alpha\in\mathbb{N}^n}\ \langle \alpha\rangle^k|z_\alpha|<\infty\ . $$ Let $s'(\mathbb{N}^n)$ be the dual space of multisequences of temperate (at most polynomial) growth, with the strong topology.
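For concreteness (my own illustrative examples, not from the original answer), a typical element of each space, and why the duality pairing between them converges absolutely:

```latex
z_\alpha = e^{-\langle\alpha\rangle} \in s(\mathbb{N}^n), \qquad
w_\alpha = \langle\alpha\rangle^{3} \in s'(\mathbb{N}^n), \qquad
\langle w, z\rangle
  = \sum_{\alpha\in\mathbb{N}^n} \langle\alpha\rangle^{3}\,
    e^{-\langle\alpha\rangle} < \infty .
```

The same estimate works for any temperate $w$ against any fast-decaying $z$: polynomial growth is always beaten by faster-than-polynomial decay.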

Using the Daniell-Kolmogorov Theorem, or Kakutani's Theorem for constructing infinite products of probability spaces, introduce the probability measure $\mu$ on $\mathbb{R}^{\mathbb{N}^n}$ which makes the components $z_\alpha$ iid standard $N(0,1)$ Gaussian variables. It is easy to see that $\mu$ gives the measurable subset $s'(\mathbb{N}^n)\subset \mathbb{R}^{\mathbb{N}^n}$ full measure. So you can define the restriction $\nu$ of $\mu$ to $s'(\mathbb{N}^n)$. Finally, the white noise measure on $\mathscr{S}'(\mathbb{R}^n)$ is just the push-forward $\Gamma_{\ast}\nu$ by the isomorphism $\Gamma:s'(\mathbb{N}^n)\rightarrow \mathscr{S}'(\mathbb{R}^n)$ given by the basis of Hermite functions.
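The "easy to see" full-measure step can be spelled out with a standard Gaussian tail bound plus Borel-Cantelli (a sketch of the usual argument, not spelled out in the original):

```latex
% Gaussian tail bound, summed over all multi-indices:
\mu\bigl(|z_\alpha| > \langle\alpha\rangle\bigr)
  \le 2\, e^{-\langle\alpha\rangle^2/2},
\qquad
\sum_{\alpha\in\mathbb{N}^n} 2\, e^{-\langle\alpha\rangle^2/2} < \infty ,
```

since the number of $\alpha$ with $\langle\alpha\rangle \in [k,k+1)$ grows only polynomially in $k$. By Borel-Cantelli, $\mu$-almost surely only finitely many $\alpha$ satisfy $|z_\alpha| > \langle\alpha\rangle$, hence $|z_\alpha| \le C(z)\,\langle\alpha\rangle$ for all $\alpha$, i.e., $z$ has temperate growth and lies in $s'(\mathbb{N}^n)$.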

More precisely, recall that the standard Hermite polynomials are given by $$ H_m(x)=(-1)^m e^{x^2} \left(\frac{d}{dx}\right)^m e^{-x^2} $$ so for example $H_0(x) = 1$, $H_1(x) = 2x$, $H_2(x) = 4x^2 -2$, $H_3(x) = 8x^3 - 12x$, etc.
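A quick numerical check of these values (my own sketch), using the three-term recurrence $H_{m+1}(x) = 2x\,H_m(x) - 2m\,H_{m-1}(x)$, which is equivalent to the Rodrigues-type formula above:

```python
def hermite(m, x):
    """Physicists' Hermite polynomial H_m(x) via the recurrence
    H_{m+1}(x) = 2*x*H_m(x) - 2*m*H_{m-1}(x)."""
    h_prev, h_curr = 1.0, 2.0 * x  # H_0 and H_1
    if m == 0:
        return h_prev
    for k in range(1, m):
        h_prev, h_curr = h_curr, 2.0 * x * h_curr - 2.0 * k * h_prev
    return h_curr

# Matches the examples above: H_2(x) = 4x^2 - 2, H_3(x) = 8x^3 - 12x
print(hermite(2, 1.0))  # 2.0
print(hermite(3, 1.0))  # -4.0
```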

Now the 1d Hermite function is $$ h_m(x) = \pi^{-1/4} \,\,2^{-m/2}\,\, (m!)^{-1/2}\,\, e^{-x^2/2}\,\, H_m(x), $$ while the analogue in $n$ dimensions is $$ h_{\alpha}(x_1,\ldots,x_n)=h_{\alpha_1}(x_1)\cdots h_{\alpha_n}(x_n)\ . $$ The map $\Gamma$ is defined by $$ z=(z_\alpha)_{\alpha\in\mathbb{N}^n}\longmapsto \Gamma(z)=\sum_{\alpha\in\mathbb{N}^n}z_\alpha h_\alpha\ . $$ Behind this is the fact that the $h_\alpha$ not only form an orthonormal basis for $L^2(\mathbb{R}^n)$, but also an unconditional Schauder basis for $\mathscr{S}(\mathbb{R}^n)$ as well as $\mathscr{S}'(\mathbb{R}^n)$.
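A numerical sketch of these facts in 1d (my own, under stated assumptions): the recurrence below for $h_m$ follows from the Hermite polynomial recurrence after normalization, and the final check is Parseval's identity $\sum_m \langle f, h_m\rangle^2 = \int f^2$, which is exactly why the pairing $\langle \Gamma(z), f\rangle = \sum_\alpha z_\alpha \langle f, h_\alpha\rangle$ has the variance $\int f^2$ required of white noise.

```python
import math

def hermite_fn(m, x):
    """1d Hermite function h_m(x) via the normalized recurrence
    h_{m+1} = x*sqrt(2/(m+1))*h_m - sqrt(m/(m+1))*h_{m-1}."""
    h_prev = math.pi ** -0.25 * math.exp(-x * x / 2.0)  # h_0
    if m == 0:
        return h_prev
    h_curr = math.sqrt(2.0) * x * h_prev                # h_1
    for k in range(1, m):
        h_prev, h_curr = h_curr, (x * math.sqrt(2.0 / (k + 1)) * h_curr
                                  - math.sqrt(k / (k + 1)) * h_prev)
    return h_curr

# Trapezoid rule on [-10, 10]; the integrands decay like exp(-x^2),
# so the truncation and quadrature errors are far below 1e-6.
def integrate(g, a=-10.0, b=10.0, n=4000):
    dx = (b - a) / n
    total = 0.5 * (g(a) + g(b)) + sum(g(a + i * dx) for i in range(1, n))
    return total * dx

# Orthonormality: <h_m, h_k> = delta_{mk}
print(integrate(lambda x: hermite_fn(2, x) ** 2))                # ~1
print(integrate(lambda x: hermite_fn(2, x) * hermite_fn(3, x)))  # ~0

# Parseval: sum_m <f, h_m>^2 converges to integral(f^2), the white
# noise variance of <D, f>; the Hermite coefficients of f decay fast.
f = lambda x: math.exp(-x * x)
coeffs = [integrate(lambda x, m=m: f(x) * hermite_fn(m, x)) for m in range(30)]
print(sum(c * c for c in coeffs))         # ~ sqrt(pi/2) ~ 1.2533
print(integrate(lambda x: f(x) ** 2))     # ~ sqrt(pi/2) ~ 1.2533
```

With $z_\alpha$ iid $N(0,1)$, the truncated pairing $\sum_m z_m \langle f, h_m\rangle$ is Gaussian with variance $\sum_m \langle f, h_m\rangle^2$, which the last two lines show converging to $\int f^2$.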

For a reference about this important isomorphism see, e.g., "Distributions and Their Hermite Expansions" by Barry Simon (although a better proof is in his more recent book series A Comprehensive Course in Analysis).