Space-time white noise grows at infinity?


I am reading some introductory texts on SPDEs, and I often encounter the phrase "the noise grows at infinity" (for instance in Gubinelli & Hofmanová's paper on $\Phi^4$, https://arxiv.org/abs/1804.11253, page 4). What does this mean?

Here "noise" refers to space-time white noise, which can formally be understood as a centred Gaussian process $\xi$ with covariance $\mathbb{E}[\xi(t,x)\xi(s,y)]=\delta(t-s)\,\delta^d(x-y)$. How am I supposed to make sense of this growth?
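One formal consequence of the covariance above is that, for a test function $f$, the pairing $\xi(f)$ is a centred Gaussian with variance $\|f\|_{L^2}^2$. Here is a quick numerical sanity check of that in one dimension; the grid, step size, sample count, and choice of $f$ are my own, and the discrete noise is only a stand-in for the genuine distribution:

```python
import numpy as np

rng = np.random.default_rng(0)
dx = 0.02
x = np.arange(-4, 4, dx)

# Discrete white noise: each cell is N(0, 1/dx), so that
# E[xi_i xi_j] = delta_ij / dx mimics the delta covariance.
def sample_pairing(f_vals, n_samples=20000):
    xi = rng.normal(0.0, 1.0 / np.sqrt(dx), size=(n_samples, len(x)))
    # xi(f) ~ sum_i f(x_i) xi_i dx  (Riemann-sum version of the pairing)
    return (xi * f_vals).sum(axis=1) * dx

f = np.exp(-x**2)                 # a smooth, localized test function
pairings = sample_pairing(f)

emp_var = pairings.var()
l2_norm_sq = (f**2).sum() * dx    # ~ ||f||_{L^2}^2 = sqrt(pi/2)
print(emp_var, l2_norm_sq)
```

The empirical variance of the pairings should land close to $\|f\|_{L^2}^2 = \sqrt{\pi/2} \approx 1.253$.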

Rigorously, $\xi$ is not a (classical) function but only a distribution, so pointwise evaluation does not even make sense. However, if I were to interpret it as a function, or perhaps look at a mollification of it, how could I see that it grows at infinity? And does this mean spatial infinity, or as time goes to infinity, or both?

I would be very happy with some heuristics or even a formal argument.

There are 2 answers below.

Answer 1

One needs a suitable way to quantify the growth of a tempered Schwartz distribution $T$. The definition of membership in $\mathscr{S}'$ somehow (in a rather mysterious way) says that $T$ grows at most polynomially at infinity, i.e., like $|x|^{\alpha}$ for some $\alpha$. The question is how to make this quantitative and, in particular, how to find the infimum of those $\alpha\in\mathbb{R}$ such that $T$ "grows at most like $|x|^{\alpha}$".

One way to arrive at a good definition of this growth exponent is to perform an inversion $x\mapsto x/|x|^2$, so that growth at infinity becomes a singularity at a point (the origin), and then use, for instance, the Steinmann scaling degree. See the article "On-shell extension of distributions" for a definition and its use in the extension of distributions.

Finally, once you have the definition, aim for a theorem of the form: 1) if $\alpha>\alpha_0$, then with probability one the random distribution $\xi$ grows at most like $|x|^{\alpha}$; and 2) if $\alpha<\alpha_0$, then the event that $\xi$ grows at most like $|x|^{\alpha}$ has probability zero. This is somewhat similar to the well-known result that Brownian motion is Hölder continuous of every order strictly below $\frac{1}{2}$.
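To make the phrase "grows at most like $|x|^{\alpha}$" concrete, one possible formulation, tested against translates of a fixed bump, is the following (the notation and this particular phrasing are my own, not taken from the cited article):

```latex
% "T grows at most like |x|^alpha": pairings with translated test
% functions are O(|x|^alpha) as the translation goes to infinity.
\[
  T \text{ grows at most like } |x|^{\alpha}
  \quad :\Longleftrightarrow \quad
  \forall \varphi \in \mathscr{D}(\mathbb{R}^d):\;
  \sup_{|x| \ge 1} |x|^{-\alpha}
  \bigl| \langle T, \varphi(\cdot - x) \rangle \bigr| < \infty .
\]
% The growth exponent is then the infimum of admissible alpha:
\[
  \alpha_0(T) := \inf \bigl\{ \alpha \in \mathbb{R} :
    T \text{ grows at most like } |x|^{\alpha} \bigr\} .
\]
```

The theorem sketched above would then identify $\alpha_0(\xi)$ almost surely, in analogy with the Hölder exponent of Brownian motion.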

Answer 2

A simple example: take a family of i.i.d. standard Gaussians $(g_n)_{n \ge 0}$. Each of them has variance $1$, but $\sup_{n} g_n =+\infty$ almost surely, because if you look far enough you will always see some $g_n$ bigger than all those you have already seen. This is due to the fact that the support of the Gaussian distribution is unbounded.

On the other hand, the growth is slow. Take $Q=\sum_{n} n^{-2}e^{\lambda g_n^2}$; then for $\lambda >0$ small enough you simply have $\mathbb{E}[Q]<\infty$, so $Q<\infty$ almost surely. Since $n^{-2}e^{\lambda g_n^2} \le Q$, it follows that $g_n \le \lambda^{-1/2} \log^{1/2}(n^2Q)$, which shows that $g_n$ does not grow faster than $\log^{1/2} n$.

White noise on large scales behaves much like a family of i.i.d. Gaussians, so you have similar estimates. Pathwise it is unbounded: if you test it against something localized around a given point $x$, e.g. against $y \mapsto f(y-x)$ for some smooth and localized $f$, and then send the point $x$ to infinity, you will see larger and larger values.
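A quick numerical illustration of this growth rate (the sample size and seed are my own choices): the running maximum of i.i.d. standard Gaussians keeps increasing without bound, but only logarithmically, staying close to the classical extreme-value rate $\sqrt{2\log n}$:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10**6
g = rng.normal(size=n)    # i.i.d. standard Gaussians g_0, ..., g_{n-1}

# Running maximum over the first m samples, compared with sqrt(2 log m).
for m in [10**2, 10**4, 10**6]:
    running_max = g[:m].max()
    predicted = np.sqrt(2 * np.log(m))
    print(m, running_max, predicted)
```

The running maximum is monotone in $m$ by construction, and for a million samples it typically lands in the vicinity of $\sqrt{2\log 10^6} \approx 5.26$, i.e. of order $\log^{1/2} n$, matching the bound above.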