Importance of $e^{-x^2}$ in Gaussian distribution


I don't know if such a question can be posed; anyway, I'll try.

Let's consider the Gaussian distribution, and in particular its exponential factor (in its simplest form), $e^{-x^2}$. Its derivation and its importance via the central limit theorem are perfectly clear to me, but conceptually I cannot connect its specific form to its widespread importance: that is, why is the exponential decrease in probability density quadratic, and not, say, $e^{-|x|}$? Why do so many natural phenomena seem to obey a quadratic exponential decrease?

PS: In case it helps, I found this discussion on meta.math. I liked, in Terry Tao's answer, the link between the exponentiated version of a quadratic form and the idea of a Taylor expansion.


There are 2 answers below.

Answer 1

I'm not sure if this actually helps, but it may be easier to grasp the intuition by going back to the simplest case, that of a fair coin. In particular, suppose we already know the peak probability, that of getting exactly $\frac{n}{2}$ heads in $n$ flips; we are only looking for the ratio of the probability of getting $\frac{n}{2} \pm l$ heads to that peak, which is:

$$\frac{\frac{n}{2}}{\frac{n}{2}+1}\cdot\frac{\frac{n}{2}-1}{\frac{n}{2}+2}\cdot\cdots\cdot\frac{\frac{n}{2}-l+1}{\frac{n}{2}+l}$$

Now suppose $l$ is much larger than $1$ and much smaller than $n$. In that case you can approximate the logarithm of this ratio by the integral:

$$\int_0^l\left(\log\left(\frac{n}{2}-t\right) - \log\left(\frac{n}{2}+t\right)\right)dt = \int_0^l\left(\log\left(1-\frac{2t}{n}\right) - \log\left(1+\frac{2t}{n}\right)\right)dt$$ $$\approx-\int_0^l\frac{4t}{n}dt = -\frac{2l^2}{n}$$

All cases can ultimately be reduced to weighted iterations of this one, so the limiting density always ends up proportional to $e^{-x^2}$ (after rescaling $x$).
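A quick numerical check of the integral approximation above (a sketch I'm adding for illustration; `log_prob_ratio` is my own name for the quantity in the product formula):

```python
import math

def log_prob_ratio(n, l):
    """Log of P(n/2 + l heads) / P(n/2 heads) in n fair coin flips,
    computed directly from the product of binomial-coefficient factors
    (n/2)/(n/2+1) * (n/2-1)/(n/2+2) * ... * (n/2-l+1)/(n/2+l)."""
    h = n // 2
    return sum(math.log((h - j + 1) / (h + j)) for j in range(1, l + 1))

n, l = 10_000, 100
exact = log_prob_ratio(n, l)
approx = -2 * l**2 / n  # the integral approximation -2l^2/n
print(exact, approx)    # both very close to -2
```

With $l = 100$ and $n = 10{,}000$ (so $1 \ll l \ll n$), the exact log-ratio and the approximation $-2l^2/n = -2$ agree to within a few parts in ten thousand.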

Answer 2

This is an answer to a very old question, but I think it is a fairly interesting one. More precisely, this is trying to explain why the exponent of the Gaussian is $2$ and not some other number.

Consider a random walk $S_n = \sum_{i=1}^n X_i$ where the $X_i$ are independent and identically distributed, say with mean zero and finite variance. It is a standard fact that $\mathrm{Var}(S_n) = n \mathrm{Var}(X_1)$, and so the scale of fluctuation of $S_n$ is $n^{1/2}$. I assert that you should think of the $2$ in the Gaussian exponent as $\frac{1}{1-1/2}$, with $1/2$ being the fluctuation exponent of $S_n$.
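The variance identity $\mathrm{Var}(S_n) = n \, \mathrm{Var}(X_1)$ can be verified exactly for a tiny example (a sketch I'm adding, using $X_i = \pm 1$ so that $\mathrm{Var}(X_1) = 1$, with a toy value $n = 6$):

```python
from itertools import product

# Exact check of Var(S_n) = n * Var(X_1) for a walk with X_i = +-1,
# so Var(X_1) = 1.  Enumerate all 2^n equally likely sign sequences.
n = 6
sums = [sum(seq) for seq in product((-1, 1), repeat=n)]
mean = sum(sums) / len(sums)                 # E[S_n] = 0
var = sum(s * s for s in sums) / len(sums)   # Var(S_n) = E[S_n^2]
print(mean, var)  # 0.0 6.0
```

The exhaustive enumeration gives $\mathrm{Var}(S_6) = 6$ exactly, matching $n \, \mathrm{Var}(X_1)$.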

I'll illustrate this by using the $n^{1/2}$ fluctuation scale to give a lower bound of $\exp(-cx^2)$, for some $c>0$, on the probability that $S_n > xn^{1/2}$. Split the walk into $k$ consecutive blocks of length $n/k$: if every block's increment exceeds $xn^{1/2}/k$, then $S_n > xn^{1/2}$, and the blocks are independent copies of $S_{n/k}$. Hence \begin{align} \mathbb P\left(S_n > xn^{1/2}\right) \geq \prod_{i=1}^k \mathbb P\left(S_{n/k} > x n^{1/2}/k\right) \end{align}

(I ignore issues like $n/k$ not being an integer just to communicate the idea more clearly.)

Set $k = a^{-2}x^2$ for some small constant $a$. Now,

\begin{align} \mathbb P\left(S_{n/k} > x n^{1/2}/k\right) &= \mathbb P\left(S_{n/k} > xk^{-1/2} (n/k)^{1/2}\right)\\ &= \mathbb P\left(S_{n/k} > a (n/k)^{1/2}\right)\\ &\geq \delta, \end{align}

the last inequality holding for some $\delta>0$ because we know $S_{n/k}$ fluctuates on scale $(n/k)^{1/2}$. So overall, \begin{align} \mathbb P\left(S_n > xn^{1/2}\right) \geq \delta^k = \delta^{a^{-2}x^2} = \exp(-cx^2), \end{align} with $c = a^{-2}\log(1/\delta)$.
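As a numerical sanity check of a lower bound of this shape (an illustration I'm adding, using the simple $\pm 1$ walk, whose tail probability can be computed exactly from the binomial distribution; the constant $c = 2.5$ is just one convenient choice for this range of $x$ and $n$, not anything canonical):

```python
from math import comb, exp, sqrt

def tail(n, x):
    """Exact P(S_n > x*sqrt(n)) for a simple random walk with +-1 steps:
    S_n = 2*H - n where H ~ Binomial(n, 1/2), so S_n > x*sqrt(n)
    is the event H > (n + x*sqrt(n))/2."""
    thresh = (n + x * sqrt(n)) / 2
    k0 = int(thresh) + 1  # smallest integer strictly above thresh
    return sum(comb(n, k) for k in range(k0, n + 1)) / 2**n

n = 100
for x in (1.0, 2.0, 3.0):
    p = tail(n, x)
    print(x, p, exp(-2.5 * x * x))  # p stays above exp(-c x^2) with c = 2.5
```

For these values of $x$, the exact tail indeed sits above $\exp(-2.5\,x^2)$, consistent with a Gaussian-type lower bound $\exp(-cx^2)$.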

You can get an upper bound too but you'll have to use some concentration inequalities where the feeling that the Gaussian is hiding just underneath is much stronger.

In essence, standard properties of variance say that $S_n$ fluctuates on scale $n^{1/2}$, and by using this property at smaller scales, along with additivity properties of $S_n$, you can derive the $2$ exponent, at least on the level of a lower bound. This is an indication that the same should be true for the scaling limit of $S_n$, i.e., the Gaussian.

You can check that if the fluctuation exponent were $\chi$ in place of $1/2$, you would get the exponent $1/(1-\chi)$.

I developed these ideas in an argument in the slightly different context of last passage percolation, which appears here: https://arxiv.org/abs/2007.03594