Why is the normal distribution defined exactly the way it is defined?


Why is the standard deviation $\sigma$ defined in such a way that, in the exponent of the normal distribution,

$$f(x)=\frac{1}{\sigma\sqrt{2\pi}}\,e^{-\left(\frac{x-\mu}{\sigma\sqrt{2}}\right)^{2}},$$

$\sigma$ needs to be scaled up by an additional factor of $\sqrt{2}$?

Because intuitively, I would define the normal distribution simply as the normalised Gaussian integral:

$$\int_{-\infty}^{+\infty}e^{-x^{2}}\,dx=\sqrt{\pi}\quad\Rightarrow\quad f(x):=\frac{1}{\sqrt{\pi}}\,e^{-x^{2}}$$
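A quick numerical check shows what goes wrong with this "intuitive" density: it does integrate to one, but its standard deviation is $1/\sqrt{2}$ rather than $1$, which is exactly why the conventional form rescales by $\sqrt{2}$. This is a minimal sketch using a stdlib midpoint rule; the integration bounds and step count are arbitrary choices.

```python
import math

def integrate(g, a=-10.0, b=10.0, n=200_000):
    """Midpoint-rule integral of g over [a, b]; the tails of a
    Gaussian beyond |x| = 10 are negligible at this tolerance."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# The question's candidate density: e^(-x^2) / sqrt(pi)
f = lambda x: math.exp(-x * x) / math.sqrt(math.pi)

total = integrate(f)                          # total probability
variance = integrate(lambda x: x * x * f(x))  # second moment about 0

print(total)     # ≈ 1.0: the density is properly normalised
print(variance)  # ≈ 0.5: so its standard deviation is 1/sqrt(2), not 1
```

So with this definition the parameter playing the role of "scale" would not be the standard deviation, and the $\sqrt{2}$ in the conventional exponent corrects for precisely this factor.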


There are 2 best solutions below


The form $\frac{x^2}{2}$ is rather common in mathematics, often arising from the fact that it is the integral of $x$. For example, the formula for kinetic energy is $m\frac{v^2}{2}$, the distance fallen in time $t$ is $g\frac{t^2}{2}$, and the Taylor expansion of $\exp(x)$ is $1+x+\frac{x^2}{2}+\dots$. So we shouldn't be afraid when we see

$$\frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2\right).$$

The expression $\frac{x-\mu}{\sigma}$ is $x$ after being 'normalised': the mean is subtracted off and the result is scaled by the standard deviation. Squaring that and dividing by two is then a very standard thing to do.
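To make the connection explicit, substituting the normalised variable $z=\frac{x-\mu}{\sigma}$ (so $dx=\sigma\,dz$) reduces the general density to the standard normal, whose exponent is exactly $-\frac{z^2}{2}$:

$$\int_{-\infty}^{+\infty}\frac{1}{\sqrt{2\pi}\,\sigma}\,e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^{2}}\,dx=\int_{-\infty}^{+\infty}\frac{1}{\sqrt{2\pi}}\,e^{-z^{2}/2}\,dz=1.$$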


Because the normal distribution is a probability density function, its integral from $-\infty$ to $+\infty$ has to equal one. The prefactor $\frac{1}{\sigma\sqrt{2\pi}}$ is exactly what makes that normalisation work out, and the $\sqrt{2}$ in the exponent is what makes $\sigma$ the actual standard deviation of the distribution:

$$\int_{-\infty}^{+\infty}\frac{1}{\sigma\sqrt{2\pi}}\,e^{-\left(\frac{x-\mu}{\sigma\sqrt{2}}\right)^{2}}\,dx = 1$$
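Both claims can be checked numerically for the full density. This is a sketch under assumed example parameters ($\mu=1.5$, $\sigma=2$, arbitrary choices), again using a simple stdlib midpoint rule rather than any particular library.

```python
import math

mu, sigma = 1.5, 2.0  # arbitrary illustrative parameters

def pdf(x):
    """Normal density as written above, with the sqrt(2) in the exponent."""
    z = (x - mu) / (sigma * math.sqrt(2))
    return math.exp(-z * z) / (sigma * math.sqrt(2 * math.pi))

def integrate(g, a, b, n=400_000):
    """Midpoint-rule integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# Integrate over +/- 12 standard deviations; the tails beyond are negligible.
a, b = mu - 12 * sigma, mu + 12 * sigma
total = integrate(pdf, a, b)
variance = integrate(lambda x: (x - mu) ** 2 * pdf(x), a, b)

print(total)                 # ≈ 1.0: the density integrates to one
print(math.sqrt(variance))   # ≈ 2.0: the standard deviation really is sigma
```

Dropping the $\sqrt{2}$ from the exponent would still give a normalisable density (with prefactor $\frac{1}{\sigma\sqrt{\pi}}$), but its standard deviation would be $\sigma/\sqrt{2}$, so $\sigma$ would lose its meaning.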