Understanding why variance of the standard normal distribution equals one intuitively


Can anyone explain to me why the variance of the standard normal distribution is 1? I am trying to understand the mechanism behind standardising a random variable. I know that subtracting the mean shifts the graph so that it is centred at the origin, but I don't know why dividing by the SD makes the variable have SD = 1 as well.


The variance of the standard normal distribution is $1$ by definition.

Concerning standardizing: if $X$ has a distribution with standard deviation $\sigma_X\neq0$, or equivalently with variance $\sigma_X^2$, then for every constant $c$ (in particular $c=\mathbb EX$) we have $\mathsf{Var}\left(\frac{X-c}{\sigma_X}\right)=1$ according to the rule:$$\mathsf{Var}(aY+b)=a^2\mathsf{Var}Y$$

Can you deduce this rule yourself?
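In case you want to check your deduction, the rule follows directly from the definition of variance (the constant $b$ cancels inside the expectation, and the factor $a$ comes out squared):

$$\mathsf{Var}(aY+b)=\mathbb E\left[\left(aY+b-\mathbb E(aY+b)\right)^2\right]=\mathbb E\left[\left(aY-a\,\mathbb EY\right)^2\right]=a^2\,\mathbb E\left[\left(Y-\mathbb EY\right)^2\right]=a^2\,\mathsf{Var}Y$$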

Applying it with $Y=X$, $a=\sigma_X^{-1}$ and $b=-\sigma_X^{-1}c$ we get: $$\mathsf{Var}\left(\frac{X-c}{\sigma_X}\right)=\mathsf{Var}(\sigma_X^{-1}X+(-\sigma_X^{-1}c))=\sigma_X^{-2}\mathsf{Var}X=\sigma_X^{-2}\sigma_X^{2}=1$$

This means that we can write $X=\sigma_XU+\mu_X$ where $U:=\frac{X-\mu_X}{\sigma_X}$ has mean $0$ and variance $1$.
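You can also see this numerically. Here is a minimal sketch (the exponential distribution and its parameters are arbitrary choices, just to show the rule does not depend on normality): standardizing any sample by its mean and SD yields mean $0$ and SD $1$.

```python
import numpy as np

# Draw samples from an arbitrary (non-normal) distribution;
# the exponential with scale 3 has mean 3 and SD 3.
rng = np.random.default_rng(0)
x = rng.exponential(scale=3.0, size=100_000)

# Standardize: subtract the sample mean, divide by the sample SD.
z = (x - x.mean()) / x.std()

# Up to floating-point error, z has mean 0 and SD 1.
print(z.mean(), z.std())
```

Note that this holds exactly (up to rounding) because we divide by the same SD estimate we standardize with; it is the sample version of $\mathsf{Var}(aY+b)=a^2\mathsf{Var}Y$.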


Let $X\sim N(\mu,\sigma^2)$ and $Z=\frac{X-\mu}{\sigma}$; then $Z\sim N(0,1)$, since a linear transformation of a normal variable is again normal, and: $$\mathbb E\left(\frac{X-\mu}{\sigma}\right)=\frac{1}{\sigma}\cdot \mathbb E(X-\mu)=\frac1{\sigma}\cdot \mathbb E(X)-\frac{\mu}{\sigma}=0;\\ \mathsf{Var}\left(\frac{X-\mu}{\sigma}\right)=\frac{1}{\sigma^2}\cdot \mathsf{Var}(X-\mu)=\frac1{\sigma^2}\cdot \mathsf{Var}(X)=1.$$