How to prove Gaussian integral in normal distribution can be scaled to a standard curve?


If I want to solve the Gaussian integral for normal distribution problems, I only need to scale it to the standard normal curve and consult a table. I want to know why this is valid (the fact that it can be scaled to a standard curve), i.e., why the integral depends only on how many standard deviations I am from the mean.

My textbook just says "using a result of calculus this can be proven", but I can't find the proof anywhere.

Thanks.

On BEST ANSWER

Details depend on how the general normal is defined. Let us suppose that we define it as follows. We say that $X$ has normal distribution with parameters $\mu$ and $\sigma$ if $X$ has probability density function $$f_X(x)=\frac{1}{\sqrt{2\pi}\,\sigma}e^{-(x-\mu)^2/(2\sigma^2)}.$$

We want to find $\Pr(X\le x)$. This is given by $$\Pr(X\le x)=\int_{-\infty}^x \frac{1}{\sqrt{2\pi}\,\sigma}e^{-(t-\mu)^2/(2\sigma^2)}\,dt.\tag{1}$$

Make the substitution $z=\frac{t-\mu}{\sigma}$. Then $dt=\sigma\,dz$. As $t$ ranges from $-\infty$ to $x$, $z$ ranges from $-\infty$ to $(x-\mu)/\sigma$. Substituting in the integral (1), we get $$\Pr(X\le x)=\int_{-\infty}^{(x-\mu)/\sigma} \frac{1}{\sqrt{2\pi}}e^{-z^2/2}\,dz.$$

The last integral is exactly $\Phi\!\left(\frac{x-\mu}{\sigma}\right)$, where $\Phi$ is the cumulative distribution function of the standard normal. So the probability depends on $x$ only through $(x-\mu)/\sigma$, the number of standard deviations $x$ lies from the mean, which is why a single table suffices.
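You can check the substitution numerically: integrate the $N(\mu,\sigma)$ density directly, and compare with the standard normal CDF evaluated at $(x-\mu)/\sigma$ (computed via the error function, since $\Phi(z)=\tfrac12(1+\operatorname{erf}(z/\sqrt2))$). A rough sketch; the helper names `phi` and `normal_cdf_numeric` are my own, not from any library:

```python
import math

def phi(z):
    # Standard normal CDF via the error function:
    # Phi(z) = (1 + erf(z / sqrt(2))) / 2
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def normal_cdf_numeric(x, mu, sigma, n=200000):
    # Trapezoidal integration of the N(mu, sigma^2) density
    # from mu - 20*sigma (effectively -infinity) up to x.
    a = mu - 20.0 * sigma
    h = (x - a) / n
    total = 0.0
    for i in range(n + 1):
        t = a + i * h
        f = math.exp(-(t - mu) ** 2 / (2.0 * sigma ** 2)) / (math.sqrt(2.0 * math.pi) * sigma)
        total += (0.5 if i in (0, n) else 1.0) * f
    return total * h

mu, sigma, x = 3.0, 2.0, 5.5
direct = normal_cdf_numeric(x, mu, sigma)   # integrate the general density
scaled = phi((x - mu) / sigma)              # "table lookup" at (x - mu)/sigma
print(direct, scaled)                       # the two values agree
```

The two numbers agree to many decimal places, which is the change-of-variables identity in action.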