Why does standardizing normal distributions preserve probabilities?


Standardizing a normal distribution to the standard normal distribution is achieved by creating from a random variable $X$ a new random variable $X' = \frac{X - \mu}{\sigma}$, where $\mu$ is the mean of the distribution of $X$ and $\sigma$ is its standard deviation. I understand why this yields the standard normal distribution. What I do not understand is why $X'$ has the same probabilities as $X$, or equivalently, why every area under the curve of $X'$ over a (correspondingly transformed) interval on the x-axis stays the same as the area under the curve of $X$.
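In symbols, the property I am asking about is this identity (for any interval $[a, b]$ on the $x$-axis of $X$):

$$P(a \le X \le b) \;=\; P\!\left(\frac{a-\mu}{\sigma} \le X' \le \frac{b-\mu}{\sigma}\right),$$

i.e. the area under the density of $X$ between $a$ and $b$ equals the area under the standard normal density between the transformed endpoints.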

Sure, sliding the distribution around by subtracting $\mu$ does not alter the shape of the curve, so all areas under the curve keep their original sizes. But why is that also true when stretching (or compressing) the curve by dividing by $\sigma$?
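To make the claim concrete, here is a quick numerical sanity check (the parameters $\mu = 5$, $\sigma = 2$ and the interval $[4, 8]$ are arbitrary example values, not from any particular problem). It computes the probability of an interval under $X \sim N(\mu, \sigma^2)$ and under the standard normal after transforming the endpoints, using only the standard library:

```python
from math import erf, sqrt

def normal_cdf(x, mu=0.0, sigma=1.0):
    """CDF of N(mu, sigma^2), expressed via the error function."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

mu, sigma = 5.0, 2.0   # arbitrary example parameters
a, b = 4.0, 8.0        # an arbitrary interval on the x-axis of X

# Area under the density of X ~ N(mu, sigma^2) between a and b
p_x = normal_cdf(b, mu, sigma) - normal_cdf(a, mu, sigma)

# Area under the standard normal density between the transformed endpoints
p_xp = normal_cdf((b - mu) / sigma) - normal_cdf((a - mu) / sigma)

print(p_x, p_xp)  # the two areas agree
```

The two printed probabilities coincide (up to floating-point error) for any choice of $\mu$, $\sigma > 0$, $a$ and $b$, which is exactly the fact I would like to understand.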