Calculating percentile value from mean and standard deviation of a normal distribution

I have to write some code to calculate the 95th percentile of a dataset that is normally distributed. I can easily calculate the mean and the standard deviation, which define the distribution. From those two values alone, is it possible to determine the x value of the 95th percentile? If so, could someone help me with the mathematical formula, which I will then convert into code?

There are 2 best solutions below

For a normal distribution, the mean and the standard deviation fully determine every percentile. Some common reference points:

mean = 50th percentile

mean + sd ≈ 84th percentile

mean + 2sd ≈ 97.7th percentile

For the 95th percentile specifically, use mean + 1.6449·sd, where 1.6449 is the standard-normal z-score corresponding to a cumulative probability of 0.95.

Hope this helps!

P.S.: You can read about this in more detail here: https://en.wikipedia.org/wiki/68%E2%80%9395%E2%80%9399.7_rule
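In code, the mean + z·sd formula above can be sketched with Python's standard library, which provides `statistics.NormalDist` and its quantile function `inv_cdf` (Python 3.8+). The mean and standard deviation here are placeholder values standing in for whatever you compute from your data:

```python
from statistics import NormalDist

# Placeholder values: substitute the mean and standard deviation
# computed from your own dataset.
mu, sigma = 100.0, 15.0

# inv_cdf is the quantile function: it returns the x value below which
# the given fraction (here 0.95) of the distribution lies.
p95 = NormalDist(mu=mu, sigma=sigma).inv_cdf(0.95)

# Equivalent closed-form expression: mean + z * sd, with z ≈ 1.6449
p95_manual = mu + 1.6448536269514722 * sigma

print(p95)         # same value as p95_manual, up to floating-point error
print(p95_manual)
```

If you cannot assume Python 3.8+, the same number falls out of mean + 1.6449·sd with the z-score hard-coded, as shown in `p95_manual`.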

Integrating the PDF (probability density function) from negative infinity to a value x gives the CDF, i.e. the fraction of the distribution that lies below x — the percentile rank of x. To go the other way, from a target percentile to its x value, you invert that relation (the quantile function).

Here is the PDF of a normal distribution with mean $\mu$ and standard deviation $\sigma$:

$$\frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(x - \mu)^2}{2 \sigma^2}}$$

So to find the percentile rank of 3 sigma on a standard normal distribution ($\sigma = 1$, $\mu = 0$) you can evaluate the following integral: $$\int_{-\infty}^{3} \frac{1}{\sqrt{2\pi}}\, e^{-\frac{x^2}{2}}\, dx = 0.99865\ldots$$

More information can be found on the Wikipedia page.
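The integral above has no elementary antiderivative, but it can be expressed through the error function, which Python's standard `math` module provides. A minimal sketch of the normal CDF, with the function name `normal_cdf` being my own choice:

```python
from math import erf, sqrt

def normal_cdf(x, mu=0.0, sigma=1.0):
    """Integral of the normal PDF from -infinity to x,
    expressed via the error function:
    CDF(x) = 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2)))).
    """
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

# The 3-sigma case worked out in the integral above:
print(normal_cdf(3.0))  # ≈ 0.99865
```

This gives the percentile rank of a value; for the original question (percentile rank → x value) the inverse of this function is what's needed, as in the other answer.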