While searching for a way to calculate variance for risk management, I found the following formula (n1, n2, n3 are data points):
avg = (n1+n2+n3)/3
variance = ((n1 - avg)^2 + (n2 - avg)^2 + (n3 - avg)^2) / (3-1)
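The two formulas above can be sketched in Python (n1, n2, n3 are placeholder data points I made up for illustration):

```python
import statistics

n1, n2, n3 = 4.0, 7.0, 10.0  # placeholder data points

# Mean and sample variance, exactly as in the formulas above
# (dividing by 3 - 1 makes it the *sample* variance).
avg = (n1 + n2 + n3) / 3
variance = ((n1 - avg) ** 2 + (n2 - avg) ** 2 + (n3 - avg) ** 2) / (3 - 1)

# Cross-check against the standard library's sample variance.
assert abs(variance - statistics.variance([n1, n2, n3])) < 1e-12
print(variance)  # 9.0 for these placeholder points
```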
Next, to calculate the risk that occurs 5% of the time, you take the standard deviation (the square root of the variance) and compute:
stddev = sqrt(variance)
risk5 = 1.65 * stddev
And for the risk that occurs 1% of the time:
risk1 = 2.33 * stddev
What interests me is how these two values (1.65, 2.33) were calculated. What is the general formula to find the risk that occurs p% of the time?
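My guess is that these constants are one-tailed quantiles of the standard normal distribution at the 95% and 99% levels, which a quick check with Python's stdlib seems to confirm (z_score is a helper name I made up; inv_cdf is the inverse of the normal CDF):

```python
from statistics import NormalDist

def z_score(p_percent: float) -> float:
    """One-tailed standard-normal quantile for a risk level of p percent.

    p_percent=5 asks for the point the distribution exceeds only 5%
    of the time, i.e. the inverse normal CDF at 1 - 0.05.
    """
    return NormalDist().inv_cdf(1 - p_percent / 100)

print(round(z_score(5), 2))  # ~1.64, often quoted as 1.65
print(round(z_score(1), 2))  # ~2.33
```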