Square-root-of-time rule


I tried to test the square-root-of-time rule for quantiles of a normal distribution.

So I created two variables with the statistical programming language R:

a<-rnorm(100,mean=2,sd=1)

b<-rnorm(100,mean=2,sd=1)

According to the square-root-of-time rule, the following should hold:

quantile(a+b, 0.1) = quantile(c(a,b), 0.1) * sqrt(2)

But this is not the case. Why not?

Best answer

Here's why what you did didn't work. Let $X, Y$ be two independent normally distributed random variables with mean 2 and variance 1 (this is what you sampled from with your R code). Then $X+Y$ is normally distributed with mean 4 and variance 2, which implies that $$ \frac{X+Y - 4}{\sqrt{2}} $$ is a standard normal random variable, so it follows that $$ P(X+Y \leq \sqrt{2}\,z_{0.1} + 4) = 0.1, $$ where $z_{0.1}$ is the 0.1 quantile of the standard normal distribution. On the other hand, we have $$ P(X \leq z_{0.1} + 2) = 0.1. $$

So the quantiles are not scalings of each other by $\sqrt{2}$: the standard deviation scales the quantile, but the mean shifts it additively, and your variables have nonzero mean. The $\sqrt{2}$ scaling holds only for the *centered* quantiles, i.e. $q_{X+Y}(0.1) - 4 = \sqrt{2}\,\bigl(q_X(0.1) - 2\bigr)$.
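A quick numerical sketch of the argument, using exact normal quantiles rather than random samples (written in Python with `scipy.stats.norm.ppf` standing in for R's `qnorm`; the means, variances, and the 0.1 level are taken from the question):

```python
from scipy.stats import norm

p = 0.1
z = norm.ppf(p)  # z_{0.1}, the 0.1 quantile of the standard normal

# X ~ N(2, 1), so its 0.1 quantile is z + 2
q_x = norm.ppf(p, loc=2, scale=1)

# X + Y ~ N(4, 2), so its 0.1 quantile is sqrt(2)*z + 4
q_sum = norm.ppf(p, loc=4, scale=2 ** 0.5)

# Naive sqrt(2) scaling of the raw quantiles fails (nonzero mean):
naive_gap = abs(q_sum - 2 ** 0.5 * q_x)

# After subtracting the means, the sqrt(2) scaling holds exactly:
centered_gap = abs((q_sum - 4) - 2 ** 0.5 * (q_x - 2))

print(naive_gap)     # clearly nonzero
print(centered_gap)  # numerically zero
```

Sampling with `rnorm` as in the question only adds simulation noise on top of this; the mismatch is already present in the exact quantiles.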