Simple question on random variables and statistics


Let $X_1$ and $X_2$ be two random variables, each with a standard deviation of $5$. The observed values are $X_1 = 20$ and $X_2 = 30$. If the random variables were normally distributed, what is the probability of getting such a discrepancy between them?

How can I solve this? Hints or references to simple resources would be appreciated, so that I don't need to read some 500-page volume on statistics.

Best answer:

This may be relevant. If the random variables $X_1$ and $X_2$ are independent and normal, with variances $\sigma_1^2$ and $\sigma_2^2$, then $X_1-X_2$ has a normal distribution, with mean $E(X_1)-E(X_2)$ and variance $\sigma^2=\sigma_1^2+\sigma_2^2$. (Note that the random variables are not identically $20$ and $30$, respectively: they vary, and judging from the estimated standard deviation, vary by quite a bit.)
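As a quick sanity check of that rule, here is a minimal simulation sketch in Python (using NumPy; the means $20$, $30$ and standard deviation $5$ are the question's values, used purely for illustration):

```python
import numpy as np

# Simulation check: if X1 and X2 are independent normals, then
# Var(X1 - X2) = sigma1^2 + sigma2^2 and E(X1 - X2) = E(X1) - E(X2).
rng = np.random.default_rng(seed=0)
n = 1_000_000
x1 = rng.normal(loc=20, scale=5, size=n)
x2 = rng.normal(loc=30, scale=5, size=n)
diff = x1 - x2

print(diff.mean())  # close to 20 - 30 = -10
print(diff.var())   # close to 25 + 25 = 50
```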

In our case, we assume the means are the same, that is, that each measurement is unbiased. We also assume that $\sigma_i^2=25$. So the mean of $X_1-X_2$ is $0$, and $\sigma^2=50$. Unfortunately, neither assumption can be justified. But let's go on.

We want to know the probability that a normal random variable with mean $0$ and standard deviation $\sqrt{50}$ has absolute value greater than $10$. This is the probability that a standard normal has absolute value greater than $\frac{10}{\sqrt{50}}=\sqrt{2}$. A table of the standard normal distribution gives this probability as approximately $0.157$. But one should not take the calculation as anything more than a ballpark estimate, since the standard deviation of $5$ is unlikely to be more than an educated guess.
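For readers who would rather compute than consult a table, a short sketch of the same two-sided tail calculation in Python (assuming SciPy is installed):

```python
from math import sqrt
from scipy.stats import norm

# Under the assumptions above, X1 - X2 is normal with mean 0 and variance 50.
sigma = sqrt(50)       # standard deviation of X1 - X2
z = 10 / sigma         # observed discrepancy in standard units, = sqrt(2)
p = 2 * norm.sf(z)     # two-sided tail probability, P(|Z| > sqrt(2))
print(round(z, 3), round(p, 3))  # 1.414 0.157
```

The factor of $2$ accounts for both tails, since a discrepancy of $10$ in either direction counts.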