I am working on the following problem, and the explanation was not clear to me, so I am seeking help. Here is the problem:
A fire occurs with probability 0.01. The damage Y, given that a fire has occurred, is uniformly distributed between 0 and 1 million. Find the standard deviation of Y. (Let X be the indicator of whether a fire occurs.)
I understand the fact that the formula
$$Var[Y]=Var[E[Y|X]]+E[Var[Y|X]]$$
comes in handy in this situation.
However, the book I am looking at gives me a calculation
$$\frac{10^{12}}{4}(.01)(.99)+\frac{10^{12}}{12}(.01) $$
which puzzles me.
I can see that the right term comes from the variance of a uniform distribution, but I'm not quite clear how the left term came about.
My understanding is that the mean of a uniform distribution is simply the midpoint of its interval, so we should get $\frac{10^6}{2}$, which for some reason is squared in this explanation.
So, I would like two things to be answered:

1) Where did the left term come from?

2) I used uniform $[0,1]$ instead of uniform $[0,10^6]$. If I multiply my result by $1{,}000{,}000$, will it still work?
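For concreteness, question 2 can be sanity-checked numerically. This is just a sketch (assuming Python with NumPy; not from the book) that simulates both the $[0,10^6]$ damage and a unit-scale $[0,1]$ version:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Indicator of a fire: True with probability 0.01
fire = rng.random(n) < 0.01

# Damage: 0 if no fire, Uniform(0, 10^6) given a fire
y = np.where(fire, rng.uniform(0.0, 1e6, size=n), 0.0)

# Unit-scale version: Uniform(0, 1) given a fire
y_unit = np.where(fire, rng.uniform(0.0, 1.0, size=n), 0.0)

# The standard deviation (not the variance) of the unit-scale
# version scales by 10^6
print(y.std())             # sample SD of the damage
print(y_unit.std() * 1e6)  # should land near the same value
print(y_unit.var() * 1e6)  # far from y.var(): scaling the variance by 10^6 fails
```

So multiplying the unit-scale *standard deviation* by $10^6$ works, because SD scales linearly; the *variance* would have to be scaled by $10^{12}$.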
You could simply use the fact that $E[Y] = E[E[Y|X]]$ and $E[Y^2] = E[E[Y^2|X]]$.
$$ E[Y|X] = \begin{cases} \frac{10^6}{2} & X = 1 \\ 0 & X = 0 \end{cases}$$
$$ E[Y^2|X] = \begin{cases} \frac{10^{12}}{3} & X = 1 \\ 0 & X = 0 \end{cases}$$
Both conditional expectations are random variables, as they should be. In particular, $E[Y|X]$ is $\frac{10^6}{2}$ times a Bernoulli(.01) indicator, so $Var(E[Y|X]) = \left(\frac{10^6}{2}\right)^2(.01)(.99) = \frac{10^{12}}{4}(.01)(.99)$; the mean gets squared because scaling a random variable by a constant scales its variance by that constant squared. That is where the book's left term comes from. The rest is just taking the expectation of both of these random variables and using the identity
$$Var(Y) = E[Y^2]-(E[Y])^2$$
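Plugging in the numbers as a check (values rounded):
$$E[Y] = (.01)\frac{10^6}{2} = 5{,}000, \qquad E[Y^2] = (.01)\frac{10^{12}}{3} \approx 3.333\times 10^{9}$$
$$Var(Y) \approx 3.333\times 10^{9} - (5{,}000)^2 \approx 3.308\times 10^{9}, \qquad \sqrt{Var(Y)} \approx 5.75\times 10^{4}$$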