Network blackouts occur at an average rate of 5 blackouts per month. Assuming a suitable continuous-time counting process,
a. Compute the probability of more than 3 blackouts during a given month. b. Each blackout costs $1500 for computer assistance and repair. Find the expected monthly cost due to blackouts. c. Compute the standard deviation of the monthly cost due to blackouts.
What I've done so far: A: 0.735. B: $7500. C: I tried taking the square root of B, but that's not the answer.
The usual model is Poisson, parameter $\lambda=5$. The probability of more than $3$ blackouts is $1$ minus the probability of $\le 3$ blackouts. The probability of $\le 3$ blackouts is $$e^{-5}\left(1+\frac{5}{1!}+\frac{5^2}{2!}+\frac{5^3}{3!}\right).$$
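If you want to check the arithmetic, a quick numerical sketch (assuming the Poisson model with $\lambda=5$, as above):

```python
import math

lam = 5  # mean number of blackouts per month

# P(X <= 3) = e^{-lam} * sum_{k=0}^{3} lam^k / k!
p_le_3 = math.exp(-lam) * sum(lam**k / math.factorial(k) for k in range(4))
p_more_than_3 = 1 - p_le_3
print(round(p_more_than_3, 3))  # 0.735
```

This agrees with the $0.735$ you found for part (a).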
The mean monthly cost is indeed $7500$.
If $X$ is the number of blackouts, then the cost $Y$ is $1500X$. The standard deviation of $Y$ is $1500$ times the standard deviation of $X$. The variance of the Poisson with parameter $\lambda$ is $\lambda$, so $X$ has standard deviation $\sqrt{5}$, and $Y$ has standard deviation $1500\sqrt{5}\approx 3354.10$.
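Numerically, under the same Poisson assumption:

```python
import math

lam = 5            # Poisson mean: blackouts per month
cost_per = 1500    # dollars per blackout

mean_cost = cost_per * lam           # E[Y] = 1500 * E[X]
sd_cost = cost_per * math.sqrt(lam)  # SD(Y) = 1500 * SD(X) = 1500 * sqrt(5)
print(mean_cost)            # 7500
print(round(sd_cost, 2))    # 3354.1
```

The key point for part (c) is that the scaling constant multiplies the standard deviation, not the variance: $\mathrm{Var}(1500X)=1500^2\,\mathrm{Var}(X)$, so $\mathrm{SD}(1500X)=1500\sqrt{\lambda}$, which is not the square root of the answer to (b).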