Time for first disk failure which is exponentially distributed


This is not homework, but exam preparation.

Suppose disk $A$ fails after an exponentially distributed time with parameter $\lambda$ (so on average it fails after $\frac{1}{\lambda}$ time units), and disk $B$ fails after an exponentially distributed time with parameter $\mu$. The failure times of $A$ and $B$ are independent.

I am asked what is the average time it takes for the first failure to happen. I tackled the problem in the following way:

The probability that both disks have failed by time $t$ is $P(A\leq t)P(B\leq t) = (1-e^{-\lambda t})(1-e^{-\mu t})$. The probability that both disks survive past time $t$ is $P(A \geq t)P(B \geq t) = (1-P(A\leq t))(1-P(B\leq t)) = e^{-\lambda t}e^{-\mu t}$. So I claimed the time of the first failure is distributed as $1 - (1-e^{-\lambda t})(1-e^{-\mu t}) - e^{-\lambda t}e^{-\mu t}$, and the average time until the first failure is just the expected value of that distribution.

However, I saw that a student before me tried to solve it this way, and the grader marked it as completely wrong, with no elaboration.

What is wrong with this approach?

Best answer:

You are supposed to find the mean of the distribution of $\min(A,B)$. The expression $1 - (1-e^{-\lambda t})(1-e^{-\mu t}) - e^{-\lambda t}e^{-\mu t}$ is $P(\text{exactly one disk has failed by time } t)$; it is not a CDF (it rises and then decays back to $0$ as $t \to \infty$), so treating it as a distribution and taking its "expected value" is meaningless.
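A quick numerical check makes this concrete ($\lambda = 2$ and $\mu = 3$ are assumed example rates, not values from the question): the expression from the question cannot be a CDF, because a CDF must be nondecreasing and tend to $1$.

```python
import math

lam, mu = 2.0, 3.0  # assumed example rates for illustration

def g(t):
    # The expression from the question:
    # 1 - P(both failed by t) - P(both still alive at t)
    return (1
            - (1 - math.exp(-lam * t)) * (1 - math.exp(-mu * t))
            - math.exp(-(lam + mu) * t))

# A CDF must rise monotonically toward 1; g instead rises
# and then falls back toward 0 for large t.
print(g(0.1), g(1.0), g(10.0))
```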

$$P(\min(A,B) > t) = P(A>t)P(B > t) = (1-P(A \leq t))(1-P(B \leq t)) = e^{-\lambda t}e^{-\mu t}$$

$$f_{\min(A,B)}(t) = \frac{d}{dt}\,P(\min(A,B)\leq t) = \frac{d}{dt}\left(1 - e^{-(\lambda+\mu)t}\right) = (\lambda+\mu)e^{-(\lambda+\mu)t}$$

$$E(\min(A,B)) = \frac{1}{\lambda+\mu}$$
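A Monte Carlo sketch confirms this result (again with assumed example rates $\lambda = 2$, $\mu = 3$): the sample mean of $\min(A,B)$ should approach $\frac{1}{\lambda+\mu} = 0.2$.

```python
import random

lam, mu = 2.0, 3.0  # assumed example rates
n = 200_000
random.seed(0)

# Sample independent exponential failure times for each disk
# and average the time of the first failure.
total = 0.0
for _ in range(n):
    total += min(random.expovariate(lam), random.expovariate(mu))

sample_mean = total / n
print(sample_mean)  # close to 1/(lam + mu) = 0.2
```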