I am reading my textbook and found a strange phenomenon. The example says that Anne and Betty enter a beauty parlor simultaneously, Anne to get a manicure and Betty to get a haircut. Suppose the time for a manicure (haircut) is exponentially distributed with mean 20 (30) minutes. I want to calculate the expected time until the first customer completes service.
Calculation in the text:
Since the total service rate is $1/30+1/20=5/60$, the time until the first customer completes service should be exponential with mean $60/5=12$ minutes.
This sounds unreasonable, because intuitively the expected time until the first customer is done should lie between the mean service times of the two customers. Namely, I think the correct answer should be between 20 minutes and 30 minutes. How can we expect the first customer to be done in 12 minutes on average when each service takes at least 20 minutes on average?
No, that intuition is not right. We are taking the smaller of the two times, so we should expect to wait less than either mean, not something in between.
Let $A$ ($B$) be the time to complete Anne's (Betty's) service. Then we are interested in $X = \min(A,B)$. Since $X \le A$, we must have $E(X) \le E(A) = 20$, so an answer below 20 minutes is no contradiction. In fact, assuming $A$ and $B$ are independent, $P(X > t) = P(A > t)\,P(B > t) = e^{-t/20}e^{-t/30} = e^{-t(1/20+1/30)}$, so $X$ is exponential with rate $1/20 + 1/30 = 1/12$, and $E(X) = 12$ minutes, exactly as the textbook says.
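If you still don't trust the algebra, a quick Monte Carlo check makes the point. The sketch below (names and sample size are my own choices, not from the textbook) draws independent exponential service times with means 20 and 30 and averages the minimum:

```python
import random

def mean_min_service(mean_a=20.0, mean_b=30.0, trials=200_000, seed=1):
    """Monte Carlo estimate of E[min(A, B)] for independent
    exponential service times with the given means."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        a = rng.expovariate(1.0 / mean_a)  # manicure time, mean 20 min
        b = rng.expovariate(1.0 / mean_b)  # haircut time, mean 30 min
        total += min(a, b)
    return total / trials

print(mean_min_service())  # close to 12, not between 20 and 30
```

Note that `random.expovariate` takes the rate (1/mean) as its argument, which mirrors the rate addition $1/20 + 1/30 = 1/12$ in the calculation above.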