I want to solve the following exercise in "Statistical Models" by Davison:
A family has two children A and B. Child A catches an infectious disease D which is so rare that the probability that B catches it other than from A can be ignored. Child A is infectious for a time $U$ having probability density function $\alpha \exp(-\alpha u)$, $u \geq 0$, and in any small interval of time $[t, t + \delta t]$ in $[0, U)$, B will catch D from A with probability $\beta\delta t + o(\delta t)$, where $\alpha, \beta > 0$. Calculate the probability $\rho$ that B does catch D. Show that, in a family where B is actually infected, the density function of the time to infection is $\gamma\exp(-\gamma t)$, $t \geq 0$, where $\gamma = \alpha + \beta$.
For simplicity we can ignore the $o(\delta t)$ terms. If we let $T$ denote the infection time of B, then $\rho = P(T < \infty) = P(T \leq U)$, and using the conditioning version of the law of total probability I have computed that $\rho = \beta/(\alpha+\beta)$. To arrive at this I have further assumed that infections in disjoint intervals are independent.
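As a sanity check on my value of $\rho$ (not part of the exercise), here is a quick Monte Carlo sketch. It assumes, as in my independence assumption above, that the time at which B would be infected (were A to stay infectious forever) is an $\mathrm{Exp}(\beta)$ clock independent of $U$, so B is infected exactly when that clock rings before $U$:

```python
import random

random.seed(0)
alpha, beta = 1.5, 2.0  # arbitrary illustrative rates
n = 200_000

# B is infected iff an Exp(beta) "infection clock" rings before
# A's infectious period U ~ Exp(alpha) ends.
infected = sum(
    random.expovariate(beta) < random.expovariate(alpha)
    for _ in range(n)
)
print(infected / n)  # should be close to beta/(alpha+beta) ≈ 0.5714
```

Note that `random.expovariate` takes the *rate* $\lambda$, not the mean $1/\lambda$.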
I presume that the random variable to be considered in the second part of the question is $T \mid T < \infty$, and hence, using $\rho$ from before, we just need to compute the marginal probability $P(T \leq t)$. But I cannot see how this becomes $\rho$ times the CDF of an $\mathrm{Exp}(\alpha + \beta)$ distribution. I have also been toying with the idea of viewing the problem in terms of the minimum of two independent exponential random variables, but I cannot convince myself which random variables to consider. One of them must surely be $U$. But conditioned on $T < \infty$, is it not true that $T \sim T \wedge U$?
If $T \in [0, U] \cup \{\infty\}$ denotes the infection time of $B$, we have$$\mathrm P(T \in [t, t + \delta t] \mid \min(U, T) \ge t) = \beta \delta t + o(\delta t),$$ and likewise $$\mathrm P(\min(U, T) \in [t, t+\delta t] \mid \min(U, T) \ge t) = (\alpha + \beta)\delta t + o(\delta t).$$
We deduce that $\min(U, T) \sim \mathrm{Exp}(\alpha + \beta)$, so $$\begin{align*} \mathrm P(T \in [t, t + \delta t]) &= \mathrm P(\min(U, T) \ge t)\ \mathrm P(T \in [t, t + \delta t] \mid \min(U, T) \ge t)\\ &=\exp(-(\alpha+\beta) t)\ (\beta \delta t + o(\delta t))\\ &=\frac{\beta}{\alpha + \beta}(\alpha + \beta)\exp(-(\alpha + \beta)t)\delta t + o(\delta t).\end{align*}$$ We can then conclude that $\mathrm P(T < \infty) = \beta / (\alpha + \beta)$, and $T \mid T < \infty \sim \mathrm{Exp}(\alpha + \beta).$
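The conclusion can also be checked empirically. The sketch below (under the same hypothetical model as before: an $\mathrm{Exp}(\beta)$ infection clock $E$ independent of $U$, with B infected and $T = E$ exactly when $E < U$) compares the simulated infection probability, and the mean and median of $T$ among infected families, with the predicted $\beta/(\alpha+\beta)$, $1/(\alpha+\beta)$, and $\ln 2/(\alpha+\beta)$:

```python
import math
import random

random.seed(1)
alpha, beta = 1.5, 2.0  # arbitrary illustrative rates
n = 300_000

# Draw (U, E) pairs; B is infected exactly when E < U, and then T = E.
times = []
for _ in range(n):
    u = random.expovariate(alpha)  # A's infectious period
    e = random.expovariate(beta)   # putative infection time
    if e < u:
        times.append(e)

rho_hat = len(times) / n
mean_hat = sum(times) / len(times)
times.sort()
median_hat = times[len(times) // 2]

# Predicted: rho = beta/(alpha+beta), and T | T < infty ~ Exp(alpha+beta),
# whose mean is 1/(alpha+beta) and median is ln(2)/(alpha+beta).
print(rho_hat, mean_hat, median_hat)
```

Both the mean and the median of the conditional sample should match the $\mathrm{Exp}(\alpha+\beta)$ values, which is a reasonable (if informal) check that the whole conditional law, not just $\rho$, comes out as claimed.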