Let $X$ and $Y$ be independent exponential random variables, with
$$f(x\mid\lambda)=\frac{1}{\lambda}\exp{\left(-\frac{x}{\lambda}\right)},\,x>0\,, \qquad f(y\mid\mu)=\frac{1}{\mu}\exp{\left(-\frac{y}{\mu}\right)},\,y>0$$
We observe $Z$ and $W$ with $Z=\min(X,Y)$ and $W=\begin{cases} 1, &\text{if }Z=X\\ 0, &\text{if }Z=Y \end{cases}$
I have obtained the joint distribution of $Z$ and $W$, i.e., $$P(Z \leq z, W=0)=\frac{\lambda}{\mu+\lambda}\left[1-\exp{\left(-\left(\frac{1}{\mu}+\frac{1}{\lambda}\right)z\right)}\right]$$
$$P(Z \leq z, W=1)=\frac{\mu}{\mu+\lambda}\left[1-\exp{\left(-\left(\frac{1}{\mu}+\frac{1}{\lambda}\right)z\right)}\right]$$
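As a quick sanity check of these formulas, the joint distribution can be verified by simulation (a Monte Carlo sketch assuming numpy; the chosen values of $\lambda$, $\mu$, and the test point $z_0$ are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, mu, n = 2.0, 3.0, 1_000_000

# X ~ Exp(mean lam), Y ~ Exp(mean mu); numpy's `scale` is the mean,
# matching the parameterisation f(x|lam) = (1/lam) exp(-x/lam).
x = rng.exponential(scale=lam, size=n)
y = rng.exponential(scale=mu, size=n)
z = np.minimum(x, y)
w = (x < y).astype(int)  # W = 1 when Z = X

# Empirical vs. theoretical P(Z <= z0, W = 1) at a test point z0.
z0 = 1.0
empirical = np.mean((z <= z0) & (w == 1))
theoretical = mu / (mu + lam) * (1 - np.exp(-(1 / mu + 1 / lam) * z0))
print(empirical, theoretical)  # the two values should agree to ~3 decimals
```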
Now assume that $(Z_i,W_i),\,i=1,\ldots,n$, are $n$ i.i.d. observations. Find the MLEs of $\lambda$ and $\mu$.
(This is Exercise 7.14 of Statistical Inference, 2nd edition, by Casella and Berger, but no solution is given there.)
The $W=0$ and $W=1$ cases can be combined into a single expression: $$P(Z \leq z, W=w)=\frac{\lambda^{1-w}\mu^{w}}{\mu+\lambda}\left[1-\exp{\left(-\left(\frac{1}{\mu}+\frac{1}{\lambda}\right)z\right)}\right].$$
Differentiating with respect to $z$ gives the joint density $f(z,w)=\frac{1}{\lambda^{w}\mu^{1-w}}e^{-\left(\frac{1}{\lambda}+\frac{1}{\mu}\right)z}$, so the likelihood of the sample is $$L(\lambda,\mu)=\prod_{i=1}^n\frac{1}{\lambda^{w_i}\mu^{1-w_i}}\,e^{-\left(\frac{1}{\lambda}+\frac{1}{\mu}\right)z_i}.$$ Taking logs gives the log-likelihood $$\ell(\lambda,\mu)=-\sum_{i=1}^n\left(w_i\ln{\lambda}+(1-w_i)\ln{\mu}+\left(\frac{1}{\lambda}+\frac{1}{\mu}\right)z_i\right).$$
Setting the partial derivatives to zero, $$\frac{\partial\ell}{\partial\lambda}=-\sum_{i=1}^n\left(\frac{w_i}{\lambda}-\frac{z_i}{\lambda^2}\right)=0 \quad\Longrightarrow\quad \hat{\lambda}=\frac{\sum_i z_i}{\sum_i w_i}=\frac{\bar{z}}{\bar{w}},$$ and by symmetry $\hat{\mu}=\frac{\bar{z}}{1-\bar{w}}$.
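The closed-form estimators can also be checked numerically: by the law of large numbers, $\bar{z}/\bar{w}$ should converge to $\lambda$ and $\bar{z}/(1-\bar{w})$ to $\mu$. A short simulation sketch (assuming numpy; the parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
lam, mu, n = 2.0, 3.0, 500_000

# Simulate the censored observations (Z, W) directly from X and Y.
x = rng.exponential(scale=lam, size=n)  # numpy's `scale` is the mean
y = rng.exponential(scale=mu, size=n)
z = np.minimum(x, y)
w = (x < y).astype(float)  # W = 1 when Z = X

lam_hat = z.mean() / w.mean()         # MLE: z-bar / w-bar
mu_hat = z.mean() / (1 - w.mean())    # MLE: z-bar / (1 - w-bar)
print(lam_hat, mu_hat)  # should be close to (2.0, 3.0)
```

Here $E[Z]=\frac{\lambda\mu}{\lambda+\mu}$ and $E[W]=\frac{\mu}{\lambda+\mu}$, so the ratio indeed recovers $\lambda$.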
Note that if $\lambda=\mu$ then $E[W]=\frac{\mu}{\mu+\lambda}=\frac{1}{2}$, so $\bar{w}\approx\frac{1}{2}$ and the two estimates become approximately equal.