Let $(X_1, Y_1),\ldots, (X_n, Y_n)$ be a random sample of size $n$ from the continuous distribution with joint pdf:
$$f_{X, Y} (x, y\mid\theta) = \frac{1}{\theta y}\exp\left(\frac{-x}{\theta y}\right)I_{(0,\infty)}(x) I_{(0,1)}(y) I_{(0,\infty)}(\theta)$$
(a) Find a complete and sufficient statistic for $\theta$.
(b) Find the maximum likelihood estimator (MLE) for $\theta$.
(c) Find the MLE for $P(X < Y)$.
(d) Let $V= \frac{X}{\theta}$ and $W = Y$. Show that the joint distribution of $(V, W)$ does not depend on $\theta$.
(e) Define the statistic $S((X_1, Y_1),\ldots,(X_n, Y_n))$ by:
$$S((X_1, Y_1),\ldots,(X_n, Y_n)) = \frac{\sum\limits_{i=1}^n X_i Y_i} {\sum\limits_{j=1}^n X_j}$$
Show that $S((X_1, Y_1),\ldots,(X_n, Y_n))$ is an ancillary statistic for the model $f_{X,Y}(x,y\mid \theta)$. Note that you cannot claim that it belongs to a scale-parameter family, since the pdf here is a joint distribution. Thus, to show that $S$ is ancillary, I will have to show directly that its distribution does not depend on $\theta$.
I have completed parts (a), (b), and (d), but I am stuck on how to start the last two parts. Any help would be greatly appreciated. For part (c) I thought to use the invariance property of MLEs, but I wasn't sure how to apply it. For part (e), I have no idea where to start.
I'm pretty sure the reason part (d) is there is that you're supposed to do part (e) by using the result of part (d). Notice that $\dfrac{\sum_i V_i W_i}{\sum_i V_i}= \dfrac{\sum_i X_i Y_i}{\sum_i X_i}$. Since part (d) shows that the joint distribution of the $(V_i, W_i)$ does not depend on $\theta$, the distribution of $S$, being a function of the $(V_i, W_i)$ alone, does not depend on $\theta$ either.
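A quick numerical sanity check of the ancillarity claim (just a sketch in Python with NumPy; the sample size, seed, and $\theta$ values are arbitrary choices). Marginally $Y\sim\text{Uniform}(0,1)$, and given $Y=y$, $X$ is exponential with mean $\theta y$, so we can simulate the model by scaling standard exponential draws. Reusing the same underlying draws under two very different values of $\theta$ leaves $S$ essentially unchanged, because $S$ depends on the data only through $V_i = X_i/\theta$ and $W_i = Y_i$:

```python
import numpy as np

def simulate_S(theta, n, seed):
    """Draw (X_i, Y_i) from the model and return S = sum(X*Y) / sum(X).

    Marginally Y ~ Uniform(0,1); given Y = y, X is exponential with
    mean theta*y (this factorization follows from the joint pdf).
    """
    rng = np.random.default_rng(seed)
    y = rng.uniform(0.0, 1.0, size=n)
    e = rng.exponential(1.0, size=n)   # standard exponential draws
    x = theta * y * e                  # X = theta*Y*E, so X/theta is theta-free
    return np.sum(x * y) / np.sum(x)

# Same underlying draws, wildly different theta: S agrees to floating-point
# precision, since theta cancels from numerator and denominator.
s1 = simulate_S(theta=1.0, n=10_000, seed=42)
s2 = simulate_S(theta=250.0, n=10_000, seed=42)
print(s1, s2, abs(s1 - s2))
```

This is not a proof, of course, but it is a cheap way to catch an algebra slip before writing up the argument.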
For part (c), you need to find a function of $\theta$: $$ g(\theta) = \Pr(X<Y). $$ If you have the MLE $\hat\theta$ of $\theta$, then $g(\hat\theta)$ is the MLE for $\Pr(X<Y)$.
After a hasty computation I get $\Pr(X<Y)= 1 - e^{-1/\theta}$, so its MLE should be $1-e^{-1/\hat\theta}$.
....and for the MLE for $\theta$ I get $\displaystyle\frac1n\sum_{i=1}^n \dfrac{X_i}{Y_i}$, so the MLE for $\Pr(X<Y)$ should be $$ 1 - \exp\left( \frac{-n}{\sum_{i=1}^n (X_i/Y_i)} \right). $$
Later addendum: The joint distribution of $(X,Y)$ is \begin{align} & \phantom{{}=} \frac1y\exp\left(\frac{-x}{\theta y}\right)\,\frac{dx}{\theta}\, dy\qquad\text{on }x>0,\ 0<y<1. \\[8pt] & = \frac1y\exp\left(\frac {-v}{y}\right)\,dv\,dy\qquad\text{on }v>0,\ 0<y<1. \end{align} We want $\Pr(X<Y)$. That is $$ \int_0^1\int_0^y\frac1y\exp\left(\frac{-x}{\theta y}\right)\,\frac{dx}{\theta}\, dy. $$ In the inner integral, $x$ goes from $0$ to $y$. Then \begin{align} & \phantom{{}=} \int_0^1 \int_0^y\frac1y\exp\left(\frac{-x}{\theta y}\right)\,\frac{dx}{\theta}\, dy = \int_0^1 \int_0^{y/\theta} \frac1y\exp\left(\frac {-v}y\right) \, dv\,dy \\[8pt] & = \int_0^1 \left[ -\exp\left(\frac {-v}y\right)\right]_{v=0}^{v=y/\theta} \, dy = \int_0^1 (1 - e^{-1/\theta}) \, dy = 1 - e^{-1/\theta}. \end{align}
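The closed form $\Pr(X<Y)=1-e^{-1/\theta}$ is easy to check by Monte Carlo (a rough sketch; the value of $\theta$ and the sample size are arbitrary):

```python
import math
import numpy as np

theta = 0.7                                    # arbitrary test value
rng = np.random.default_rng(0)
n = 400_000
y = rng.uniform(0.0, 1.0, size=n)              # Y ~ Uniform(0,1) marginally
x = rng.exponential(1.0, size=n) * theta * y   # X | Y=y ~ Exponential, mean theta*y

empirical = np.mean(x < y)                     # Monte Carlo estimate of P(X < Y)
closed_form = 1.0 - math.exp(-1.0 / theta)
print(empirical, closed_form)
```

With this many draws the two numbers should agree to roughly two decimal places.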
Second later addendum: The likelihood function is $$ L(\theta) = \prod_{i=1}^n \frac{1}{\theta y_i}\exp\left(\frac{-x_i}{\theta y_i}\right) = \frac{1}{\theta^n \prod_{i=1}^n y_i}\exp\left( \frac{-1}\theta \sum_{i=1}^n \frac{x_i}{y_i} \right) $$ So $$ \ell(\theta)=\log L(\theta) = -n\log\theta -\frac1\theta\sum_{i=1}^n\frac{x_i}{y_i}+\text{constant} $$ and then $$ \ell\;'(\theta) = \frac{-n}\theta + \frac{1}{\theta^2}\sum_{i=1}^n\frac{x_i}{y_i} =\frac{1}{\theta^2}\left( -n\theta+\sum_{i=1}^n\frac{x_i}{y_i} \right). $$ Finally we get $$ \begin{cases} \ell\;'(\theta) > 0 & \text{if } 0\le\theta<\frac1n\sum_{i=1}^n\frac{x_i}{y_i}, \\[10pt] \ell\;'(\theta) = 0 & \text{if } \theta=\frac1n\sum_{i=1}^n\frac{x_i}{y_i}, \\[10pt] \ell\;'(\theta) < 0 & \text{if } \theta>\frac1n\sum_{i=1}^n\frac{x_i}{y_i}. \end{cases} $$ So $\ell$ increases up to $\frac1n\sum_{i=1}^n x_i/y_i$ and decreases thereafter, confirming that $\hat\theta = \frac1n\sum_{i=1}^n X_i/Y_i$ is the global maximizer, i.e. the MLE.
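The sign analysis above can also be checked numerically (again just a sketch with simulated data; the true $\theta$, sample size, and grid are arbitrary choices): evaluating $\ell(\theta)$ on a grid, the argmax should land on $\frac1n\sum_i x_i/y_i$ up to the grid spacing.

```python
import numpy as np

rng = np.random.default_rng(1)
theta_true = 2.0
n = 5_000
y = rng.uniform(0.0, 1.0, size=n)
x = rng.exponential(1.0, size=n) * theta_true * y

def log_lik(theta):
    # log L(theta) = -n*log(theta) - (1/theta) * sum(x_i/y_i) + constant
    return -n * np.log(theta) - np.sum(x / y) / theta

mle = np.mean(x / y)                       # closed-form stationary point
grid = np.linspace(0.5 * mle, 2.0 * mle, 2001)
grid_argmax = grid[np.argmax(log_lik(grid))]
print(mle, grid_argmax)
```

Note also that $X_i/Y_i$ is exponential with mean $\theta$, so `mle` should itself be close to `theta_true` for a sample this large.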