The random variables $X_1,\dots,X_n$ are independent draws from a continuous uniform distribution with support $[0,\theta]$. Derive the method of moments and maximum likelihood estimators of $\theta$. Your research assistant has constructed new random variables $Y_i$ such that:
$$Y_i = \begin{cases} 0, & \text{if } X_i \le k \\ 1, & \text{if } X_i > k \end{cases}$$
where $k$ is a constant chosen by the RA and known to you. What are the maximum likelihood and method of moments estimators in this case? Assume $k < \theta$.
After working through it, these are the answers I am getting:
First part: By the method of moments: $\hat{\theta} = 2\bar{X}$
By MLE: $\hat{\theta} = \max_i\{X_i\}$
Second part: By the method of moments: $\hat{\theta} = \frac{k}{1-\overline{Y}}$
By MLE: $\hat{\theta} =\frac{k}{1-\overline{Y}}$ (same as above)
Please tell me which answer, if any, is wrong so that I can redo that part.
PS: This question was asked in an exam a couple of years back, so I don't know the correct answers.
All seem right to me.
The first part is well known.
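For completeness, here is a one-line sketch of the standard derivations for the first part: matching the first moment, and noting that the likelihood is decreasing in $\theta$ on the admissible range,
$$E[X_i]=\frac{\theta}{2}\ \Rightarrow\ \hat{\theta}_{MM}=2\bar{X}, \qquad L(\theta)=\theta^{-n}\,\mathbf{1}\{\max_i X_i\le\theta\}\ \Rightarrow\ \hat{\theta}_{ML}=\max_i X_i.$$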
For the second part, note that each $Y_i$ is a Bernoulli variable with success probability $p = P(X_i > k) = 1 - k/\theta$. Because $k$ is known, $p$ and $\theta$ are in one-to-one correspondence via $p = g(\theta) = 1 - k/\theta$, so either can be regarded as the parameter of $Y$. By the functional invariance property of the ML estimator, we can first compute $\hat{p}_{ML}$ (which, for a Bernoulli sample, is just the sample mean $\bar{Y}$) and then transform it:
$$ \widehat\theta_{ML}=\frac{k}{1-\hat{p}_{ML}}=\frac{k}{1-\bar{Y}}$$
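As a sanity check (not part of the original question), here is a small simulation sketch: the values of $\theta = 5$, $k = 2$, and the sample size are arbitrary choices, and all three estimators above should land close to the true $\theta$ (the MLE $\max_i X_i$ is biased slightly downward, which is visible at small $n$).

```python
# Simulation check of the estimators above, with assumed values
# theta = 5, k = 2: draw X_i ~ Uniform(0, theta), set Y_i = 1{X_i > k}.
import numpy as np

rng = np.random.default_rng(0)
theta, k, n = 5.0, 2.0, 100_000

x = rng.uniform(0, theta, size=n)
y = (x > k).astype(float)

mom_x = 2 * x.mean()        # first part, method of moments: 2 * Xbar
mle_x = x.max()             # first part, MLE: max of the X_i
est_y = k / (1 - y.mean())  # second part, MoM = MLE: k / (1 - Ybar)

print(mom_x, mle_x, est_y)  # each should be close to theta = 5.0
```

With $n = 100{,}000$ draws, all three estimates agree with $\theta$ to within a few hundredths, which is consistent with every answer in the question being correct.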