Finding the MLE of $P(x\leq y)$


I have a similar question to the one posted here: Maximum likelihood estimator of $P(X < y)$ for fixed $y$. However, the answer there seems quite vague, and I can't draw any conclusions from it.

Could someone explain this better or post a solution? Thanks.

Accepted answer:

Working with the example in your link, suppose $X_1,\dots,X_n$ are i.i.d., each with density $$f_X(x;\theta)=(1+\theta)x^\theta,\quad x\in [0,1],\quad \theta>-1.$$

We have likelihood $$L(x_1,\dots,x_n;\theta)=\prod_i f_X(x_i;\theta)=(1+\theta)^n\prod_i x_i^{\theta}.$$
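Taking logs (which is what the first-order condition below is applied to) gives

$$\log L(x_1,\dots,x_n;\theta)=n\log(1+\theta)+\theta\sum_i \log x_i.$$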

We also have $G(y;\theta)\equiv P(X_i\leq y)=y^{1+\theta }$ for fixed $y\in(0,1)$. We wish to find an MLE for $G(y;\theta).$
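For completeness, the expression for $G$ follows by integrating the density:

$$G(y;\theta)=P(X_i\leq y)=\int_0^y (1+\theta)x^\theta\,dx=\left[x^{1+\theta}\right]_0^y=y^{1+\theta}.$$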

Now, if the solution is an interior one where the first-order condition holds, you can appeal to the chain rule to obtain the first-order condition with respect to $G(y;\theta)$:

$$0=\partial_{G(y;\theta)} \log L(x_1,...,x_n;\theta)=\frac{\partial_{\theta} \log L(x_1,...,x_n;\theta)}{\partial_{\theta} G(y;\theta)}.$$

Note that $G(y;\theta)=y^{1+\theta}$ is strictly decreasing in $\theta$ for fixed $y\in(0,1)$, since $\log y<0$; in particular, $\partial_{\theta} G(y;\theta)\neq 0$, so the denominator above is well defined. Hence the FOC here is the same as the FOC for an MLE for $\theta$:

$$0=\partial_{\theta} \log L(x_1,...,x_n;\theta)\implies \hat\theta =-\left(\frac{1}{\frac{1}{n}\sum_i\log x_i}+1\right),$$
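As a quick numerical sanity check of this formula (a sketch, not part of the original answer: the true parameter value and sample size below are arbitrary choices), we can sample from the density via inverse-CDF sampling, since $G(y;\theta)=y^{1+\theta}$ inverts to $X=U^{1/(1+\theta)}$ for uniform $U$, and then verify that $\hat\theta$ recovers the true $\theta$:

```python
import math
import random

random.seed(0)
theta = 2.0    # true parameter, chosen arbitrarily for this demo
n = 100_000

# Inverse-CDF sampling: CDF is y^(1+theta), so X = U^(1/(1+theta))
xs = [random.random() ** (1.0 / (1.0 + theta)) for _ in range(n)]

# MLE from the first-order condition: theta_hat = -(1/mean(log x) + 1)
mean_log = sum(math.log(x) for x in xs) / n
theta_hat = -(1.0 / mean_log + 1.0)

print(theta_hat)  # should be close to the true theta = 2.0
```

Here $E[\log X]=-1/(1+\theta)$, so the sample mean of $\log x_i$ concentrates near $-1/(1+\theta)$ and $\hat\theta$ concentrates near $\theta$.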

giving an MLE for $G(y;\theta)$ of $G(y;\hat\theta)=y^{1+\hat\theta}$. This is just the invariance property of maximum likelihood estimators applied to the monotone transformation $\theta\mapsto G(y;\theta)$.