I want to find the maximum likelihood estimate of $\theta$ for the density $P(x\mid\theta) = 1/(1-\theta)$ on $\theta \le x \le 1$.
I end up with the log-likelihood $-n\log(1-\theta)$, and when I take the derivative I get $n/(1-\theta)$. How should I proceed? I cannot set this equal to zero and solve for $\theta$.
I suspect this is related to order statistics, but I am not sure how to derive the MLE.
When you cannot set the derivative to zero, the maximum occurs at a boundary. Indeed, it is clear the maximum occurs at $\theta=\min x_i$ if you actually keep the indicator in your likelihood function: $$P(x\mid\theta)=(1-\theta)^{-1}\,1_{\theta\leq x\leq 1}$$ So with independent $x_1,\dots,x_n$, we get $$L(\theta)=L(\theta\mid x_1,\dots,x_n)=(1-\theta)^{-n}\,1_{\theta\leq\min x_i}\,1_{\max x_i\leq 1}$$ On the region where the likelihood is positive, $$ (\log L)'(\theta\mid x_1,\dots,x_n)=\frac{n}{1-\theta}>0,\qquad \theta<\min x_i\leq\max x_i\leq 1, $$ while $L(\theta)=0$ whenever $\theta>\min x_i$ or $\max x_i>1$. So $L$ is strictly increasing on $(-\infty,\min x_i]$ and drops to zero beyond $\min x_i$; hence the maximum occurs at $\theta=\min x_i$ by left-continuity of $L(\theta)$.
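A quick numerical sanity check of this argument (a sketch; the function names and the simulation setup are my own, and sampling assumes $X$ is uniform on $[\theta,1]$, which is exactly the density $1/(1-\theta)$ above):

```python
import random

def sample(theta, n, rng):
    # X uniform on [theta, 1] has density 1/(1-theta) on that interval.
    return [theta + (1 - theta) * rng.random() for _ in range(n)]

def likelihood(theta, xs):
    # (1-theta)^(-n) times the indicators theta <= min x_i and max x_i <= 1.
    if theta <= min(xs) and max(xs) <= 1:
        return (1 - theta) ** (-len(xs))
    return 0.0

rng = random.Random(0)
true_theta = 0.3
xs = sample(true_theta, 1000, rng)

# The boundary maximizer from the argument above:
theta_hat = min(xs)
```

Evaluating `likelihood` at a few points confirms the picture: it increases as $\theta$ approaches $\min x_i$ from the left and is identically zero past it, so `theta_hat` sits just above the true $\theta$ (it converges to $\theta$ as $n$ grows).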