Finding the maximum likelihood estimator for statistics


So I worked out the maximum likelihood estimator of

$f(x; \alpha) = \alpha x^{-\alpha-1}$ for $x \ge 1,$ where $\alpha > 0$ is a parameter, as follows:

$$L(\alpha) = \prod_{i=1}^n \alpha x_i^{-\alpha-1} = \alpha^n \prod_{i=1}^n x_i^{-\alpha-1}$$

Then:

$$\log L(\alpha) = n\log(\alpha) + \sum_{i=1}^n (-\alpha-1) \log(x_i)$$

So:

$$\log L(\alpha) = n\log(\alpha) - (\alpha+1) \sum_{i=1}^n \log(x_i),$$

and setting $\dfrac{d}{d\alpha}\log L(\alpha) = \dfrac{n}{\alpha} - \sum_{i=1}^n \log(x_i) = 0$ gives $\hat{\alpha} = \dfrac{n}{\sum_{i=1}^n \log(x_i)}$.
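As a sanity check on that closed form, here is a minimal numerical sketch in Python; the true $\alpha$, sample size, and search grid are arbitrary choices, and the sampler assumes the standard Pareto support $x \ge 1$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary choices for illustration
alpha_true, n = 2.5, 10_000

# Sample from f(x; alpha) = alpha * x^(-alpha - 1), x >= 1, via the inverse CDF:
# F(x) = 1 - x^(-alpha)  =>  X = (1 - U)^(-1/alpha) with U ~ Uniform(0, 1)
u = rng.uniform(size=n)
x = (1.0 - u) ** (-1.0 / alpha_true)

# Closed-form MLE from the derivation above
alpha_hat = n / np.log(x).sum()

# Grid-search check: maximize log L(alpha) = n*log(alpha) - (alpha + 1) * sum(log x_i)
alphas = np.linspace(0.1, 5.0, 2_000)
loglik = n * np.log(alphas) - (alphas + 1.0) * np.log(x).sum()
alpha_grid = alphas[np.argmax(loglik)]

print(alpha_hat, alpha_grid)  # both should be close to alpha_true
```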

However, I'm not sure how to find the maximum likelihood estimator for

$$f(x;\theta) = \frac{1}{2(\theta x)^{1/2}} \text{ for } 0<x\le\theta$$

Any advice would be much appreciated!


There are 2 solutions below.


Note that for the second problem, if $\theta < x$ then $f(x;\theta) = 0$. In particular,
$$ \sup_{\theta>0} f(x;\theta) = \sup_{\theta\geq x} f(x;\theta) = \frac{1}{2\sqrt{x}} \sup_{\theta\geq x} \frac{1}{\sqrt{\theta}} = \frac{1}{2\sqrt{x}} \cdot \frac{1}{\sqrt{x}} = f(x;x). $$
Thus $\hat{\theta}(x) = x$ for a sample of size $1$. For a sample of size $n$, a similar approach can be taken; in that case $\hat{\theta}(\mathbf{x}) = \max(x_1, \dotsc, x_n)$.
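A quick way to see the single-observation claim is to evaluate $f(x;\theta)$ on a grid of $\theta$ values; the Python sketch below (with an arbitrary observation and grid) finds the supremum at $\theta = x$:

```python
import numpy as np

x = 3.7  # a single observation, arbitrary value for illustration

# f(x; theta) = 1 / (2 * sqrt(theta * x)) for 0 < x <= theta, and 0 otherwise
def density(x, theta):
    return np.where(theta >= x, 1.0 / (2.0 * np.sqrt(theta * x)), 0.0)

thetas = np.linspace(0.1, 20.0, 100_000)
vals = density(x, thetas)

print(thetas[np.argmax(vals)])    # ~ 3.7, i.e. theta_hat = x
print(vals.max(), 1.0 / (2 * x))  # the supremum is f(x; x) = 1/(2x)
```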


\begin{align} L(\theta) & = \left.\begin{cases} \displaystyle \prod_{i=1}^n \frac 1 {2(\theta x_i)^{1/2}} & \text{if } \theta \ge \text{all of }x_1,\ldots,x_n \\[8pt] 0 & \text{otherwise} \end{cases}\right\} \\[15pt] & = \left. \begin{cases} \displaystyle \frac 1 {2^n\theta^{n/2}} \cdot \frac 1 {\prod_{i=1}^n x_i^{1/2}} & \text{if } \theta\ge\max\{x_1,\ldots,x_n\} \\[8pt] 0 & \text{otherwise} \end{cases} \right\}. \end{align}

As $\theta$ decreases, $L(\theta)$ increases until $\theta$ gets down to $\max\{x_1,\ldots,x_n\}.$ So the MLE for $\theta$ is $\max\{x_1,\ldots,x_n\}.$
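The same conclusion can be checked numerically for a whole sample. The Python sketch below (with a made-up $\theta$ and $n$) samples via the inverse CDF $F(x) = \sqrt{x/\theta}$, i.e. $X = \theta U^2$, and confirms that the grid maximizer of $\log L(\theta)$ sits at $\max\{x_1,\ldots,x_n\}$:

```python
import numpy as np

rng = np.random.default_rng(1)

theta_true, n = 4.0, 500                   # arbitrary choices for illustration
x = theta_true * rng.uniform(size=n) ** 2  # inverse-CDF sampling: X = theta * U^2

def log_likelihood(theta, x):
    # log L(theta) = -n*log(2) - (n/2)*log(theta) - (1/2)*sum(log x_i)  for theta >= max(x_i)
    if theta < x.max():
        return -np.inf
    n_obs = len(x)
    return -n_obs * np.log(2.0) - 0.5 * n_obs * np.log(theta) - 0.5 * np.log(x).sum()

thetas = np.linspace(0.5 * x.max(), 2.0 * x.max(), 10_000)
loglik = np.array([log_likelihood(t, x) for t in thetas])

theta_hat = thetas[np.argmax(loglik)]
print(theta_hat, x.max())  # agree up to the grid spacing: the MLE is max(x_1, ..., x_n)
```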