Finding $\hat{\theta}_{MLE}$ for $f(x; \theta) = (\theta + 1)x^\theta$


Let $X_1, \cdots, X_n$ be a random sample from the PDF: $f(x;\theta) = (\theta + 1) x^{\theta}$ with $0<x<1$ and $\theta > -1$.

The likelihood function is: \begin{align} L(\theta) &= f(x_1, \cdots, x_n; \theta) \mathbb{1}\{0<x<1\} \mathbb{1}\{\theta >-1\}\\ &= \prod_{i=1}^{n}{f(x_i ; \theta)}\mathbb{1}\{0<x<1\} \mathbb{1}\{\theta >-1\}\\ &= \prod_{i=1}^{n}{(\theta+1)x_i^{\theta}}\mathbb{1}\{0<x<1\} \mathbb{1}\{\theta >-1\} \\ &=(\theta+1)^n \left( \prod_{i=1}^n{x_i}\right)^\theta \end{align}

We work with the log-likelihood $l(\theta)$ and take its derivative, since it is easier to handle; this is allowed because the log is monotonic, so it preserves the location of the maximum:

\begin{align} l(\theta) = n\log(\theta+1) + \theta \left( \sum_{i=1}^n{\log(x_i)} \right) \end{align}

$\implies \frac{dl}{d\theta} = \frac{n}{\theta+1} + \sum_{i=1}^n{\log(x_i)} = 0$

$\implies \hat{\theta}_{MLE} = -\frac{n}{\sum_{i=1}^n{\log(x_i)}} - 1$
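As a quick check that this critical point is a maximum rather than a minimum, the second derivative of the log-likelihood is negative on the whole parameter space:

\begin{align} l''(\theta) = -\frac{n}{(\theta+1)^2} < 0 \quad \text{for all } \theta > -1, \end{align}

so $l$ is strictly concave and the critical point above is its unique maximizer.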

However, this doesn't look right.


There are 2 answers below.

Best answer:

Note that because each $x_i \in (0,1)$, we have $\log x_i \in (-\infty, 0)$, hence $$-\frac{n}{\sum \log x_i} \in (0, \infty).$$ It follows that $\hat \theta_{MLE} \in (-1, \infty)$, as desired. There is no issue.
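If you want to see this numerically, here is a minimal Python sketch (the helper names `sample_x` and `theta_mle` are mine, and the sampler uses inverse-transform sampling via the CDF $F(x) = x^{\theta+1}$ on $(0,1)$):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_x(theta, n):
    # Inverse-transform sampling: the CDF is F(x) = x^(theta + 1) on (0, 1),
    # so X = U^(1 / (theta + 1)) with U ~ Uniform(0, 1) has density (theta + 1) x^theta.
    u = rng.uniform(size=n)
    return u ** (1.0 / (theta + 1.0))

def theta_mle(x):
    # Closed-form MLE from the question: -n / sum(log x_i) - 1.
    return -len(x) / np.sum(np.log(x)) - 1.0

theta_true = 2.5
x = sample_x(theta_true, 100_000)
theta_hat = theta_mle(x)
print(theta_hat)       # close to 2.5
print(theta_hat > -1)  # always True, because sum(log x_i) < 0

# Brute-force comparison: maximize the log-likelihood on a grid over (-0.99, 10].
grid = np.linspace(-0.99, 10.0, 20_000)
loglik = len(x) * np.log(grid + 1.0) + grid * np.sum(np.log(x))
print(grid[np.argmax(loglik)])  # agrees with theta_hat up to grid resolution
```

The estimate lands close to the true value and strictly above $-1$, which is exactly the point above.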

Also note that your computation of the likelihood should be more precisely written as $$\mathcal L(\theta \mid \boldsymbol x) = \prod_{i=1}^n (\theta+1) x_i^\theta \mathbb 1 (0 < \color{red}{x_i} < 1) \mathbb 1 (\theta > -1).$$ This in turn is equivalent to $$\mathcal L (\theta \mid \boldsymbol x) = (\theta+1)^n \left(\prod_{i=1}^n x_i \right)^\theta \mathbb 1(0 < x_{(1)}) \mathbb 1(x_{(n)} < 1) \mathbb 1 (\theta > -1).$$

Second answer:

Note that your parameter space is $\Theta = [-1,\infty)$ (the closure of the stated range $\theta > -1$), so your computed MLE should lie in that space; if the stationary point fell outside it, the maximum over $\Theta$ would be attained at the boundary, $\arg \max_{\theta\in \Theta}\mathcal{L}(\theta;X)=-1$, i.e., $$ \hat{\theta}_{MLE} = \max\left\{-1,\; -\frac{n}{\sum_{i=1}^n \log(x_i)}-1 \right\}. $$
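For completeness, that clipping can be written as a small sketch (the function name is mine; as the accepted answer notes, the clip never actually binds in this model, since $\sum_{i} \log(x_i) < 0$ already forces the stationary point above $-1$):

```python
import numpy as np

def theta_mle_clipped(x):
    # Stationary point of the log-likelihood, clipped back to the closure
    # of the parameter space. Here the clip is inactive (sum(log x_i) < 0
    # implies the stationary point exceeds -1), but the pattern matters in
    # models where the unconstrained maximizer can leave the parameter space.
    x = np.asarray(x)
    stationary = -len(x) / np.sum(np.log(x)) - 1.0
    return max(-1.0, stationary)
```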