Self-Study, Minimal Sufficient Statistic, MLE, Beta, Correct Argument


The following exercise is taken from Hogg's Introduction to Mathematical Statistics (7th ed.): show that the MLE is a minimal sufficient statistic. I am not 100% sure my argument is correct.

Given: $X_1, \dots, X_n \overset{iid}{\sim} Beta(1, \theta)$

  1. Deriving the ML-Estimator:

$$ f_X(x) = \frac{\Gamma(1+\theta)}{\Gamma(1)\Gamma(\theta)} x^{0}(1 - x)^{\theta -1} = \frac{\theta\,\Gamma(\theta)}{\Gamma(\theta)}(1 - x)^{\theta -1} = \theta(1 - x)^{\theta -1}, \quad x \in (0,1), $$

using the functional equation $\Gamma(1+\theta) = \theta\,\Gamma(\theta)$ (the factorial identity $\theta!/(\theta-1)!$ would only make sense for integer $\theta$).

Likelihood function (let $I := \{1,\dots, n\}$): \begin{align*} L(\theta \mid x_{i \in I}) &= \prod_{i \in I} \theta\cdot(1 - x_i)^{\theta-1} = \theta^n\prod_{i \in I}(1 - x_i)^{\theta-1} \\ \Rightarrow \log L(\theta \mid x_{i \in I})&= n \log\theta + (\theta - 1)\sum_{i \in I}\log(1-x_i) \longrightarrow \max \\ &\Rightarrow\frac{n}{\theta} + \sum_{i \in I}\log(1-x_i) \overset{!}{=} 0 \\ &\Rightarrow \hat{\theta}_{ML} = -\frac{n}{\sum_{i \in I}\log(1-x_i)} \end{align*}
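As a sanity check on the closed form, here is a small numerical sketch (assuming NumPy and SciPy; the variable names are mine): it draws a $Beta(1,\theta)$ sample, evaluates $\hat{\theta}_{ML} = -n/\sum_i \log(1-x_i)$, and compares it with a direct numerical maximization of the log-likelihood.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
theta_true = 3.0
n = 10_000
x = rng.beta(1.0, theta_true, size=n)  # Beta(1, theta) draws

# Closed-form MLE: theta_hat = -n / sum(log(1 - x_i))
theta_hat = -n / np.log1p(-x).sum()

# Numerical check: maximize the log-likelihood n*log(theta) + (theta-1)*sum(log(1-x_i))
def neg_loglik(theta):
    return -(n * np.log(theta) + (theta - 1.0) * np.log1p(-x).sum())

res = minimize_scalar(neg_loglik, bounds=(1e-6, 100.0), method="bounded")

print(theta_hat, res.x)  # the two estimates should agree and lie near theta_true
```

`np.log1p(-x)` computes $\log(1-x)$ with better accuracy near $x \approx 0$ than `np.log(1 - x)`.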

which is positive since $\log(1-x_i) < 0$ for all $x_i \in (0, 1)$.

Factorizing:

\begin{align*} L(\theta \mid x_{i \in I}) &= \prod_{i \in I} \theta\cdot(1 - x_i)^{\theta-1} = \theta^n\prod_{i \in I}(1 - x_i)^{\theta-1} = \theta^n\prod_{i \in I}(1 - x_i)^{\theta} \prod_{i \in I}(1 - x_i)^{-1} \\ \Rightarrow \log L(\theta \mid x_{i \in I}) &= n\log\theta + \theta\sum_{i \in I}\log(1 - x_i) - \sum_{i \in I}\log(1 - x_i) \end{align*}
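Writing $T(x) := \sum_{i \in I}\log(1-x_i)$, the pieces required by the factorization theorem can be made explicit:

```latex
% Factorization L(\theta \mid x) = g\!\left(T(x); \theta\right) \cdot h(x),
% with T(x) = \sum_{i \in I} \log(1 - x_i):
\begin{align*}
g(t; \theta) &= \theta^{n} e^{\theta t},        % depends on the data only through t = T(x) \\
h(x)         &= \prod_{i \in I}(1 - x_i)^{-1}
              = e^{-T(x)}.                      % free of \theta
\end{align*}
```

Here $g$ depends on the data only through $T(x)$, and $h$ is free of $\theta$, which is exactly what the theorem requires.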

My argument:

Thus by the factorization theorem, $T(x) = \sum_{i \in I}\log(1-x_i)$ is a sufficient statistic. Since $\hat{\theta}_{ML} = -n/T(x)$ is a one-to-one function of $T$, the MLE is itself sufficient. Moreover, the MLE is a function of every sufficient statistic (a standard result, used here without proof), so a sufficient MLE is minimal sufficient. Hence $\hat{\theta}_{ML}$ is a minimal sufficient statistic.
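To illustrate that the MLE depends on the sample only through $T(x)$, here is a tiny sketch (my own construction, not from Hogg): two different samples engineered to share the same value of $\prod_i(1-x_i)$, and hence the same $T$, yield the same MLE.

```python
import numpy as np

def theta_mle(x):
    """Closed-form MLE for Beta(1, theta): -n / sum(log(1 - x_i))."""
    x = np.asarray(x, dtype=float)
    return -len(x) / np.log1p(-x).sum()

# Two different samples chosen so that prod(1 - x_i) agrees:
# (1 - 0.5)(1 - 0.5) = 0.25 = (1 - 0.4) * (0.25 / 0.6)
x1 = [0.5, 0.5]
x2 = [0.4, 1.0 - 0.25 / 0.6]

print(theta_mle(x1), theta_mle(x2))  # equal up to floating-point rounding
```

Different data, same $T(x)$, same estimate, consistent with the MLE being a function of the sufficient statistic.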