Using Log Likelihood to Find Sufficient Statistic


I've been given the following problem from Wackerly's Mathematical Statistics with Applications (exercise 9.60). I'm aware of one way to find the solution, but I'd like to know whether this approach works as well.

Let $Y_1, \dots, Y_n$ be a random sample from

$$f(y \mid \theta) = \begin{cases} \theta y^{\theta - 1}, & 0 < y < 1,\\ 0, & \text{otherwise}, \end{cases} \qquad \theta > 1.$$

Show that $$\sum^n_{i=1} -\log(y_i)$$ is sufficient for $\theta$.

The second part is showing that it is a uniformly minimum-variance unbiased estimator, but I'm not really worried about that part. Here is my reasoning so far.

$$L(\theta) = \prod^n_{i=1}\theta y_i^{\theta -1} = \theta^n\prod_{i=1}^n y_i^{\theta - 1}$$ At this point, I can see that maybe I could use the log-likelihood function instead of the usual likelihood function. I'm just not sure whether that's allowed.
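(As a quick numerical sanity check that the product simplifies as claimed and that taking logs loses nothing, here is a short NumPy sketch. The function names and the inverse-CDF sampling step are my own; since $F(y) = y^\theta$ on $(0,1)$, one can draw $Y = U^{1/\theta}$ with $U$ uniform.)

```python
import numpy as np

def likelihood(theta, y):
    # L(theta) = theta^n * prod(y_i^(theta - 1))
    return theta ** len(y) * np.prod(y ** (theta - 1))

def log_likelihood(theta, y):
    # l(theta) = n*log(theta) + (theta - 1) * sum(log(y_i))
    return len(y) * np.log(theta) + (theta - 1) * np.sum(np.log(y))

rng = np.random.default_rng(42)
y = rng.uniform(size=8) ** (1 / 2.0)  # inverse-CDF draw from f with theta = 2

# log of the product form agrees with the summed log-likelihood
assert np.isclose(np.log(likelihood(3.0, y)), log_likelihood(3.0, y))
```

Since $\log$ is strictly increasing, the two forms carry exactly the same information about $\theta$.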

By the factorization theorem, $T(y)$ is sufficient if the likelihood can be written as $g(T(y), \theta)\,h(y)$, where $h(y)$ does not depend on $\theta$; taking logs, the log-likelihood would be expressed as $\log g(T(y), \theta) + \log h(y)$. So I have $$l(\theta) = n\log(\theta) +\theta\sum^n_{i=1}\log(y_i) - \sum^n_{i=1}\log(y_i)$$ $$=n\log(\theta) +(1-\theta)\big(-\sum^n_{i=1}\log(y_i)\big)$$ Since $h(y)$ may not involve $\theta$, I can take $h(y) = 1$ (so $\log h(y) = 0$) and let the entire expression be $\log g(T(y), \theta) = n\log(\theta) + (1-\theta)\big(-\sum^n_{i=1}\log(y_i)\big)$, where $T(y) = -\sum^n_{i=1}\log(y_i)$. The key point is that $\theta$ interacts with the data only through $T(y)$.
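(To convince myself that the likelihood really depends on the data only through $T(y) = -\sum\log(y_i)$, a small NumPy sketch: two different samples engineered to share the same $T$ give identical log-likelihoods at every $\theta$. The second, constant sample is my own construction, not part of the problem.)

```python
import numpy as np

# l(theta) = n*log(theta) + (1 - theta) * T(y), with T(y) = -sum(log(y_i))
def log_likelihood(theta, y):
    return len(y) * np.log(theta) + (1 - theta) * (-np.sum(np.log(y)))

rng = np.random.default_rng(0)
n = 6
y1 = rng.uniform(size=n) ** (1 / 3.0)  # inverse-CDF sample with theta = 3
T = -np.sum(np.log(y1))

# A different sample built to have the same statistic T:
# -sum(log(y2)) = -n * (-T/n) = T
y2 = np.full(n, np.exp(-T / n))
assert np.isclose(-np.sum(np.log(y2)), T)

# Same T means the same log-likelihood at every theta
for theta in [1.5, 2.0, 5.0]:
    assert np.isclose(log_likelihood(theta, y1), log_likelihood(theta, y2))
```

This is exactly what the factorization says: once $T(y)$ is known, the individual $y_i$ contribute nothing further about $\theta$.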

Basically, what I'm asking is: is this argument legitimate?