Likelihood Estimator and Log-Likelihood for a Piecewise Function


Essentially, my understanding of likelihood estimators is very weak. I've spent a while trying to get my head around them, but the examples are all so different from each other that it's hard to find a clear method.

If someone could point me in the right direction for this question I'd very much appreciate it.

Question: Let $X$ be a random variable with parameter $\beta > 0$ and probability density function

$f_X(x)=\begin{cases} \dfrac{2^\beta\beta}{x^{\beta+1}}, & x\geq 2 \\ 0, & \text{otherwise} \end{cases}$

Let $x_1,x_2,...,x_n$ be a random sample from this distribution. Show that the log-likelihood for $\beta$ is

$ \ell(\beta,\textbf{x})=n\beta\log_e2+n\log_e\beta-(\beta+1)\sum\limits_{i=1}^n \log_ex_i $

and hence find the maximum likelihood estimator for $\beta$ and Fisher's information for $\beta$.
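Before doing the algebra, it can help to sanity-check the setup numerically: the density should integrate to 1 over $[2,\infty)$, and the log of the joint density of a sample should match the stated log-likelihood. A minimal sketch in plain Python (the sample values are made up purely for illustration):

```python
import math

def pdf(x, beta):
    # f_X(x) = 2**beta * beta / x**(beta + 1) for x >= 2, else 0
    return (2 ** beta) * beta / x ** (beta + 1) if x >= 2 else 0.0

beta = 1.7

# 1) The density should integrate to 1 over [2, inf).  Trapezoidal rule on
#    [2, 2000]; the truncated tail mass is (2/2000)**beta, which is tiny here.
n_steps = 200_000
a, b = 2.0, 2_000.0
h = (b - a) / n_steps
total = 0.5 * (pdf(a, beta) + pdf(b, beta)) * h
total += h * sum(pdf(a + i * h, beta) for i in range(1, n_steps))
print(abs(total - 1.0))  # small

# 2) The log of the joint density of a sample should equal the stated
#    log-likelihood  n*beta*log 2 + n*log beta - (beta + 1) * sum(log x_i).
xs = [2.3, 4.1, 2.9, 7.5, 3.3]  # illustrative sample, all x_i >= 2
n = len(xs)
log_joint = sum(math.log(pdf(x, beta)) for x in xs)
ell = (n * beta * math.log(2) + n * math.log(beta)
       - (beta + 1) * sum(math.log(x) for x in xs))
print(abs(log_joint - ell))  # ~0 up to floating point
```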

Best answer:

I just had to study this for an exam, and I think I can explain it to you the way I understood it. I am assuming you already understand why people are interested in estimators.

You've got a sample $x = (x_1, \cdots, x_n) $. Each $x_i$ comes from the same distribution, the one with density $f_X(x) $. If you change the parameter $\beta $, the function $f_X(x) $ starts giving different values for the same input, right? That makes sense: changing $f $'s formula ought to make it a different function.

You want to find the value $\beta_0$ of $\beta $ that is most likely to have produced your sample $x = (x_1, \cdots, x_n) $. That is, $\beta $ can take many values, but some of them are more likely to have produced your sample than others. Therefore, you want to find the value of $\beta$ that makes your sample as plausible as possible!

An analogy: suppose there is some distribution function for a random variable $Y $, $f_Y(t) = k $, for some parameter $k $ you are to estimate. Also, assume your sample is $y = (3, 3, 3, 3) $. Well, you could pick any value for $k $; any number will do; but given that sample $y $, it makes more sense to believe that $k = 3$, because then $f_Y(t) $ would actually produce your sample, than to believe that $k = -378$. If $k$ really were $-378$, how on earth would you have gotten that sample? Thus it is not very likely that $k = -378$. That is why you write $\ell(\beta \mid x) $: "find the most likely value of $\beta $ given that it must produce $x $".

So we want to maximize how plausible it is that an $n $-sized sample comes out as the one you got. Since $X $ is continuous, the probability of hitting any exact sample is zero, so instead we maximize the joint density of the sample, called the likelihood:

$$L(\beta; x) = \prod_{i=1}^n f_X(x_i) $$

where the product comes from the $x_i $ being independent. Now you rewrite those densities with the actual formula to yield

$$L(\beta; x) = \prod_{i=1}^n \frac{2^\beta\beta}{x_i^{\beta + 1}} $$

That product simplifies to

$$L(\beta; x) = \frac{\beta^n 2^{n\beta}}{\left(\prod_{i=1}^n x_i\right)^{\beta + 1}} $$
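That algebraic simplification is easy to spot-check numerically: the product of the individual density values should equal the closed form $\beta^n 2^{n\beta} / (\prod x_i)^{\beta+1}$. A quick sketch (sample values are made up):

```python
import math

beta = 1.7
xs = [2.3, 4.1, 2.9, 7.5, 3.3]  # illustrative sample, all x_i >= 2
n = len(xs)

# Product of densities, term by term
product_form = math.prod((2 ** beta) * beta / x ** (beta + 1) for x in xs)

# Simplified closed form: beta^n * 2^(n*beta) / (prod x_i)^(beta + 1)
simplified = (beta ** n) * (2 ** (n * beta)) / math.prod(xs) ** (beta + 1)

print(abs(product_form - simplified))  # ~0 up to floating point
```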

Now you can take the $\log$ of that: since $\log$ is strictly increasing, the value of $\beta$ that maximizes $\log L$ is the same one that maximizes $L$, and sums are much easier to differentiate than products.

Can you then manipulate the $\log $ to achieve the desired result?
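If you want to check your derivation afterwards: differentiating $\ell$ gives $\ell'(\beta) = n\log 2 + n/\beta - \sum \log x_i$, so the stationary point is $\hat\beta = n / (\sum \log x_i - n\log 2)$, and $-\ell''(\beta) = n/\beta^2$ is the Fisher information. A sketch that verifies the maximizer and the second derivative numerically (sample values are made up):

```python
import math

xs = [2.3, 4.1, 2.9, 7.5, 3.3]  # illustrative sample, all x_i >= 2
n = len(xs)
S = sum(math.log(x) for x in xs)

def ell(beta):
    # The log-likelihood from the question
    return n * beta * math.log(2) + n * math.log(beta) - (beta + 1) * S

# Solving ell'(beta) = n*log 2 + n/beta - S = 0 for beta
beta_hat = n / (S - n * math.log(2))

# beta_hat should beat any nearby candidate value of beta
for b in (0.5 * beta_hat, 0.9 * beta_hat, 1.1 * beta_hat, 2.0 * beta_hat):
    assert ell(beta_hat) >= ell(b)

# Fisher information: -ell''(beta) = n / beta**2, evaluated at beta_hat;
# compare against a central finite-difference second derivative.
fisher = n / beta_hat ** 2
h = 1e-5
num_second = (ell(beta_hat + h) - 2 * ell(beta_hat) + ell(beta_hat - h)) / h ** 2
print(beta_hat, fisher, abs(-num_second - fisher))
```

Note that $\sum \log x_i - n\log 2 = \sum \log(x_i/2) > 0$ whenever all $x_i > 2$, so $\hat\beta$ is always positive.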