Maximum likelihood estimate


I have a question regarding the correction of my exercise:

Exercise 6. Let $Y_1,\dots,Y_n$ be i.i.d. such that $Y_i$ equals $1$ with probability $p$ and $-1$ with probability $1-p$, for all $i\in[n]$.

(a) Find an estimator of $p$ using the method of moments.

(b) What is the MSE of your estimator in (a)?

Recall: the MSE of an estimator $\hat{\theta}$ for a parameter $\theta$ is $\mathbb{E}(\hat{\theta}-\theta)^2$.

(c) Find an estimator of $p$ using the maximum likelihood method.

Solution.

(a) (1 pt) Correct answer: $\hat{p}=\frac{\bar{y}+1}{2}$.

Since $\mathbb{E}[Y_i]=1\cdot p+(-1)(1-p)=2p-1$, the method of moments equates the sample mean $\bar{y}=\frac{1}{n}\sum_{i=1}^ny_i$ with $2p-1$; solving for $p$ gives $\hat{p}=\frac{\bar{y}+1}{2}$.
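As a quick sanity check, here is a small simulation of this estimator. Everything below (the seed, the true $p$, the sample size) is just an illustrative choice, not part of the exercise:

```python
import numpy as np

rng = np.random.default_rng(0)
p_true = 0.3   # hypothetical true parameter for the simulation
n = 10_000

# Simulate Y_i in {1, -1}: 1 with probability p, -1 with probability 1 - p.
y = rng.choice([1, -1], size=n, p=[p_true, 1 - p_true])

# Method-of-moments estimator: solve ybar = 2p - 1 for p.
p_hat = (y.mean() + 1) / 2
print(p_hat)  # should land close to p_true = 0.3
```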

(b) (1 pt) Correct answer: $\mathrm{MSE}=\frac{p(1-p)}{n}$.

$\mathrm{MSE}(\hat{p})=\operatorname{Var}(\hat{p})+\operatorname{bias}(\hat{p})^2$, with $\operatorname{bias}(\hat{p})^2=0$ and $\operatorname{Var}(\hat{p})=\frac{p(1-p)}{n}$.
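The formula $\mathrm{MSE}=\frac{p(1-p)}{n}$ can likewise be checked empirically by repeating the experiment many times; the trial count and parameters below are arbitrary choices for the sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
p_true, n, trials = 0.3, 50, 100_000  # hypothetical simulation settings

# Draw `trials` independent samples of size n and compute p_hat for each.
y = rng.choice([1, -1], size=(trials, n), p=[p_true, 1 - p_true])
p_hat = (y.mean(axis=1) + 1) / 2

# Compare the empirical MSE against the closed form p(1-p)/n.
mse_empirical = np.mean((p_hat - p_true) ** 2)
mse_theory = p_true * (1 - p_true) / n
print(mse_empirical, mse_theory)
```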

(c) (1 pt) Correct answer: $\hat{p}=\frac{\bar{y}+1}{2}$.

The log-likelihood is given by $l(p)=n\bar{y}\log(p)+\frac{1}{2}(n-n\bar{y})(\log(p)+\log(1-p))$. Solve $l'(p)=0$ and verify $l''(p)<0$.

I don't understand how to find the expression of $L(p)$. I have learned the following in class:

Let $y_1,\dots, y_n$ be a random sample from a density $f(y;\theta)$. The likelihood for $\theta$ is defined as $$L(\theta) = f(y_1,\dots, y_n; \theta) = f(y_1; \theta)\cdots f(y_n; \theta)$$ since the samples are independent.

But for the probability $p$ (aka $\theta$ here), I really don't see it.


I'll do it in a way I consider much easier than the provided solution (it is really the same argument, but the expressions are simpler). I'll try to be somewhat detailed so you can see what's going on.

We are given that

$$\mathbb{P}(Y_i=k)=\begin{cases}p, & k=1,\\ 1-p,&k=-1.\end{cases}$$

In particular, this means that

$$f_p(k)=\begin{cases}p, & k=1,\\ 1-p,&k=-1.\end{cases}$$

We suppose now that we are given a sample $y_1,\dots,y_n\in\{1,-1\}$. Let $m$ denote the number of these which equal $1$. Then we have that

$$L(p)=\prod_{j=1}^nf_p(y_j)=f_p(1)^m\,f_p(-1)^{n-m}=p^m(1-p)^{n-m}.$$

This gives the log-likelihood function as

$$\ell(p)=m\ln p+(n-m)\ln(1-p).$$

Differentiating we have

$$\ell'(p)=\frac{m}{p}-\frac{n-m}{1-p}=\frac{m-np}{p(1-p)},$$

which is zero precisely when $p=\frac{m}{n}$. Furthermore, as

$$\ell''\left(\frac{m}{n}\right)=-\frac{m}{\left(\frac{m}{n}\right)^2}-\frac{n-m}{\left(1-\frac{m}{n}\right)^2}<0,$$

this is the unique maximum of the function. In particular this means that the MLE is given by

$$\hat{p}=\frac{m}{n}.$$
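The calculus above can be confirmed numerically by evaluating $\ell(p)$ on a fine grid and locating its maximum; the counts $n$ and $m$ below are hypothetical:

```python
import numpy as np

# Hypothetical counts: m observations equal to 1 out of n total.
n, m = 40, 13

# Evaluate the log-likelihood l(p) = m log p + (n - m) log(1 - p) on a grid.
p_grid = np.linspace(0.001, 0.999, 9999)
loglik = m * np.log(p_grid) + (n - m) * np.log(1 - p_grid)

# The grid maximizer should sit at (approximately) the MLE m/n.
p_argmax = p_grid[np.argmax(loglik)]
print(p_argmax, m / n)
```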

To see that this is the same as the provided answer, simply notice that

$$\bar{y}=\frac{1}{n}\sum_{j=1}^ny_j=\frac{1}{n}\left(\sum_{\substack{j=1\\ y_j=1}}^ny_j+\sum_{\substack{j=1\\ y_j=-1}}^ny_j\right)=\frac{1}{n}(m-(n-m))=\frac{2m-n}{n},$$

and so

$$\frac{\bar{y}+1}{2}=\frac{\frac{2m-n}{n}+1}{2}=\frac{m}{n}=\hat{p}.$$