Find an estimator of $\theta$ using method of moments and maximum likelihood


A random sample $Y_1, Y_2, \ldots , Y_n$ is drawn from a population that is distributed according to the density function below, where the parameter $\theta \gt 0.$

$f(y)= \frac{1}{2\theta}e^{\frac{-\lvert y\rvert}{\theta}}$ for $-\infty \lt y \lt \infty$

(a) Approximate the probability $P(\sum_{i=1}^n Y_i \gt \sqrt{n})$ for $\theta=4$, assuming that $n$ is sufficiently large.

(b) Find an estimator of $\theta$ by the method of moments.

(c) Find an estimator of $\theta$ by the method of maximum likelihood.

(d) Find an unbiased estimator of $\theta$ and verify the unbiasedness. Is it consistent? Verify your answer.

What I have tried so far:

I mostly need help with (c) and (d), but I will post all of the work I have so far.

(a) First I wanted to set up the equation for $\bar{Y}$, then find the expected value and variance of $Y$, apply the CLT, and then make $\bar{Y}$ standard normal.

$P(\sum_{i=1}^nY_i \gt \sqrt{n})$ = $P(n\bar{Y} \gt \sqrt{n})$ = $P(\bar{Y} \gt \frac{1}{\sqrt{n}})$

Now I wanted to find $E(Y)$ and $V(Y)$ so I could apply the CLT.

$E(Y) = \int_{-\infty}^0 \frac{1}{8}ye^{\frac{y}{4}}\,dy + \int_0^{\infty}\frac{1}{8}ye^{\frac{-y}{4}}\,dy = 0$

$E(Y^2) = \int_{-\infty}^0 \frac{1}{8}y^2e^{\frac{y}{4}}\,dy + \int_0^{\infty}\frac{1}{8}y^2e^{\frac{-y}{4}}\,dy = 32$

$V(Y) = E(Y^2) - [E(Y)]^2 = 32$
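As a sanity check on these moment calculations, the integrals can be evaluated numerically. This is just a sketch using SciPy (an assumption on my part, not part of the problem), with $\theta = 4$ plugged in:

```python
# Numerical check of E(Y) = 0 and E(Y^2) = 32 for theta = 4 (a sketch).
import numpy as np
from scipy.integrate import quad

theta = 4.0
pdf = lambda y: np.exp(-abs(y) / theta) / (2 * theta)  # the given density

# Split each integral at 0, matching the piecewise work above.
EY = quad(lambda y: y * pdf(y), -np.inf, 0)[0] + quad(lambda y: y * pdf(y), 0, np.inf)[0]
EY2 = quad(lambda y: y**2 * pdf(y), -np.inf, 0)[0] + quad(lambda y: y**2 * pdf(y), 0, np.inf)[0]

print(EY, EY2)  # expect approximately 0 and 32
```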

So $Y$ follows the population distribution with $\mu=0$ and $\sigma^2 = 32$,

and $\bar{Y}$ has mean $\mu=0$ and variance $\frac{32}{n}$.

So by the CLT, $\bar{Y}$ is approximately normal for large $n$.

$P\left(Z \gt \frac{\frac{1}{\sqrt{n}}-0}{\sqrt{\frac{32}{n}}}\right) = P\left(Z \gt \frac{1}{\sqrt{32}}\right) = 1 - P\left(Z \le \frac{1}{\sqrt{32}}\right) \approx 0.4298$
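To convince myself the normal approximation is reasonable, I also ran a quick Monte Carlo sketch (NumPy and the choices $n = 200$, $200{,}000$ replications are my own assumptions; the density here is a Laplace distribution with scale $\theta$):

```python
# Monte Carlo check of P(sum Y_i > sqrt(n)) for theta = 4 (a sketch).
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 4.0, 200, 200_000

# The given density is Laplace(location 0, scale theta).
samples = rng.laplace(loc=0.0, scale=theta, size=(reps, n))
prob = np.mean(samples.sum(axis=1) > np.sqrt(n))
print(prob)  # should be close to 1 - Phi(1/sqrt(32)) ≈ 0.43
```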

(b) For (b), I found the moments and solved for the parameter. The first moment is $0$, so I had to use the second moment instead. I ended up with:

$\mu_2^{'} = E(Y^2) = \int_{-\infty}^0 \frac{1}{2\theta}y^2e^{\frac{y}{\theta}}\,dy + \int_0^{\infty}\frac{1}{2\theta}y^2e^{\frac{-y}{\theta}}\,dy = 2\theta^2$

So $\theta=\sqrt{\frac{\mu_2^{'}}{2}}$ and $\hat{\theta}=\sqrt{\frac{m_2^{'}}{2}}$
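A quick simulation sketch of this estimator (NumPy and the values $\theta = 4$, $n = 5000$ are my own assumptions for the check):

```python
# Simulation check of the method-of-moments estimator theta_hat = sqrt(m2'/2).
import numpy as np

rng = np.random.default_rng(1)
theta, n = 4.0, 5000
y = rng.laplace(loc=0.0, scale=theta, size=n)  # the given density is Laplace(0, theta)

m2 = np.mean(y**2)           # sample second moment m2'
theta_mom = np.sqrt(m2 / 2)  # plug into theta = sqrt(mu2'/2)
print(theta_mom)             # should land near the true theta = 4
```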

(c) This is where I started to run into trouble. I understand the process behind maximum likelihood, but I'm not entirely certain how to apply it with the absolute value. I was considering two different approaches: either splitting the density into two pieces, or keeping the absolute value and dealing with it later.

So either:

$f(y)= \begin{cases} \frac{1}{2\theta}e^{\frac{y}{\theta}} & -\infty \lt y \lt 0 \\ \frac{1}{2\theta}e^{\frac{-y}{\theta}} & 0 \le y \lt \infty \end{cases}$

Or just follow the route of $f(y_1)f(y_2)\cdots f(y_n) = \left(\frac{1}{2\theta}\right)^n e^{-\frac{1}{\theta}\sum_{i=1}^n \lvert y_i\rvert}$, then take the natural log, differentiate with respect to $\theta$, set the derivative equal to $0$, and solve for the maximizer?
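To check whichever route I take, I sketched the second one numerically: maximize the log-likelihood directly with SciPy (SciPy, the seed, and $\theta = 4$ as the true value are all my own assumptions; this only verifies the setup, not a closed form):

```python
# Numerically maximize the log-likelihood L(theta) = (1/(2 theta))^n * exp(-sum|y_i|/theta).
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)
y = rng.laplace(loc=0.0, scale=4.0, size=5000)  # sample from the given density

def neg_log_lik(theta):
    # -ln L(theta) = n*ln(2*theta) + (1/theta) * sum |y_i|
    return len(y) * np.log(2 * theta) + np.sum(np.abs(y)) / theta

res = minimize_scalar(neg_log_lik, bounds=(1e-6, 100), method="bounded")
print(res.x)  # the numerical maximizer; compare against the calculus answer
```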

(d) I am also running into problems here. I want an unbiased estimator of $\theta$. Since I already found $\mu=0$, I know $\bar{Y}$ won't work: it estimates $0$, not $\theta$. I'm not sure how to adjust things to get an unbiased estimator of $\theta$.

I feel somewhat confident on (a) and (b), but I'm stuck on (c) and (d). Any direction would be greatly appreciated.