calculate expectation of MLE


This is a question about whether $\hat{\theta}_{MLE}$ is an unbiased estimator of $\theta$.

We observe $n$ independent pairs $(X_{1},Y_{1}), (X_{2},Y_{2}),\dots,(X_{n},Y_{n})$, $n\geq 3$,

where $Y_{i}=\theta X_{i}+\epsilon _{i}$, $i=1,2,\dots,n$, with $\theta \in \mathbb{R}$,

and the $X_{i}$'s and $\epsilon _{i}$'s are iid random variables with $N(0,1)$ distribution.

My thought:

The likelihood function is $L(x,y;\theta )=(2\pi )^{-n}e^{-\frac{1}{2}\sum x_{i}^{2}-\frac{1}{2}\sum {(y_{i}-\theta x_{i})}^{2}}$

and the log-likelihood function is $l(x,y;\theta )=-n\ln(2\pi ) -\frac{1}{2}\sum x_{i}^{2}-\frac{1}{2}\sum {(y_{i}-\theta x_{i})}^{2}$.
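Filling in the maximization step: differentiating the log-likelihood with respect to $\theta$ and setting the derivative to zero gives

$$\frac{\partial l}{\partial \theta}=\sum_{i=1}^{n}x_{i}(y_{i}-\theta x_{i})=0 \quad\Longrightarrow\quad \hat{\theta}_{MLE}=\frac{\sum_{i=1}^{n}x_{i}y_{i}}{\sum_{i=1}^{n}x_{i}^{2}},$$

and since $\frac{\partial^{2} l}{\partial \theta^{2}}=-\sum_{i=1}^{n}x_{i}^{2}<0$, this critical point is indeed a maximum.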

$\hat{\theta}_{MLE}=\frac{\sum x_{i}y_{i}}{\sum x_{i}^{2}}$. I think this MLE is correct.

To calculate its expected value, I first have MLE simplified as:

$\hat{\theta}_{MLE}=\frac{\sum x_{i}y_{i}}{\sum x_{i}^{2}}=\frac{\sum x_{i}(\theta x_{i}+\epsilon _{i})}{\sum x_{i}^{2}}=\theta +\frac{\sum x_{i}\epsilon _{i}}{\sum x_{i}^{2}}$

$E(\hat{\theta}_{MLE})=E\left(\frac{\sum x_{i}y_{i}}{\sum x_{i}^{2}}\right)=\theta +E\left(\frac{\sum x_{i}\epsilon _{i}}{\sum x_{i}^{2}}\right)$.

This is where I got stuck; I have no clue how to continue.

Thanks for helping me out.

Best answer:

Since $\epsilon_i$ is independent of the family $(x_j)$ and $E(\epsilon_i)=0$, $$E\left(\frac{x_{i}\epsilon _{i}}{\sum\limits_j x_j^{2}}\right)=E\left(\frac{x_{i}}{\sum\limits_j x_{j}^{2}}\right)\cdot E(\epsilon_i)=0.$$ Summing this over $i$ yields $E(\hat{\theta}_{MLE})=\theta$, i.e. the MLE is unbiased. (The first factor is finite because $\left|x_{i}\right|/\sum_j x_{j}^{2}\le 1/\sqrt{\sum_j x_{j}^{2}}$ and $\sum_j x_{j}^{2}\sim\chi^{2}_{n}$; the assumption $n\ge 3$ further guarantees finite variance, since $\operatorname{Var}(\hat{\theta}_{MLE})=E\left(1/\sum_j x_{j}^{2}\right)=\frac{1}{n-2}$.)
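Not part of the original answer, but a quick Monte Carlo sanity check (a NumPy sketch; the variable names and parameter choices here are my own) supports the conclusion that $E(\hat{\theta}_{MLE})=\theta$:

```python
import numpy as np

# Monte Carlo check of E(theta_hat) = theta for the model
#   y_i = theta * x_i + eps_i,   x_i, eps_i iid N(0, 1),
# with theta_hat = sum(x_i y_i) / sum(x_i^2) over many replications.
rng = np.random.default_rng(0)
theta, n, reps = 2.0, 5, 200_000  # true theta, sample size (n >= 3), replications

x = rng.standard_normal((reps, n))
eps = rng.standard_normal((reps, n))
y = theta * x + eps

theta_hat = (x * y).sum(axis=1) / (x ** 2).sum(axis=1)
print(theta_hat.mean())  # should be close to theta = 2.0
```

With $n=5$, the estimator's variance is $1/(n-2)=1/3$, so the average over $2\times 10^5$ replications has standard error roughly $0.0013$; the printed mean should agree with $\theta=2.0$ to two decimal places.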