Estimators in the case of regression with normally distributed errors


How can it be shown that the Maximum Likelihood Estimator and the Least Squares Estimator are equivalent in the case of regression with normally distributed errors? Any help will be appreciated! Thanks in advance!


Suppose that we have some observations $y_{1},y_{2},\dots,y_{n}$ and the model $$y_{i}=x_{i}^{T}\beta+\xi_{i}$$ where the $\xi_{i}$ are i.i.d. $\mathcal{N}(0,\sigma^{2})$ noise terms, $\beta\in\mathbb{R}^{p}$ and $x_{i}\in\mathbb{R}^{p}$. This can be rewritten as $y=X^{T}\beta+\xi$, where the columns of $X$ are $x_{1},\dots,x_{n}$ and $\xi$ is a vector of noise. Now, the likelihood of $y$ given $\beta$ is $$P(y;\beta)=\prod_{i=1}^n P(y_i;\beta)=\prod_{i=1}^{n}\left(\frac{1}{\sqrt{2\pi}\sigma}\exp\left(-\frac{(y_{i}-x_{i}^{T}\beta)^{2}}{2\sigma^{2}}\right)\right)=\frac{1}{(2\pi)^{n/2}\sigma^{n}}\exp\left(-\frac{\|y-X^{T}\beta\|_{2}^{2}}{2\sigma^{2}}\right).$$ Since the exponent on the right-hand side is $-\|y-X^{T}\beta\|_{2}^{2}/(2\sigma^{2})$, where $\|y-X^{T}\beta\|_{2}^{2}$ is exactly the sum of squared errors, the likelihood is maximized precisely when that quantity is minimized. Consequently, the maximum likelihood estimate of $\beta$ coincides with the least squares estimate.
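As a quick numerical sanity check of the argument above, here is a sketch in Python (using NumPy and SciPy, both assumed available) that fits $\beta$ once by least squares and once by directly minimizing the negative log-likelihood; the two estimates agree up to optimizer tolerance. Note the design matrix below stores the $x_i^T$ as rows, so the model reads $y = X\beta + \xi$ rather than $y = X^T\beta + \xi$.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulate y_i = x_i^T beta + xi_i with i.i.d. N(0, sigma^2) noise.
n, p = 200, 3
beta_true = np.array([1.5, -2.0, 0.5])
X = rng.normal(size=(n, p))  # rows are the x_i^T
sigma = 0.3
y = X @ beta_true + rng.normal(scale=sigma, size=n)

# Least squares estimator: argmin_beta ||y - X beta||_2^2.
beta_ls, *_ = np.linalg.lstsq(X, y, rcond=None)

# Maximum likelihood estimator: minimize the negative log-likelihood.
# The value of sigma only shifts/scales the objective, so it does not
# change the argmin over beta.
def neg_log_lik(beta):
    r = y - X @ beta
    return n * np.log(np.sqrt(2 * np.pi) * sigma) + r @ r / (2 * sigma**2)

beta_mle = minimize(neg_log_lik, np.zeros(p)).x

print(np.max(np.abs(beta_ls - beta_mle)))  # tiny: the two estimators agree
```

Any gradient-based optimizer works here because the negative log-likelihood is a convex quadratic in $\beta$; in practice one would just solve the normal equations, but the point is that both formulations pick out the same minimizer.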