Why minimizing mean-square error maximizes Gaussian likelihood


Can someone help me to understand this slide?

Minimizing mean-square error maximizes Gaussian likelihood:

$$f(x \mid \theta) = \frac{1}{\sqrt{2\pi\sigma^2}} \, e^{-\frac{(x-\mu)^2}{2\sigma^2}}$$

$$\log f(x \mid \theta) = C_1 - \frac{(x-\mu)^2}{C_2}$$

$$\arg\max_\theta f(x \mid \theta) = \arg\min_\theta \,(x-\mu)^2$$

I am particularly confused about $C_1$ and $C_2$. What are those?



BEST ANSWER

Take the given expression for $f(x \mid \theta)$ and take the logarithm of both sides. You will end up with an expression of the form $C_1 - \frac{(x-\mu)^2}{C_2}$, where you can write down $C_1$ and $C_2$ explicitly. Since they are constants that depend only on $\sigma^2$ (not on $x$ or $\mu$), their exact values do not matter for the likelihood-maximization problem.
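
To spell out that step, taking logs of the density gives

$$\log f(x \mid \theta) = -\tfrac{1}{2}\log(2\pi\sigma^2) - \frac{(x-\mu)^2}{2\sigma^2},$$

so $C_1 = -\tfrac{1}{2}\log(2\pi\sigma^2)$ and $C_2 = 2\sigma^2$. Since $\log$ is monotone increasing, maximizing $f(x \mid \theta)$ over $\theta$ (here $\theta = \mu$, with $\sigma^2$ held fixed) is the same as maximizing $C_1 - \frac{(x-\mu)^2}{C_2}$, and with $C_1$ and $C_2$ constant this is the same as minimizing $(x-\mu)^2$.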
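
If it helps, here is a minimal numerical sanity check (my own sketch, not from the slide or the answer; the variance and observation values are made up): over a grid of candidate means $\mu$, the value that maximizes the Gaussian log-likelihood is exactly the value that minimizes the squared error.

```python
import numpy as np

sigma2 = 2.0                      # fixed variance (hypothetical value)
x = 1.7                           # a single observation (hypothetical value)
mus = np.linspace(-5, 5, 10001)   # grid of candidate means mu (the theta here)

# Gaussian log-likelihood: C1 - (x - mu)^2 / C2,
# with C1 = -0.5*log(2*pi*sigma2) and C2 = 2*sigma2.
log_lik = -0.5 * np.log(2 * np.pi * sigma2) - (x - mus) ** 2 / (2 * sigma2)
sq_err = (x - mus) ** 2           # squared error for the single observation

# Both criteria select the same mu on the grid (here, mu = x).
assert np.argmax(log_lik) == np.argmin(sq_err)
print(mus[np.argmax(log_lik)])    # ~1.7
```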