Maximum likelihood estimate - help with calculating logs!


I'm doing a practice question finding the maximum likelihood estimate, but I'm having a bit of trouble with the actual 'pure' maths bit of it (the differentiation).

I don't understand how you go from equation $(1)$ to $(3)$.

If anyone could help, I'd really appreciate it.


There are 2 answers below.

BEST ANSWER

To get from $(1)$ to $(2)$, you need to know that $\dfrac{\partial}{\partial\sigma} \log\sigma = \dfrac 1 \sigma$ and $\dfrac{\partial}{\partial\sigma} \, \dfrac{-1}{2\sigma^2} = \dfrac{-1}{2}\,\dfrac{\partial}{\partial\sigma} \sigma^{-2} = \dfrac{-1}{2}(-2)\sigma^{-3} = \dfrac{1}{\sigma^3}$.
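If it helps to convince yourself, here is a quick numerical sanity check of those two derivatives using a central finite difference (the helper `deriv` and the test point `sigma = 1.7` are just illustrative choices, not part of the question):

```python
import math

def deriv(f, x, h=1e-6):
    """Central finite-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

sigma = 1.7  # arbitrary test point, sigma > 0

# d/dsigma log(sigma) should equal 1/sigma
assert abs(deriv(math.log, sigma) - 1 / sigma) < 1e-6

# d/dsigma of -1/(2 sigma^2) should equal 1/sigma^3
assert abs(deriv(lambda s: -1 / (2 * s**2), sigma) - 1 / sigma**3) < 1e-6
```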

To get from $(1)$ to $(3)$: \begin{align} & \frac{\partial}{\partial\theta} \left( -n\log\sigma - \frac n 2 \log(2\pi) - \frac{1}{2\sigma^2} \sum_{i=1}^n (y_i - g(x_i))^2 \right) \\[10pt] = {} & 0 + 0 - \frac{1}{2\sigma^2} \sum_{i=1}^n \frac{\partial}{\partial\theta} (y_i - g(x_i))^2. \end{align}

Then $$ \frac{\partial}{\partial\theta} (y_i-g(x_i))^2 = 2(y_i-g(x_i))\frac{\partial}{\partial\theta}(y_i - g(x_i)). $$

Then we have $$ \frac{\partial}{\partial\theta} y_i - \frac{\partial}{\partial\theta} g(x_i) = 0 - \frac{\partial}{\partial\theta} g(x_i). $$

That is what was done. One can only presume that the reader was assumed to already know that $g(x_i)$ depends on $\theta$ while $y_i$ does not.

SECOND ANSWER

If we differentiate the RHS of equation $(1)$ with respect to $\theta$, then the first two terms are zero. Inside the sum for the last term, we have by the chain rule, $\frac{\partial (y_i - g(x_i))^2}{\partial \theta} = -2(y_i - g(x_i)) \frac{\partial g(x_i)}{\partial \theta}$. (Here, I used the fact that $g$ is written as a function of $\theta$, as it says in your explanation.)

Hence, this gives $-\frac{1}{2\sigma^2} \sum_{i=1}^n -2(y_i - g(x_i)) \frac{\partial g(x_i)}{\partial \theta} = \frac{1}{\sigma^2} \sum_{i=1}^n (y_i - g(x_i)) \frac{\partial g(x_i)}{\partial \theta}$.
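You can also check this gradient numerically. The sketch below assumes a concrete (hypothetical) model $g(x;\theta) = \theta x$, so $\frac{\partial g}{\partial\theta} = x$; the data points and parameter values are made up purely for the check:

```python
import math

# Hypothetical example model: g(x; theta) = theta * x, so dg/dtheta = x.
xs = [1.0, 2.0, 3.0]
ys = [2.1, 3.9, 6.2]
sigma, theta = 1.3, 1.8
n = len(xs)

def log_lik(th):
    """Log-likelihood from equation (1) for this example model."""
    ss = sum((y - th * x) ** 2 for x, y in zip(xs, ys))
    return -n * math.log(sigma) - (n / 2) * math.log(2 * math.pi) - ss / (2 * sigma**2)

# Closed-form gradient: (1/sigma^2) * sum (y_i - g(x_i)) * dg/dtheta
grad = sum((y - theta * x) * x for x, y in zip(xs, ys)) / sigma**2

# Central finite-difference approximation of the same derivative
h = 1e-6
numeric = (log_lik(theta + h) - log_lik(theta - h)) / (2 * h)

assert abs(grad - numeric) < 1e-5
```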