I recently posted a question about finding the partial derivatives of a function $L[\phi]$. This was Problem 2.1 from Simon Prince's "Understanding Deep Learning". However, my motivation for doing so was actually because I was confused about the next problem:
> Show that we can find the minimum of $L[\phi]$ in closed form by setting the expression for the derivatives of $L[\phi]$ to zero and solving for $\phi_0$ and $\phi_1$.
$L[\phi] = \sum_{i=1}^I (\phi_0 + \phi_1x_i - y_i)^2$
$\frac{\partial L}{\partial\phi_0} = 2 \sum_{i=1}^I (\phi_0 + \phi_1x_i - y_i)$
$\frac{\partial L}{\partial\phi_1} = 2 \sum_{i=1}^I (\phi_0 + \phi_1x_i - y_i)x_i$
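(For context, my understanding is that after dividing by $2$ and splitting the sums, the two zero-derivative conditions can be rewritten as a linear system in $\phi_0$ and $\phi_1$:)

$\phi_0 I + \phi_1 \sum_{i=1}^I x_i = \sum_{i=1}^I y_i$

$\phi_0 \sum_{i=1}^I x_i + \phi_1 \sum_{i=1}^I x_i^2 = \sum_{i=1}^I x_i y_i$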
I set the two expressions equal to each other and solved for $\phi_0$ and $\phi_1$, which led me to the following:
$\phi_0 = \sum_{i=1}^I \frac{\phi_1x_i^2 -y_ix_i +y_i}{x_i +1}$
$\phi_1 = \sum_{i=1}^I \frac{\phi_0 x_i - y_ix_i + y_i - \phi_0}{x_i^2 + x_i}$
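To check whatever closed form I end up with, I also put together a small numerical comparison against numpy's least-squares solver (the toy data below is made up just for this check):

```python
import numpy as np

# Toy data, made up for illustration: exactly y = 1 + 2x.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])

# Design matrix with a column of ones for phi_0 and a column of x for phi_1.
A = np.stack([np.ones_like(x), x], axis=1)

# Minimizes sum_i (phi_0 + phi_1 * x_i - y_i)^2, the same L[phi] as above.
phi, *_ = np.linalg.lstsq(A, y, rcond=None)
print(phi)  # expect approximately [1.0, 2.0]
```

If a candidate formula for $\phi_0$ and $\phi_1$ is right, plugging the same $x_i$ and $y_i$ into it should reproduce these values.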
This doesn't feel correct; can anyone advise? Thanks.