Taken from a computer vision book: "to minimize the sum of the perpendicular distances between points and a line, we need to minimize $$ \sum_i (ax_i + by_i +c)^2$$ subject to $a^2 +b^2 =1$. Now using a Lagrange multiplier $\lambda$, we have a solution if $$ \left( \begin{array}{ccc} \overline{x^2} & \overline{xy} & \overline{x} \\ \overline{xy} & \overline{y^2} & \overline{y} \\ \overline{x} & \overline{y} & 1 \end{array} \right)\left[ \begin{array}{c} a \\ b \\ c \end{array} \right] = \lambda \left[ \begin{array}{c} 2a\\ 2b \\ 0 \end{array} \right]"$$
How is the book getting these matrices?
Also, the notation is that $\overline{u} = \frac{\sum_i u_i}{k}$. (Yeah, I don't know what $k$ stands for; I can only assume this is an average over the $k$ data points.)
It goes on to say that $c = -a\overline{x} - b\overline{y}$, and that we can substitute this back to get the eigenvalue problem $$\left[ \begin{array}{cc} \overline{x^2} -\overline{x}\,\overline{x} & \overline{xy} - \overline{x}\,\overline{y}\\ \overline{xy} - \overline{x}\,\overline{y} & \overline{y^2} - \overline{y}\,\overline{y} \end{array} \right] \left[\begin{array}{c} a\\ b \end{array} \right] = \mu \left[ \begin{array}{c} a\\ b \end{array} \right].$$
I don't see what they substituted into what, or how this result is derived.
Take the function $$ F(a,b,c,\lambda)=\sum_i (ax_i+by_i+c)^2-\lambda(a^2+b^2-1) $$ so that \begin{align} \frac{\partial F}{\partial a}&=2\sum_i (ax_i+by_i+c)x_i-2\lambda a=0\\ \frac{\partial F}{\partial b}&=2\sum_i (ax_i+by_i+c)y_i-2\lambda b=0\\ \frac{\partial F}{\partial c}&=2\sum_i (ax_i+by_i+c)=0\\ \frac{\partial F}{\partial \lambda}&=-(a^2+b^2-1)=0 \end{align} or better \begin{align} &a\sum_i x_i^2 +b\sum_i x_iy_i+c\sum_i x_i=\lambda a\\ &a\sum_i x_iy_i+b\sum_i y_i^2 +c\sum_i y_i=\lambda b\\ &a\sum_i x_i +b\sum_i y_i +nc=0\\ &a^2+b^2=1 \end{align}

Now divide the first three equations by $n$, the number of points (this is the book's $k$): each sum becomes an average $\overline{\,\cdot\,}$, and with $\mu=\lambda/n$ you get the book's $3\times 3$ system. The third equation becomes $a\overline{x}+b\overline{y}+c=0$, i.e. $c=-a\overline{x}-b\overline{y}$. Substituting this into the first two equations eliminates $c$: \begin{align} a\left(\overline{x^2}-\overline{x}\,\overline{x}\right)+b\left(\overline{xy}-\overline{x}\,\overline{y}\right)&=\mu a\\ a\left(\overline{xy}-\overline{x}\,\overline{y}\right)+b\left(\overline{y^2}-\overline{y}\,\overline{y}\right)&=\mu b \end{align} which is exactly the stated $2\times 2$ eigenvalue problem: $(a,b)$ is an eigenvector of the centered second-moment (covariance) matrix of the data, and you want the eigenvector of the smaller eigenvalue, since $\mu$ equals the average squared residual being minimized.
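A quick numerical sanity check of this eigenvalue problem (the synthetic data and variable names here are my own, not from the book): scatter points around the line $x+y=1$, build the centered second-moment matrix, and take the eigenvector of the smallest eigenvalue as $(a,b)$.

```python
import numpy as np

# Synthetic points near the line x + y = 1, i.e. (a, b, c) proportional to (1, 1, -1)
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 200)
y = 1 - x + rng.normal(0, 0.01, 200)

# Centered second-moment (covariance) matrix from the derivation
M = np.array([
    [np.mean(x**2) - np.mean(x)**2,          np.mean(x*y) - np.mean(x)*np.mean(y)],
    [np.mean(x*y) - np.mean(x)*np.mean(y),   np.mean(y**2) - np.mean(y)**2],
])

# (a, b) is the eigenvector of the SMALLEST eigenvalue (eigh sorts ascending);
# it minimizes the mean squared residual mu.
mu, V = np.linalg.eigh(M)
a, b = V[:, 0]
c = -a * np.mean(x) - b * np.mean(y)   # from the third normal equation

print(a / b, c / b)   # ratios should be ≈ 1 and ≈ -1, matching x + y - 1 = 0
```

The ratios $a/b$ and $c/b$ are reported because the eigenvector is only determined up to sign and scale; the constraint $a^2+b^2=1$ fixes the scale but not the sign.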