What is the optimal solution for $\beta_1$ and $\beta_0$ in the following normal equation:
$$\beta _{ 1 }\sum _{ i=1 }^{ n }{ { x }_{ i } } +n\beta _{ 0 }=\sum _{ i=1 }^{ n }{ { y }_{ i } } $$
EDIT
Suppose you are given a set of data points $(x_i,y_i)$ that we want to fit with a line $y = \beta_1x+\beta_0$, where $\beta_1,\beta_0 \in \Bbb R$ are the parameters we want to determine.
A good criterion for choosing the parameter values is to find $\beta_1$ and $\beta_0$ such that the residual sum of squares is minimized. In other words, we find the
values $\beta_0$, $\beta_1$ that minimize:
$$\sum _{ i=1 }^{ n }{ ({ \beta }_{ 1 }{ x }_{ i }+{ \beta }_{ 0 }-y_{ i })^{ 2 } } $$ which I wrote as $$\left \|\ A\begin{pmatrix} \beta _{ 1 } \\ \beta _{ 0 } \end{pmatrix}-b\ \right \|^2$$ with $A=\begin{pmatrix} { x }_{ 1 } & 1 \\ { x }_{ 2 } & 1 \\ \vdots & \vdots \\ { x }_{ n } & 1 \end{pmatrix}$ and $b = \begin{pmatrix} y_{ 1 } \\ { y }_{ 2 } \\ \vdots \\ y_{ n } \end{pmatrix}$
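As a quick sanity check of this formulation, here is a minimal numerical sketch (assuming NumPy and made-up sample data) that builds $A$ and $b$ exactly as above and minimizes $\|A\beta - b\|^2$:

```python
import numpy as np

# Hypothetical sample data (x_i, y_i); chosen to lie exactly on y = 2x + 1.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])

# Build A and b as in the text: each row of A is (x_i, 1).
A = np.column_stack([x, np.ones_like(x)])
b = y

# lstsq minimizes ||A @ beta - b||^2 and returns (beta_1, beta_0).
beta, *_ = np.linalg.lstsq(A, b, rcond=None)
beta_1, beta_0 = beta
print(beta_1, beta_0)  # -> 2.0 1.0 for this exact-fit data
```

Because the sample points lie exactly on a line, the least-squares fit recovers the slope and intercept exactly; with noisy data it returns the best fit in the squared-error sense.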
From there, you can extract the normal equation
$$\beta _{ 1 }\sum _{ i=1 }^{ n }{ { x }_{ i } } +n\beta _{ 0 }=\sum _{ i=1 }^{ n }{ { y }_{ i } } $$
Now the question is: how do you find the optimal solution for $\beta_1$ and $\beta_0$?
The normal equation $(A^TA)\vec{\beta} = A^T\vec{y}$ actually gives you two equations:
$\displaystyle\underbrace{\begin{pmatrix}\sum_{i=1}^{n}x_i^2 & \sum_{i=1}^{n}x_i \\ \sum_{i=1}^{n}x_i & n\end{pmatrix}}_{A^TA} \underbrace{\begin{pmatrix} \beta_1 \\ \beta_0 \end{pmatrix}}_{\vec{\beta}} = \underbrace{\begin{pmatrix} \sum_{i=1}^{n}x_iy_i \\ \sum_{i=1}^{n}y_i \end{pmatrix}}_{A^T\vec{y}}$
$\displaystyle\beta_1\sum_{i = 1}^{n}x_i^2 + \beta_0\sum_{i=1}^{n}x_i = \sum_{i=1}^{n}x_iy_i$ (1)
$\displaystyle\beta_1\sum_{i = 1}^{n}x_i + \beta_0n = \sum_{i=1}^{n}y_i$ (2)
Now you have a system of two equations in two unknowns, which you can solve directly.
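For completeness, the system can be solved explicitly: solving (2) for $\beta_0$ and substituting into (1) gives the standard closed-form least-squares estimates

$$\beta_1 = \frac{n\sum_{i=1}^{n}x_iy_i - \sum_{i=1}^{n}x_i\sum_{i=1}^{n}y_i}{n\sum_{i=1}^{n}x_i^2 - \left(\sum_{i=1}^{n}x_i\right)^2}, \qquad \beta_0 = \frac{1}{n}\left(\sum_{i=1}^{n}y_i - \beta_1\sum_{i=1}^{n}x_i\right),$$

valid whenever the $x_i$ are not all equal (otherwise $A^TA$ is singular and the system has no unique solution).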