My computer vision class is covering linear least squares problems with SVD.
In our notes, the problem statement is as follows:
$$\min_x \|Ax - b\|^2 = \min_x\,(Ax - b)^T(Ax - b)$$
$$= \min_x\left(x^TA^TAx - 2b^TAx - b^Tb\right)$$
$$= \min_x\left(x^TA^TAx - 2b^TAx\right)$$
However, I do not follow this logic. Wouldn't the initial product expand as follows:
$$\min_x\,(Ax - b)^T(Ax - b) = \min_x\,(x^TA^T - b^T)(Ax - b)$$
$$= \min_x\left(x^TA^TAx - x^TA^Tb - b^TAx + b^Tb\right)$$
I am also struggling to understand why $b^Tb$ is removed from the equation.
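For what it's worth, a quick numerical check (a minimal NumPy sketch with an arbitrary example matrix and vectors) does seem to agree with my four-term expansion:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 2))  # arbitrary example matrix
b = rng.standard_normal(4)
x = rng.standard_normal(2)

# (Ax - b)^T (Ax - b), computed directly
lhs = (A @ x - b) @ (A @ x - b)
# x^T A^T A x - x^T A^T b - b^T A x + b^T b, the expanded form
rhs = x @ A.T @ A @ x - x @ A.T @ b - b @ A @ x + b @ b

assert np.isclose(lhs, rhs)
```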
Firstly, $b^T b$ is a constant that does not depend on $x$, so dropping it does not change which $x$ attains the minimum; it is therefore sufficient to minimize (using your expansion) $$x^T A^T A x - x^TA^Tb - b^T Ax.$$
But $x^TA^T b = (Ax)^T b = \langle Ax, b\rangle = b^T (Ax)$: since $(Ax)^T b$ is a scalar (the inner product of $Ax$ and $b$), it equals its own transpose, so the two cross terms are equal and combine into $-2b^TAx$.
Thus, what we want to minimize is: $$x^T A^T Ax - 2b^T Ax. $$
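As a sanity check (a minimal NumPy sketch; the matrix, vector, and seed below are arbitrary examples), the full and reduced objectives differ exactly by the constant $b^Tb$, so they are minimized by the same $x$:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))  # example overdetermined system
b = rng.standard_normal(5)

def full(x):
    # ||Ax - b||^2
    r = A @ x - b
    return r @ r

def reduced(x):
    # x^T A^T A x - 2 b^T A x  (the constant b^T b dropped)
    return x @ (A.T @ A) @ x - 2 * b @ (A @ x)

# The two objectives differ by exactly b^T b at every x:
x = rng.standard_normal(3)
assert np.isclose(full(x) - reduced(x), b @ b)

# Both are minimized by the normal-equations solution x* = (A^T A)^{-1} A^T b:
x_star = np.linalg.solve(A.T @ A, A.T @ b)
for _ in range(100):
    x = rng.standard_normal(3)
    assert full(x_star) <= full(x)
    assert reduced(x_star) <= reduced(x)
```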