I'm looking for a simple proof of this theorem: (given $A$ and $b$) if $Ax=b$ has a solution for $x$, then that solution is also a least squares solution.
This is how I did it, but I'm not sure everything is correct:
(The part I'm unsure about, where I want to prove that $A$ is invertible:)
Let $A$ be an $m \times n$ matrix, $x_0$ the solution vector of $Ax=b$ in $\mathbb{R}^n$, and $b$ a vector in $\mathbb{R}^m$.
Because $Ax=b$ has a solution $x_0$, the rank of $A$ is $n$. Since $A$ has rank $n$, $A$ is invertible.
(The part I'm fairly sure about:)
The least squares solution is given by the formula $(A^t A)^{-1} A^t b$.
Since $A$ is invertible, $A^t$ is also invertible. This means we can use the identity $(A^t A)^{-1} = A^{-1}(A^t)^{-1}$.
The least squares solution then becomes $A^{-1}(A^t)^{-1} A^t b = A^{-1} b = x_0$.
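As a quick numerical sanity check of this conclusion (which does hold when $A$ is square and invertible), here is a small NumPy sketch with made-up example data, computing the least squares solution via the normal-equations formula $(A^t A)^{-1} A^t b$ and comparing it to the exact solution:

```python
import numpy as np

# Hypothetical square, invertible matrix and right-hand side
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

# Exact solution of Ax = b
x0 = np.linalg.solve(A, b)

# Least squares solution via the normal equations (A^t A)^{-1} A^t b
x_ls = np.linalg.inv(A.T @ A) @ A.T @ b

# When A is invertible, the two coincide
print(np.allclose(x0, x_ls))  # True
```

This only verifies the square/invertible case; it says nothing about whether the rank argument above is valid for general $m \times n$ matrices.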
As pointed out in the comments, if $A$ is not a square matrix, then $A$ isn't invertible (even if it has full column rank), which is the flaw in your proof.
As an alternate route, note that if $Ax = b$ has a solution $x = x_0$, then by definition, $Ax_0 = b$, i.e. $Ax_0 - b = \vec{0}$. Hence, $\|Ax_0-b\|_2 = 0$.
Now, can you show that there are no vectors $x$ such that $\|Ax-b\|_2 < 0$? If you can, then $\|Ax-b\|_2 \ge 0 = \|Ax_0-b\|_2$ for all vectors $x$, and thus, $x = x_0$ is a minimizer of $\|Ax-b\|_2$.
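To see this argument at work numerically, here is a small sketch with made-up data: a tall (non-square, hence non-invertible) $A$ and a $b$ constructed so that $Ax=b$ is consistent, using NumPy's least squares routine `np.linalg.lstsq`:

```python
import numpy as np

# A tall matrix: not square, so not invertible
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
x0 = np.array([2.0, -1.0])
b = A @ x0  # construct b so that Ax = b has the solution x0

# Least squares minimizer of ||Ax - b||_2
x_ls, residuals, rank, _ = np.linalg.lstsq(A, b, rcond=None)

# The exact solution is the minimizer, and the minimum residual is 0
print(np.allclose(x_ls, x0))         # True
print(np.linalg.norm(A @ x_ls - b))  # ~0 (up to rounding)
```

Since this $A$ has full column rank, the minimizer is unique and `lstsq` recovers exactly $x_0$; in general the argument only shows that $x_0$ is *a* minimizer.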