The operator equation $Tf=g$, where $T:H_1\to H_2$ is an operator between Hilbert spaces $H_1$ and $H_2$, is said to be totally well-posed if for each $g\in H_2$ the equation has a unique solution, and this solution depends continuously on both the "data" $g$ and the "model" operator $T$.
It's clear to me that when $A$ is invertible, the equation $Ax=b$ is totally well-posed. However, I don't see why, if we replace the notion of "solution" by "least squares solution of minimal norm", the matrix equation is only well-posed (not totally well-posed). I would appreciate any hint.
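A small numerical sketch (my own example, not from the question) may show where total well-posedness fails: the minimal-norm least squares solution $A^{\dagger}b$ need not depend continuously on the operator $A$, because an arbitrarily small perturbation can change $\operatorname{rank}(A)$.

```python
import numpy as np

# A rank-1 matrix: the second row of b is unreachable and is ignored
# by the minimal-norm least-squares solution.
A = np.array([[1.0, 0.0],
              [0.0, 0.0]])
b = np.array([1.0, 1.0])

x = np.linalg.pinv(A) @ b        # minimal-norm LS solution: [1, 0]

# Perturb A by eps: rank jumps from 1 to 2, and the second
# component of the solution blows up like 1/eps.
eps = 1e-8
A_eps = np.array([[1.0, 0.0],
                  [0.0, eps]])
x_eps = np.linalg.pinv(A_eps) @ b   # [1, 1/eps]
```

As `eps -> 0`, `A_eps -> A` but `x_eps` diverges, so the map $A \mapsto A^{\dagger}b$ is discontinuous at rank-deficient matrices; the problem is well-posed in $b$ for fixed $A$, but not totally well-posed.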
Given $\mathbf{A} \in\mathbb{C}^{m\times n}$ and the linear system $$ \mathbf{A} x = b, $$ a least squares solution exists for every $b$; it is nonzero only when $b\notin\mathcal{N}(\mathbf{A}^{*})$. The least squares minimizers are $$ x_{LS} = \mathbf{A}^{\dagger} b + \left( \mathbf{I}_{n} - \mathbf{A}^{\dagger}\mathbf{A}\right) z, \quad z \in \mathbb{C}^{n}. $$ The minimizers form an affine space, the minimal-norm point $\mathbf{A}^{\dagger} b$ translated by the nullspace $\mathcal{N}(\mathbf{A})$, represented by the dashed red line in the figure below.
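The affine family above can be checked numerically; a minimal sketch (with a matrix of my own choosing that has a nontrivial nullspace): every point $\mathbf{A}^{\dagger}b + (\mathbf{I}_n - \mathbf{A}^{\dagger}\mathbf{A})z$ achieves the same residual norm, and $\mathbf{A}^{\dagger}b$ has the smallest norm among them.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 5, 3
A = rng.standard_normal((m, n))
A[:, 2] = A[:, 0] + A[:, 1]       # force a nontrivial nullspace N(A)
b = rng.standard_normal(m)

A_pinv = np.linalg.pinv(A)
x0 = A_pinv @ b                   # minimal-norm least-squares solution
P = np.eye(n) - A_pinv @ A        # projector onto N(A)

# Any x0 + P z is also a minimizer: A @ (P z) = 0, so the residual
# is unchanged, while the norm of the solution can only grow.
z = rng.standard_normal(n)
x1 = x0 + P @ z
r0 = np.linalg.norm(A @ x0 - b)
r1 = np.linalg.norm(A @ x1 - b)
```

Here `r0` and `r1` agree to rounding, and `np.linalg.norm(x0) <= np.linalg.norm(x1)` because $x_0$ lies in the row space, orthogonal to the nullspace component $Pz$.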
When the nullspace is trivial, $\mathcal{N}\left( \mathbf{A}\right) = \{\mathbf{0}\}$, we have $\mathbf{A}^{\dagger} \mathbf{A} = \mathbf{I}_{n}$, the homogeneous term vanishes, and the least squares solution is unique: $$ x_{LS} = \mathbf{A}^{\dagger} b = \left( \mathbf{A}^{*} \mathbf{A} \right)^{-1} \mathbf{A}^{*} b. $$ (Fredholm alternative.)
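For a full-column-rank matrix, the two formulas coincide; a quick sketch (generic tall matrix assumed, so $\mathcal{N}(\mathbf{A})$ is trivial almost surely):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 6, 3
A = rng.standard_normal((m, n))   # generic tall matrix: N(A) = {0}
b = rng.standard_normal(m)

x_pinv = np.linalg.pinv(A) @ b                 # pseudoinverse form
x_normal = np.linalg.solve(A.T @ A, A.T @ b)   # normal equations (A*A)^{-1} A* b
# Both compute the same unique least-squares solution.
```

In practice the normal-equations form squares the condition number of $\mathbf{A}$, so QR- or SVD-based solvers are preferred numerically.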