Suppose, given a matrix $\textbf{A} \in \mathbb{C}^{m \times n}$ and a vector $\textbf{b} \in \mathbb{C}^{m}$, I want to find the minimal norm solution of
$$\min_{\textbf{x}}\|\textbf{A}\textbf{x} - \textbf{b} \|_{2} $$
with the condition that $\textbf{A}^{*}\textbf{b} = \textbf{0}_{n}$.
If $\operatorname{rank}(\textbf{A}) = n$, then the solution is
$$\textbf{y} = (\textbf{A}^{*}\textbf{A})^{-1}\textbf{A}^{*}\textbf{b}.$$
This would give $\textbf{y} = \textbf{0}$ under the stated condition. However, for $\textbf{A}$ of arbitrary rank, the minimal norm solution of this problem is
$$\textbf{y} = \textbf{A}^{\dagger}\textbf{b},$$
where $\textbf{A}^{\dagger}$ is the pseudoinverse.
Can we say anything about $\textbf{y}$ under the stated condition in the general case, or is it arbitrary?
It is known that a vector $\hat {\mathbf x}$ is a least squares solution of the system $A {\mathbf x}= {\mathbf b}$ iff $A^*A\hat {\mathbf x}=A^* {\mathbf b}.$
Thus, if the condition $A^* {\mathbf b}={\mathbf 0}$ is satisfied, then $\hat {\mathbf x}= {\mathbf 0}$ is a least squares solution. Obviously, it has the minimum possible norm (zero).
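This is easy to check numerically with NumPy. A sketch, with hypothetical matrix sizes chosen for illustration: build a rank-deficient $A$, construct $b$ in the orthogonal complement of the column space of $A$ (so $A^*b = 0$), and verify that the minimal norm least squares solution $A^{\dagger}b$ is the zero vector.

```python
import numpy as np

rng = np.random.default_rng(0)

# A rank-deficient matrix: 5x4 with rank 2 (sizes chosen for illustration).
A = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 4))

# Construct b orthogonal to range(A), so that A^* b = 0:
# subtract from a random vector its projection onto the column space of A.
r = rng.standard_normal(5)
b = r - A @ (np.linalg.pinv(A) @ r)

# The condition A^* b = 0 holds up to floating-point error.
print(np.linalg.norm(A.conj().T @ b))

# The minimal norm least squares solution is y = A^+ b, which is ~0.
y = np.linalg.pinv(A) @ b
print(np.linalg.norm(y))
```

Both printed norms are at the level of floating-point roundoff, consistent with the argument above: when $A^*b = 0$, the zero vector satisfies the normal equations and is trivially the solution of minimum norm.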