In $Ax=b$, if I view $b$ as the result of projecting $A$ onto the subspace spanned by $x$, is it possible to compute some sort of "orthogonal complement" to the subspace that $x$ spans? Formally, is it possible to compute a matrix $M$ whose columns span the orthogonal complement of $\operatorname{span}\{x\}$, i.e., such that $M^\top x = \vec{0}$?
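If I understand the question correctly, such an $M$ always exists and is easy to compute: take a full QR decomposition of $x$ viewed as an $n \times 1$ matrix; the first column of $Q$ is $x/\|x\|$ and the remaining $n-1$ columns form an orthonormal basis of the orthogonal complement. A minimal NumPy sketch (the function name `orth_complement` is mine, not a library routine):

```python
import numpy as np

def orth_complement(x):
    """Return a matrix M whose columns form an orthonormal basis of the
    orthogonal complement of span{x}, so that M.T @ x = 0."""
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    # Full QR of the n x 1 matrix [x]: the first column of Q is
    # x / ||x||; the remaining n-1 columns complete an orthonormal basis.
    Q, _ = np.linalg.qr(x, mode='complete')
    return Q[:, 1:]

x = np.array([1.0, 2.0, 3.0])
M = orth_complement(x)
print(np.allclose(M.T @ x, 0))  # True
```

Equivalently, `scipy.linalg.null_space(x.reshape(1, -1))` returns the same subspace.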
This question arose from my recent thinking about the SVD. Suppose we have the SVD $X=U\Sigma V^\top$. We usually obtain $U$ and $V$ via eigendecompositions (of $XX^\top$ and $X^\top X$). However, if I want to fix the singular vectors corresponding to the largest singular value, namely $u_1$ and $v_1$, to some predefined values instead of the naturally computed ones, is it possible to work out the rest of the vectors in $U$ and $V$ consistently with the predefined $u_1$ and $v_1$ while preserving $X=U\Sigma V^\top$?
I want to use some iterative method to compute the singular vectors corresponding to the second/third largest singular values, with the leading vectors $u_1$ and $v_1$ predefined in some way rather than being the actual ones. Does this idea make sense?
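One caveat: if $u_1$ and $v_1$ are not the true leading singular vectors, $X = U\Sigma V^\top$ cannot hold exactly with orthogonal $U$ and $V$, so at best you get an approximation. That said, the iterative part is the standard deflation idea: project $X$ onto the orthogonal complements of the predefined pair and run power iteration on the deflated matrix. A minimal sketch under these assumptions (the specific $u_1$, $v_1$, and matrix here are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 4))

# Hypothetical predefined leading pair (must be unit vectors).
u1 = np.zeros(5); u1[0] = 1.0
v1 = np.zeros(4); v1[0] = 1.0

# Deflate: remove the components along u1 (left) and v1 (right).
Xd = (np.eye(5) - np.outer(u1, u1)) @ X @ (np.eye(4) - np.outer(v1, v1))

# Power iteration on the deflated matrix for the next singular pair.
v = rng.standard_normal(4)
for _ in range(200):
    u = Xd @ v
    u /= np.linalg.norm(u)
    v = Xd.T @ u
    v /= np.linalg.norm(v)
sigma2 = u @ Xd @ v

# The new pair is orthogonal to the predefined one by construction.
print(np.allclose(u @ u1, 0), np.allclose(v @ v1, 0))
```

Because the column space of `Xd` is orthogonal to $u_1$ and its row space is orthogonal to $v_1$, every pair produced this way is automatically orthogonal to the predefined one; repeating the deflation gives the third pair, and so on.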
Thank you for your help!