Context
For a given integer $N$, let $V$ be the vector space $\mathbb{C}^N$. Consider the linear equation
$$\mathbf{A} \mathbf{x} = \mathbf{b}$$
where $\mathbf{x},\mathbf{b} \in \mathbb{C}^N$ and $\mathbf{A} \in \mathbb{C}^{N\times N}$ is a Hermitian matrix of full rank. $\mathbf{A}$ has $N$ unit eigenvectors $\mathbf{v}^{(n)}$, and so $V$ can be described with the orthonormal basis $B_V = \left\{\mathbf{v}^{(n)}\right\}_{n=1}^N$.
Now let's define a subspace $S$ of $V$ whose orthonormal basis consists of only the first $M$ eigenvectors, where $M < N$:
$$B_S = \left\{\mathbf{v}^{(n)}\right\}_{n = 1}^M$$
It is then possible to define a projection matrix $\mathbf{P}_S$ that projects a vector $\mathbf{x} \in V$ into $\mathbf{x}_S \in S$:
$$\mathbf{P}_S = \sum_{n=1}^{M}\mathbf{v}^{(n)}\left(\mathbf{v}^{(n)}\right)^*, \qquad \mathbf{x}_S = \mathbf{P}_S \mathbf{x}$$
where $^*$ denotes the complex conjugate transpose.
There is also a subspace $T$ orthogonal to $S$, which has its own basis
$$B_T = \left\{\mathbf{v}^{(n)}\right\}_{n = M+1}^N$$
and projection matrix
$$\mathbf{P}_T = \sum_{n=M+1}^{N}\mathbf{v}^{(n)}\left(\mathbf{v}^{(n)}\right)^*, \qquad \mathbf{x}_T = \mathbf{P}_T \mathbf{x}$$
It is possible to decompose any vector $\mathbf{x} \in V$ uniquely into $\mathbf{x}_S \in S$ and $\mathbf{x}_T \in T$
$$\mathbf{x} = \mathbf{x}_S + \mathbf{x}_T$$
In fact, since the orthonormal bases $B_S$ and $B_T$ are comprised of eigenvectors of $\mathbf{A}$, it is possible to project the entire equation $\mathbf{A}\mathbf{x} = \mathbf{b}$ into the subspaces, yielding
$$\mathbf{A}_S \mathbf{x}_S = \mathbf{b}_S, \qquad \text{where $\mathbf{A}_S = \mathbf{P}_S \mathbf{A} = \mathbf{A} \mathbf{P}_S$}$$
$$\mathbf{A}_T \mathbf{x}_T = \mathbf{b}_T, \qquad \text{where $\mathbf{A}_T = \mathbf{P}_T \mathbf{A} = \mathbf{A} \mathbf{P}_T$}$$
These matrix equations describe $M$ and $N-M$ independent equations respectively.
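The setup above can be verified numerically. Here is a minimal NumPy sketch (the random Hermitian test matrix and all variable names are my own, purely for illustration) that builds the projectors from the eigenvectors and checks the decomposition and the projected subsystems:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 6, 2

# A random Hermitian matrix (full rank with probability 1)
Z = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
A = Z + Z.conj().T
w, V = np.linalg.eigh(A)       # columns of V are the unit eigenvectors v^(n)

VS, VT = V[:, :M], V[:, M:]    # bases B_S and B_T
PS = VS @ VS.conj().T          # P_S = sum of v^(n) (v^(n))* for n = 1..M
PT = VT @ VT.conj().T          # P_T = sum over n = M+1..N

b = rng.standard_normal(N) + 1j * rng.standard_normal(N)
x = np.linalg.solve(A, b)

# Unique decomposition x = x_S + x_T
assert np.allclose(x, PS @ x + PT @ x)

# Projected subsystems A_S x_S = b_S and A_T x_T = b_T,
# with A_S = P_S A = A P_S (and likewise for T)
AS, AT = PS @ A, PT @ A
assert np.allclose(AS, A @ PS)
assert np.allclose(AS @ (PS @ x), PS @ b)
assert np.allclose(AT @ (PT @ x), PT @ b)

# A_S is an N x N matrix of rank M, i.e. rank deficient in C^N
assert np.linalg.matrix_rank(AS) == M
```

The last assertion is exactly the issue raised below: the projected operators still act on $\mathbb{C}^N$ but are rank deficient.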
So, my aim is to take the linear system above, described in $\mathbb{C}^N$, and to decompose it into orthogonal subsystems that are individually simpler to analyse. In this question, I'm looking at decomposing the above system into two subsystems, one in $\mathbb{C}^M$ and the other in $\mathbb{C}^{N-M}$.
And applying a projection matrix to the matrix equation seems to be a step in the right direction, splitting the $N$ independent equations into $M$ and $N-M$. The issue is that, while the projected matrix equations in $S$ and $T$ describe $M$ and $N-M$ independent equations respectively, both matrix equations still span (albeit a subspace of) $\mathbb{C}^N$, with the involved matrices being rank deficient.
What I really want to do is apply, say, the projection onto $S$ to $\mathbf{A}\mathbf{x}=\mathbf{b}$, and then reduce the resulting equation in $S \subset \mathbb{C}^N$ to a matrix equation in the vector space $\hat{S} = \mathbb{C}^M$ (where I use $\hat{}$ to denote this "reduced" space):
$$\mathbf{A} \mathbf{x} = \mathbf{b} \text{ (in $\mathbb{C}^N$)}\quad\xrightarrow{\mathbf{P}_S}\quad \mathbf{A}_S \mathbf{x}_S = \mathbf{b}_S \text{ (in $\mathbb{C}^N$)} \quad\xrightarrow{\text{"reduction"}}\quad \mathbf{A}_\hat{S} \mathbf{x}_\hat{S} = \mathbf{b}_\hat{S} \text{ (in $\mathbb{C}^M$)}$$
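To make the "reduction" step concrete, one way to realise it (a sketch of my own, not necessarily the standard construction I'm asking to name) is to collect the first $M$ eigenvectors into a matrix $\mathbf{V}_S \in \mathbb{C}^{N\times M}$ and use $\mathbf{V}_S^*$ to pass to coordinates in $B_S$:

```python
import numpy as np

rng = np.random.default_rng(1)
N, M = 6, 2
Z = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
A = Z + Z.conj().T                # random Hermitian test matrix
w, V = np.linalg.eigh(A)
VS = V[:, :M]                     # N x M matrix whose columns are B_S

b = rng.standard_normal(N) + 1j * rng.standard_normal(N)
x = np.linalg.solve(A, b)

# "Reduction": express the projected equation in B_S coordinates.
A_hat = VS.conj().T @ A @ VS      # M x M and full rank: diag of the first M eigenvalues
b_hat = VS.conj().T @ b           # coordinates of b_S in B_S
x_hat = np.linalg.solve(A_hat, b_hat)

# A_hat is diagonal with the first M eigenvalues of A
assert np.allclose(A_hat, np.diag(w[:M]))

# Lifting x_hat back to C^N recovers the projected solution x_S = P_S x
assert np.allclose(VS @ x_hat, VS @ VS.conj().T @ x)
```

So the reduced system $\mathbf{A}_{\hat S}\mathbf{x}_{\hat S} = \mathbf{b}_{\hat S}$ is genuinely full-rank in $\mathbb{C}^M$, and lifting its solution back with $\mathbf{V}_S$ recovers $\mathbf{x}_S$; what I'm after is the name for this step.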
The question
My question really boils down to: is there a proper name for this "reduction" of a rank-deficient equation to a full-rank one of lower dimension? Being able to put a name to something makes it infinitely easier to search for it! It feels to me like "projection" would also be an appropriate term for going from space $V$ to the "reduced" subspace $\hat{S}$, as geometric projections $\mathbb{R}^3 \rightarrow \mathbb{R}^2$ from 3D to 2D would suggest, but this would seem to conflict with, or at least cause ambiguity with, the previously mentioned projection operations $\mathbb{C}^N \rightarrow \mathbb{C}^N$.