First of all, sorry if the title is not very specific, but I couldn't think of anything short that fits my case better.
My problem is the following.
I have a matrix $A\in\mathbb{R}^{12\times12}$ and a possible analytical expression for its null space $\mathcal{N}(A)$ (one of the infinitely many possible expressions).
I have arrived at the point where I can write:
$\mathcal{N}(A)=span\{ \mathbf{n_1}, \mathbf{n_2}, \mathbf{n_3} \}$
where:
- $\mathcal{N}(A)$ is a 5-dimensional subspace of $\mathbb{R}^{12}$, and the analytic expression $\{ \mathbf{n_1}, \mathbf{n_2}, \mathbf{n_3} \}$ is one of the infinitely many bases of this 5-dimensional subspace.
Indeed I have:
- $\mathbf{n_1}\in\mathbb{R}^{12\times3}$
- $\mathbf{n_2}\in\mathbb{R}^{12\times1}$
- $\mathbf{n_3}\in\mathbb{R}^{12\times1}$
A possible basis of $\mathcal{N}(A)$ is the following:
$\begin{split} \mathcal{N}(A)&= span\left\{\left[\begin{array}{c} \mathbf{1_{N_3}} \\ \mathbf{0_{3\times3}}\end{array} \right],\, \left[\begin{array}{c} \mathbf{p} \\ \mathbf{0_{3\times1}}\end{array} \right], \left[\begin{array}{c} \mathbf{p}^{\perp} \\ \mathbf{1_N}\end{array} \right] \right\}\\ = span \left\{\left[ \begin{array}{c} 1\\0\\0\\1\\0\\0\\1\\0\\0\\0\\0\\0 \end{array} \begin{array}{c} 0\\1\\0\\0\\1\\0\\0\\1\\0\\0\\0\\0 \end{array} \begin{array}{c} 0\\0\\1\\0\\0\\1\\0\\0\\1\\0\\0\\0 \end{array} \right],\, \left[ \begin{array}{c} p_{1_x}\\ p_{1_y}\\ p_{1_z}\\ p_{2_x}\\ p_{2_y}\\ p_{2_z}\\ p_{3_x}\\ p_{3_y}\\ p_{3_z}\\ 0\\0\\0 \end{array} \right], \left[ \begin{array}{c} p^{\perp}_{1_x}\\ p^{\perp}_{1_y}\\ p^{\perp}_{1_z}\\ p^{\perp}_{2_x}\\ p^{\perp}_{2_y}\\ p^{\perp}_{2_z}\\ p^{\perp}_{3_x}\\ p^{\perp}_{3_y}\\ p^{\perp}_{3_z}\\ 1\\1\\1 \end{array} \right] \right\}\\ &=span\left\{\mathbf{n_1}, \mathbf{n_2}, \mathbf{n_3}\right\} \end{split}$
Where: $\mathbf{1_{N_3}}=\left[\begin{array}{c} 1\\1\\1 \end{array} \right]\otimes\mathbf{I}_{3\times3}\in\mathbb{R}^{9\times3}$
And $\mathbf{p}$ is the stacked vector of positions of $N$ agents (its precise meaning doesn't matter here, I think; $N$ is the number of agents, in this case $N=3$):
$\mathbf{p}=\left[\begin{array}{c} \mathbf{p_{1_x}}\\\mathbf{p_{1_y}}\\ \mathbf{p_{1_z}} \\ \mathbf{p_{2_x}}\\\mathbf{p_{2_y}}\\ \mathbf{p_{2_z}} \\ \mathbf{p_{3_x}}\\\mathbf{p_{3_y}}\\ \mathbf{p_{3_z}} \end{array} \right]$
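For concreteness, here is a small NumPy sketch of this basis. The values of $\mathbf{p}$ and $\mathbf{p}^{\perp}$ are hypothetical random vectors, since only their generic independence matters for checking the dimension:

```python
import numpy as np

rng = np.random.default_rng(0)

# 1_{N_3} = [1;1;1] (x) I_3 is 9x3; stack a 3x3 zero block below it -> n1 (12x3)
ones_N3 = np.kron(np.ones((3, 1)), np.eye(3))
n1 = np.vstack([ones_N3, np.zeros((3, 3))])

p = rng.standard_normal(9)        # hypothetical stacked agent positions
p_perp = rng.standard_normal(9)   # hypothetical "perpendicular" counterpart

n2 = np.concatenate([p, np.zeros(3)])      # 12-vector
n3 = np.concatenate([p_perp, np.ones(3)])  # 12-vector

N = np.column_stack([n1, n2, n3])          # 12x5 basis matrix of N(A)
print(np.linalg.matrix_rank(N))            # 5 for generic p, p_perp
```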
I want to find the orthogonal complement of $\mathcal{N}(A)$.
To do this, I can first find $\mathbf{n_1^{\perp}}\in\mathbb{R}^{12\times9}$. The problem is that $\mathbf{n_1^{\perp}}$ already spans the subspace spanned by $\mathbf{n_2},\mathbf{n_3}$, which means that the set $S=\{\mathbf{n_1^{\perp}}, \mathbf{n_2},\mathbf{n_3}\}$ is linearly dependent. This also makes sense from a counting point of view: we are in $\mathbb{R}^{12}$, and since $\mathbf{n_1}\in\mathbb{R}^{12\times3}$ and $\mathbf{n_1^{\perp}}\in\mathbb{R}^{12\times9}$, we have already covered all $(3+9)=12$ dimensions. This can also be seen as:
$\operatorname{rank}\left(\left[\mathbf{n_1} \;\; \mathbf{n_1^{\perp}}\right]\right)=12$
But, as I said, I want to find the orthogonal complement of $\mathcal{N}(A)$, and the subspace spanned by $\mathbf{n_1^{\perp}}$ certainly contains the subspace spanned by $\mathbf{n_2},\mathbf{n_3}$, so my question is:
how can I find a subspace of the one spanned by $\mathbf{n_1^{\perp}}$ which does not contain the subspace spanned by $\mathbf{n_2},\mathbf{n_3}$?
I hope I was clear and thanks a lot for your time.
If you have a set of generators $W_i$ for a subspace $W$ of a vector space, then the orthogonal complement $W^\perp$ is the intersection of the $W_i^\perp$. It doesn’t matter if any of those orthogonal complements include some of the generators, since that will all get sorted out when you take their intersection. However, there are several ways to find $W^\perp$ without resorting to intersecting vector spaces. Here are a few of the more common ones.
If you have a set of vectors $\{\mathbf w_i\}$ that generates $W$, then for any $\mathbf v\in W^\perp$, $\mathbf w_i\cdot\mathbf v=0$ for all $i$, i.e., $W^\perp$ is the solution set to the homogeneous system of linear equations $A\mathbf v=0$, where the matrix $A$ has as its rows the $\mathbf w_i$.
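This first method can be sketched in NumPy, with a randomly generated set of generators standing in for your actual $\mathbf w_i$, and the null space extracted from the SVD:

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.standard_normal((12, 5))   # columns generate a 5-dim subspace W of R^12
A = W.T                            # homogeneous system A v = 0, rows are the w_i

# Null space via SVD: right-singular vectors for (numerically) zero singular values
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
comp = Vt[rank:].T                 # 12x7 orthonormal basis of W^perp
print(comp.shape)                  # (12, 7)
print(np.allclose(A @ comp, 0))    # True
```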
Other ways involve constructing the orthogonal projection operator onto $W$, $\pi_W$. Then, $\ker\pi_W = \operatorname{im}(1-\pi_W)=W^\perp$, so the problem becomes one of finding the kernel or image of a linear map, which is widely documented elsewhere.
There are several ways to construct a matrix for $\pi_W$. If you have an orthogonal basis $\{\mathbf u_i\}$ for $W$, then you can construct $\pi_W$ as the sum of individual orthogonal projections onto the $\mathbf u_i$: $\pi_W = \sum_i{\mathbf u_i\mathbf u_i^T \over \mathbf u_i^T\mathbf u_i}$. A similar sum holds for a decomposition of $W$ into a direct sum of mutually orthogonal spaces $W_1\oplus W_2\oplus\cdots\oplus W_k$: $\pi_W = \sum_i\pi_{W_i}$. This is really only useful if you can find the individual projections some other way.
You often have a basis $\{\mathbf u_i\}$ for $W$, but it’s not orthogonal. The Gram-Schmidt process will produce an orthogonal (or orthonormal) basis from it, but it’s not necessary to go through that. If you have any basis—not necessarily orthogonal—for $W$, then another fairly easy-to-compute expression for $\pi_W$ is $\pi_W=U(U^TU)^{-1}U^T$, where $U$ has the vectors $\mathbf u_i$ as its columns.
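A minimal NumPy sketch of this projection-based approach, with a random non-orthogonal basis $U$ standing in for an actual one, and the complement read off from the image of $1-\pi_W$:

```python
import numpy as np

rng = np.random.default_rng(2)
U = rng.standard_normal((12, 5))        # any (not necessarily orthogonal) basis of W

P = U @ np.linalg.solve(U.T @ U, U.T)   # pi_W = U (U^T U)^{-1} U^T
Q = np.eye(12) - P                      # projection onto W^perp = im(1 - pi_W)

# Orthonormal basis of im(Q) via SVD (Q is symmetric, eigenvalues 0 or 1)
u, s, _ = np.linalg.svd(Q)
basis_perp = u[:, s > 1e-10]            # 12x7
print(basis_perp.shape)                 # (12, 7)
print(np.allclose(U.T @ basis_perp, 0)) # True
```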
If you’re not using the Euclidean scalar product to define orthogonality, these formulas will need to be modified accordingly. Let $Z$ be the matrix associated with this scalar product, i.e., $\langle\mathbf u,\mathbf v\rangle_Z=\mathbf v^TZ\mathbf u$. Then orthogonal projection onto a vector $\mathbf u$ is given by ${\mathbf u\mathbf u^T \over \mathbf u^TZ\mathbf u}Z$, and the last expression for $\pi_W$ becomes $U(U^TZU)^{-1}U^TZ$. The system of linear equations in the first method would also change to $AZ\mathbf v=0$.
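As a quick numerical check of the $Z$-weighted formula (with an arbitrary symmetric positive-definite $Z$ generated purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 12
M = rng.standard_normal((n, n))
Z = M @ M.T + n * np.eye(n)             # symmetric positive-definite scalar product
U = rng.standard_normal((n, 5))         # basis of W

# pi_W = U (U^T Z U)^{-1} U^T Z
P = U @ np.linalg.solve(U.T @ Z @ U, U.T @ Z)
print(np.allclose(P @ P, P))            # True: P is a projection
print(np.allclose(P @ U, U))            # True: P fixes W
v = rng.standard_normal(n)
# v - P v is Z-orthogonal to W:
print(np.allclose(U.T @ Z @ (v - P @ v), 0))   # True
```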