Matrix expressions for the oblique projection onto subspace L in the direction of subspace K


In the past, I have had to write 3D visualization programs in which oblique projections onto a plane were needed in a natural way. Each time, I had to develop a specific routine. Later on, I discovered that there exist (not really simple, but) "compact" matrix expressions for them.

Let us consider a more general framework.

Let $K$ and $L$ be two complementary subspaces of $\mathbb{R}^n$:

$$\mathbb{R}^n=K \oplus L\tag{1}$$

(we have chosen to take $\mathbb{R}^n$ as ambient space for the sake of simplicity).

In the following, (1) will be concretely converted into

$$M=[K|L]\tag{2}$$

where $M$ is an $n \times n$ matrix with full rank whose first $n-r$ columns constitute a basis of $K$ and whose last $r$ columns constitute a basis of $L$. Please note that $K$ is a "Kernel", $r$ being the rank of the projection matrix we are working with.

What is this compact matrix expression?

The simplest (sic) formula I have found in the literature for the oblique projection onto subspace $L$ in the direction of subspace $K$ is this one:

$$P_{L/K}=L(L^TP_KL)^{-1}L^TP_K \ \text{where} \ P_K:=I_n-KK^{+}\tag{3}$$

where the superscript $+$ denotes the Moore-Penrose inverse, which for a matrix $A$ with full column rank is given by $$A^{+}:=(A^TA)^{-1}A^T$$

The proof of (3) is straightforward: $K$ is the kernel of the transformation (indeed $P_K K = 0$, hence $P_{L/K}K=0$), $L$ is its range (indeed $P_{L/K}L = L$), and idempotence follows.
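As a quick sanity check of (3), here is a NumPy sketch with random complementary subspaces (the dimensions $n=5$, $r=2$ are my own illustrative choice; random matrices are almost surely in generic position, so $K \oplus L = \mathbb{R}^n$):

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 5, 2                            # ambient dimension and rank of the projection
K = rng.standard_normal((n, n - r))    # basis of the kernel K
L = rng.standard_normal((n, r))        # basis of the range L

# P_K = I - K K^+ : orthogonal projector onto the orthogonal complement of span(K)
P_K = np.eye(n) - K @ np.linalg.pinv(K)

# Formula (3): P_{L/K} = L (L^T P_K L)^{-1} L^T P_K
P = L @ np.linalg.solve(L.T @ P_K @ L, L.T @ P_K)

assert np.allclose(P @ K, np.zeros((n, n - r)))  # K is annihilated (kernel)
assert np.allclose(P @ L, L)                      # L is fixed (range)
assert np.allclose(P @ P, P)                      # P is idempotent
```

Using `np.linalg.solve` rather than forming the inverse $(L^TP_KL)^{-1}$ explicitly is the usual numerically safer choice.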

I have recently seen another expression (with proof) in this signal processing reference:

$$P_{L/K}=I_n-K(K^TP_LK)^{-1}K^TP_L \ \text{where} \ P_L:=I_n-LL^{+}\tag{4}$$

more or less dual to formula (3). See also this [detailed one](https://www.frontiersin.org/articles/10.3389/fphys.2016.00515/full).
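The dual formula $I_n-K(K^TP_LK)^{-1}K^TP_L$ can be checked the same way (same illustrative dimensions as before); it is the complement $I_n - P_{K/L}$ of the oblique projection onto $K$ along $L$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 5, 2
K = rng.standard_normal((n, n - r))    # basis of the kernel K
L = rng.standard_normal((n, r))        # basis of the range L

# P_L = I - L L^+ : orthogonal projector onto the orthogonal complement of span(L)
P_L = np.eye(n) - L @ np.linalg.pinv(L)

# Dual formula: P_{L/K} = I - K (K^T P_L K)^{-1} K^T P_L
P4 = np.eye(n) - K @ np.linalg.solve(K.T @ P_L @ K, K.T @ P_L)

assert np.allclose(P4 @ K, np.zeros((n, n - r)))  # kernel is K
assert np.allclose(P4 @ L, L)                      # range is L
assert np.allclose(P4 @ P4, P4)                    # idempotent
```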

My questions:

  1. Are there other formulas for $P_{L/K}$? Are there cases where a certain simplification occurs? Are there proofs of them that are appealing (unveiling analogies, etc.)?

  2. Are there (other) applications where oblique projections are needed?


Edit: I found a good answer here.


There are 2 best solutions below


I don't know if this constitutes an answer (to a 49k asker!), but since I also sometimes need to use such oblique projections, this is how I do it computationally:

The projection can be written as $P=LW$ where $$WL=I,\qquad WK=0$$ The second identity ensures that every basis vector in the kernel is annihilated, while the first ensures that $P$ is a projection, $P^2=WLWL=WL=P$.

The $r\times n$ matrix $W$ is given by any of a number of formulas, but I find that simply treating its coefficients as $rn$ unknowns, with the above identities furnishing $r^2+r(n-r)=rn$ equations on them, is a faster way to find $W$ than computing Moore-Penrose inverses and so on. This amounts to solving $$\begin{bmatrix}L^T\\K^T\end{bmatrix}W^T=\begin{bmatrix}I\\O\end{bmatrix}$$ by a standard Gaussian elimination algorithm.
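The linear-system approach above can be sketched in a few lines of NumPy (dimensions again my own illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 5, 2
K = rng.standard_normal((n, n - r))  # kernel basis
L = rng.standard_normal((n, r))      # range basis

# Solve [L^T; K^T] W^T = [I; O] for the r x n matrix W, then P = L W.
A = np.vstack([L.T, K.T])                          # n x n, invertible in generic position
B = np.vstack([np.eye(r), np.zeros((n - r, r))])   # n x r right-hand side
W = np.linalg.solve(A, B).T                        # r x n

P = L @ W
assert np.allclose(W @ L, np.eye(r))               # WL = I
assert np.allclose(W @ K, np.zeros((r, n - r)))    # WK = 0
assert np.allclose(P @ P, P)                       # P is a projection
```

A single $n \times n$ solve with $r$ right-hand sides, exactly as the answer describes.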


The idea is that the projection $P$ is a conjugate of the standard projection $\pi$ whose matrix is the block diagonal $(O_{n-r}, I_r)$. With $M=[K|L]$ as in (2), we have
$$P = M \, \pi \, M^{-1}$$
Indeed, $M^{-1}$ expresses a vector in the basis adapted to $K \oplus L$, $\pi$ kills the $K$-components, and $M$ maps the result back.