Alternating projection convergence


Let $M,N$ be two subspaces of a finite-dimensional vector space $V$ with an inner product. Define $L=M\cap N$ and let $P_M,P_N,P_L$ be the corresponding orthogonal projection maps. I'm trying to show that the alternating compositions of $P_M$ and $P_N$ converge pointwise to $P_L$. I'm having trouble showing this and am not sure how to proceed. I'm self-studying from the book "The Coordinate Free Approach to Linear Models".

EDIT: Let $A_1=P_M$ and $A_2=P_NP_M$ and $A_3=P_MP_NP_M$ and construct a sequence of operators $A_n$ in this fashion. I want to show that $A_nx$ converges to $P_Lx$ for any $x\in V$.


There are 2 answers below.

BEST ANSWER

Note first that $||P_M||=||P_N|| = 1$, and $||P_M v||<||v||$ for any $v\notin M$. Note further that both $L$ and $L^\perp$ are invariant subspaces of $P_M$ and $P_N$ (since $L\subseteq M$ we have $P_MP_L=P_L$, and taking adjoints $P_LP_M=P_L$, so $P_M$ commutes with $P_L$; likewise for $P_N$), and hence also of $P_N P_M$.

Any element $v\in L$ has $P_N P_M v = v$. Any nonzero element $v\in L^\perp$ cannot lie in both $M$ and $N$. If $v\notin M$, then $||P_N P_M v||\leq ||P_M v|| < ||v||$. If instead $v\in M$ but $v\notin N$, then $P_M v = v$ and $||P_N P_M v|| = ||P_N v|| < ||v||$. Since the unit sphere of $L^\perp$ is compact in finite dimensions, the norm of $P_N P_M$ restricted to $L^\perp$ is attained at some unit vector, and hence must be $<1$.

So decompose any vector $x$ as $y+z$ with $y\in L$ and $z\in L^\perp$ (so $y = P_L x$). Then for any integer $n$ we have $(P_N P_M)^n y = y$ and $||(P_N P_M)^n z|| \leq \lambda^n \cdot ||z||$ for some fixed constant $\lambda <1$ (namely, the norm of $P_N P_M$ restricted to $L^\perp$). Hence $(P_N P_M)^n x \rightarrow y = P_L x$ as $n\rightarrow \infty$, as desired.
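As a sanity check, here is a small numeric sketch of this convergence with NumPy. The particular subspaces ($M=\mathrm{span}\{e_1,e_2\}$ and $N=\mathrm{span}\{e_1,(e_2+e_3)/\sqrt2\}$ in $\mathbb{R}^3$, so that $L=M\cap N=\mathrm{span}\{e_1\}$) are my own choice for illustration, not from the answer:

```python
import numpy as np

# Demo subspaces (chosen for illustration): in R^3,
#   M = span{e1, e2},  N = span{e1, (e2+e3)/sqrt(2)},  L = M ∩ N = span{e1}.
P_M = np.diag([1.0, 1.0, 0.0])            # orthogonal projector onto M
e1 = np.array([1.0, 0.0, 0.0])
u = np.array([0.0, 1.0, 1.0]) / np.sqrt(2)
P_N = np.outer(e1, e1) + np.outer(u, u)   # orthogonal projector onto N
P_L = np.outer(e1, e1)                    # orthogonal projector onto L

x = np.array([1.0, 2.0, 3.0])
v = x.copy()
for _ in range(60):                       # apply P_M then P_N, repeatedly
    v = P_N @ (P_M @ v)

print(np.allclose(v, P_L @ x))            # True: iterates approach P_L x
```

Here the component of the iterate in $L^\perp$ is halved at every double step, matching the geometric decay $\lambda^n$ in the proof.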

ANSWER

In a special case you don't even need an (infinite) sequence before you get $P_Lx$.

Since you have an inner product and the space is finite-dimensional, there exists an orthonormal basis for $L$, denoted $\{v_i\}_{i=1}^l$. Extend it to bases of $M$, $N$, and finally $V$, denoting the "extensions" by $\{v_i\}_{i=l+1}^{l+m}$, $\{v_i\}_{i=l+m+1}^{l+m+n}$ and $\{v_i\}_{i=l+m+n+1}^{l+m+n+k}$, where $l, l+m, l+n, l+m+n+k$ are the dimensions of $L, M, N, V$. Note the hidden assumption here: taking all of these vectors mutually orthogonal requires $M\cap L^\perp \perp N\cap L^\perp$. In general that fails, which is exactly why the question needs a limit of alternating projections rather than a single pass.

Let's use bra-ket notation; it's convenient. Note that bra-ket notation takes the inner product to be linear in the second argument, which is the opposite of the convention usual in mathematics.

In bra-ket notation, $|x\rangle$ denotes the vector $x$, and $\langle x|$ denotes the dual vector determined by the inner product, so $\langle x|\cdot\rangle$ denotes taking the inner product with $x$. If $x$ is a unit vector, then $\langle x|\cdot\rangle|x\rangle$, also written $|x\rangle\langle x|$, is the orthogonal projector onto the span of $x$.
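Concretely, for a real vector space the rank-one projector $|v\rangle\langle v|$ is just an outer product. A minimal sketch (the vectors are my own illustrative choices):

```python
import numpy as np

# For a unit vector v, |v><v| is the outer product v v^T
# (use v v^H, i.e. conjugation, for complex vectors).
v = np.array([3.0, 4.0]) / 5.0            # unit vector
P = np.outer(v, v)                        # the projector |v><v|

x = np.array([2.0, 1.0])
# P x = <v|x> |v>: the component of x along v
print(np.allclose(P @ x, np.dot(v, x) * v))   # True
print(np.allclose(P @ P, P))                  # True: projectors are idempotent
```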

Given any $x\in V$,

$$ x=\sum_{i=1}^{l+m+n+k}\langle v_i|x\rangle |v_i\rangle $$

where

$$ \sum_{i=1}^{l}\langle v_i|x\rangle |v_i\rangle\in L,\quad\sum_{i=l+1}^{l+m}\langle v_i|x\rangle |v_i\rangle\in M,\quad\sum_{i=l+m+1}^{l+m+n}\langle v_i|x\rangle |v_i\rangle\in N $$ and $$ \langle v_i|v_j\rangle=0\quad\forall i\neq j. $$ Therefore $$ P_Mx=\sum_{j=1}^{l+m}|v_j\rangle\langle v_j|\sum_{i=1}^{l+m+n+k}\langle v_i|x\rangle |v_i\rangle=\sum_{i=1}^{l+m}\langle v_i|x\rangle|v_i\rangle $$ and $$ P_NP_Mx=\Big(\sum_{j=1}^{l}|v_j\rangle\langle v_j|+\sum_{j=l+m+1}^{l+m+n}|v_j\rangle\langle v_j|\Big)\sum_{i=1}^{l+m}\langle v_i|x\rangle|v_i\rangle=\sum_{i=1}^{l}\langle v_i|x\rangle|v_i\rangle $$

Expanding $P_Lx$ the same way shows that this right-hand side equals $P_Lx$.
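A numeric sketch of this special case, with subspaces chosen (as an assumption for the demo) so that the basis extensions really are mutually orthogonal:

```python
import numpy as np

# When M ⊖ L and N ⊖ L are mutually orthogonal, one pass suffices.
# Demo choice: in R^4, L = span{e1}, M = span{e1, e2}, N = span{e1, e3}.
P_M = np.diag([1.0, 1.0, 0.0, 0.0])
P_N = np.diag([1.0, 0.0, 1.0, 0.0])
P_L = np.diag([1.0, 0.0, 0.0, 0.0])

x = np.array([1.0, 2.0, 3.0, 4.0])
print(np.allclose(P_N @ P_M @ x, P_L @ x))   # True: here P_N P_M = P_L exactly
```

With the subspaces of the first answer's general setting this identity fails, and the limit of alternating projections is genuinely needed.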

It's really just writing out the orthogonal decomposition of $x$. It becomes more interesting in the infinite dimensional case. But that's not the question.