About projection and orthogonal projection


Let $V$ be an inner product space. Given any linear operator $T$ on $V$, is it always true that $V=N(T)+R(T)$, and if so, how does one prove it? Furthermore, is it correct to say that not every such $T$ satisfies $V=N(T)\oplus R(T)$, but that when the direct sum does hold, it uniquely determines an orthogonal projection? I'm thinking about what is needed to make a linear operator $T$ a projection, and what is further needed to make such a $T$ an orthogonal projection.

I'm learning linear algebra, so normally everything should be considered finite-dimensional, but the definitions of (orthogonal) projection I have read are not restricted to the finite-dimensional case, which is quite confusing. For example, if $V$ is infinite-dimensional and $W$ is a finite-dimensional subspace of $V$, then $V=W\oplus W^\perp$; but is this still correct if $W$ is not finite-dimensional?

BEST ANSWER
  1. First, $N(T)^\perp = R(T^*)$ always holds in a finite-dimensional space. Suppose the rows of $T$ are $r_1,\cdots, r_n$; then for $x\in N(T)$ $$ Tx=\begin{bmatrix} \langle r_1,x \rangle\\ \langle r_2,x \rangle\\ \vdots \\ \langle r_n,x \rangle \end{bmatrix}=0, $$ so $x$ is perpendicular to the generators of $R(T^*)$. Also, in a finite-dimensional space, $V=W \oplus W^\perp$ always holds, so $V=N(T)\oplus R(T^*)$.

    If $R(T)=R(T^*)$ (which holds in particular when $T$ is self-adjoint, i.e. $T=T^*$), then $V=N(T) \oplus R(T)$. In a finite-dimensional real inner product space, self-adjointness means the matrix of $T$ in an orthonormal basis is symmetric.

  2. So $V=R(T^*) \oplus N(T)$ (of course) does not uniquely determine $T$. But if you assume that $T$ is an orthogonal projection, then $T$ is uniquely determined. For each $x\in V$, write $x=x_R + x_N$, where $x_R\in R(T^*)$ and $x_N\in N(T)$ are uniquely determined, and let $S(x):=x_R$. This is an orthogonal projection. We only need to prove that $T=S$, so compute the following $$ \langle (T-S)x,(T-S)x\rangle =\langle Tx, Tx \rangle - \langle Tx, Sx \rangle - \langle Sx, Tx \rangle + \langle Sx, Sx\rangle \\ =\langle T(1-S)x, x\rangle+ \langle S(1-T)x, x\rangle=0, $$ since $(1-S)x, (1-T)x \in N(T)$ and both $T$ and $S$ vanish on $N(T)$. Thus $(T-S)x=0$ and $T=S$.
  3. When working with an inner product space, it is natural to consider it as a topological space, with the topology given by the distance $\|x-y\| = \sqrt{\langle x-y,x-y\rangle}$. If you are not familiar with topology, just keep in mind that $x_n$ converges to $x$ if and only if $\|x-x_n\|$ converges to $0$, and $(x_n)$ is a Cauchy sequence if and only if $\|x_n-x_m\|$ converges to $0$ as $m, n \to \infty$. Convergence of a sequence implies that it is Cauchy, but the converse is not true in general. The completeness of the space (that is, every Cauchy sequence converges) is usually crucial. For example, consider the space of polynomial functions defined on the closed interval $[0,1]$, with the inner product $\langle f, g \rangle = \int_0^1 f(x)g(x)dx$. This space is not complete, since the following sequence of polynomials is Cauchy but does not converge in it: $$ 1,\; 1+x,\; 1+x+\frac{x^2}{2!},\; 1+x+\frac{x^2}{2!}+\frac{x^3}{3!},\;\cdots $$ If you are familiar with Taylor expansions, you will see that this converges to $e^x$, but $e^x$ is definitely not a polynomial. (Of course one needs to check that the sequence is Cauchy, but I will skip that.) This is the reason why we usually deal with the $L^2$ and $l^2$ spaces $$ L^2([0,1]) = \left\{f:[0,1]\to \mathbb{R}\,\middle|\,\int_0^1 \{f(x)\}^2dx <\infty \right\},\\ l^2(\mathbb{N}) = \left\{(a_n) :\mathbb{N} \to \mathbb{R} \,\middle|\, \sum_{n=0}^\infty a_n^2 < \infty\right\}, $$ and these *complete* inner product spaces are called Hilbert spaces.
  4. However, in a Hilbert space $H$, $H= W \oplus W^\perp$ is still not true for an arbitrary subspace $W$. For example, let $H= l^2(\mathbb{N})$, where the inner product of $(a_n)$ and $(b_n)$ is given by $\sum_n a_n b_n$. Now take $W$ to be the subspace of $H$ consisting of the sequences $(a_1, a_2, \cdots)$ with all but finitely many $a_i$ equal to zero. This is not a closed subspace (closedness means that every limit of a sequence in $W$ again lies in $W$), since $$ x_k=(1,1/4,1/9,\cdots, 1/k^2,0,0,0,\cdots) $$ are all in $W$, but their limit $$ x=(1,1/4,1/9,\cdots) $$ is not in $W$. It is easy to see that $W^\perp=0$: if $(a_n)\in H$ is nonzero, there is a nonzero term $a_k\neq 0$, and $$ (b_n)=(0,0,\cdots, 0, \stackrel{\textrm{$k$th}}{a_k}, 0 ,\cdots)\in W $$ satisfies $\langle (a_n), (b_n) \rangle = a_k^2 \neq 0$. Hence $W \oplus W^\perp = W \neq H$.
  5. When $W$ is closed, $H = W \oplus W^\perp$ does hold. Take an orthonormal (Hilbert) basis $\{v_1, v_2, v_3, \cdots \}$ of $W$; then $$ \sum_{i=1}^\infty \langle x , v_i \rangle v_i $$ always converges and lies in $W$ by closedness. It is then easy to check that $$ x-\sum_{i=1}^\infty \langle x , v_i \rangle v_i \in W^\perp, $$ and thus we have the unique expression $$ x= \sum_{i=1}^\infty \langle x , v_i \rangle v_i + \left(x-\sum_{i=1}^\infty \langle x , v_i \rangle v_i \right), $$ proving $H = W \oplus W^\perp$.
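Points 1 and 2 above can be checked numerically. The following is my own minimal sketch in Python/NumPy (not from the answer), assuming real matrices, where $R(T^*)$ is the row space of $T$: the SVD splits $\mathbb{R}^n$ into the row space and the null space, which are orthogonal and together span the whole space.

```python
import numpy as np

rng = np.random.default_rng(0)
T = rng.standard_normal((4, 4))
T[3] = T[0] + T[1]            # force a nontrivial null space (rank 3)

# SVD: rows of Vt for (near-)zero singular values span N(T),
# the remaining rows span the row space R(T^*).
U, s, Vt = np.linalg.svd(T)
rank = int((s > 1e-10).sum())
null_basis = Vt[rank:].T      # columns span N(T)
row_basis = Vt[:rank].T       # columns span R(T^*)

# N(T) is orthogonal to R(T^*) ...
assert np.allclose(null_basis.T @ row_basis, 0)
# ... and together they span V: dim N(T) + dim R(T^*) = 4.
assert null_basis.shape[1] + row_basis.shape[1] == T.shape[0]
```

Since the rows of `Vt` are orthonormal, the decomposition $V=N(T)\oplus R(T^*)$ comes out exactly here, not just up to rounding.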

*Edit: I added the last paragraph.

  6. Finally, if $T$ is an operator on a Hilbert space and is continuous (which is equivalent to $T$ being bounded), then $N(T)=T^{-1}(0)$ is still closed since $\{0\}$ is closed, and we get $H=N(T) \oplus N(T)^\perp$. If $T$ has an adjoint operator $T^*$, meaning $\langle Tx,y\rangle = \langle x,T^*y \rangle$, then $N(T)^\perp = \overline{R(T^*)}$, so $H= \overline{R(T^*)} \oplus N(T)$ (the closure is needed, since $R(T^*)$ need not be closed in infinite dimensions).
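The orthonormal-basis formula from point 5 is easy to try in finite dimensions. A quick sketch of my own (NumPy, assuming a real inner product space): project onto a $2$-dimensional subspace $W\subset\mathbb{R}^5$ via $\sum_i \langle x, v_i\rangle v_i$ and check that the remainder lands in $W^\perp$.

```python
import numpy as np

rng = np.random.default_rng(1)
# Orthonormal basis v_1, v_2 of a 2-dimensional subspace W of R^5, via QR.
A = rng.standard_normal((5, 2))
Q, _ = np.linalg.qr(A)          # columns of Q are orthonormal

x = rng.standard_normal(5)
proj = Q @ (Q.T @ x)            # sum_i <x, v_i> v_i, lies in W
rem = x - proj                  # the remainder

# The remainder is orthogonal to W, giving the unique decomposition
# x = proj + rem with proj in W and rem in W^perp.
assert np.allclose(Q.T @ rem, 0)
assert np.allclose(proj + rem, x)
```

Here `Q @ Q.T` is exactly the matrix of the orthogonal projection onto $W$; it is symmetric and idempotent.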
ANSWER

By the dimension theorem, $\dim V=\dim N(T) +\dim R(T)$, so whenever $N(T)$ and $R(T)$ have nontrivial intersection (e.g. for $T(x, y) =(0, x)$), we don't have $V=N(T) +R(T)$.
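The counterexample $T(x,y)=(0,x)$ can be made concrete. A small sketch of my own (NumPy): here $N(T)$ and $R(T)$ are the same line, so their sum is only one-dimensional even though the dimensions add up to $\dim V$.

```python
import numpy as np

# T(x, y) = (0, x) as a matrix acting on R^2.
T = np.array([[0.0, 0.0],
              [1.0, 0.0]])

n = np.array([0.0, 1.0])                 # spans N(T)
assert np.allclose(T @ n, 0)             # n is in N(T)
assert np.allclose(T @ np.array([1.0, 0.0]), n)  # n is also in R(T)

# dim N(T) + dim R(T) = 1 + 1 = 2 = dim V, yet N(T) = R(T),
# so N(T) + R(T) is the line {(0, t)} and not all of R^2.
```

Note that this $T$ satisfies $T^2=0\neq T$, so it is not a projection.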

A linear operator $T$ is a projection iff it is idempotent, i.e. $T^2 =T$.
Then any vector $x$ can be decomposed as $x=(x-Tx) \ +\ Tx\ \in N(T) +R(T)$, and if $x\in N(T) \cap R(T)$, then $Tx=0$ and $x=Ty$ for some $y$, so $x=Ty=TTy=Tx=0$.
This shows that in this case we indeed have $V=N(T)\oplus R(T)$, and $T$ effectively projects $a+b\mapsto b$ (with $a\in N(T)$, $b\in R(T)$).
Conversely, if $V=A\oplus B$ then the projection $a+b\ \mapsto b$ is clearly idempotent.
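As a numerical illustration of the idempotence argument (my own sketch in NumPy, not part of the answer): take an oblique, non-orthogonal projection and verify that $T^2=T$ yields the decomposition $x=(x-Tx)+Tx$ with $x-Tx\in N(T)$.

```python
import numpy as np

# An oblique projection on R^2: idempotent (T^2 = T) but not symmetric,
# so it projects onto R(T) along N(T) without orthogonality.
T = np.array([[1.0, 1.0],
              [0.0, 0.0]])
assert np.allclose(T @ T, T)     # idempotent

x = np.array([2.0, 3.0])
x_r = T @ x                      # component in R(T)
x_n = x - x_r                    # component in N(T), since T(x - Tx) = 0
assert np.allclose(T @ x_n, 0)
assert np.allclose(x_r + x_n, x)
```

Because this $T$ is not symmetric, $N(T)$ and $R(T)$ are complementary but not perpendicular, matching the distinction drawn below between projections and orthogonal projections.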

To talk about orthogonality, an inner product has to be considered on $V$, which induces an adjoint $T^*$ for every linear operator $T$, uniquely determined by the equation $\langle Tx, y\rangle =\langle x, T^*y\rangle$ (in an orthonormal basis, the matrix of $T^*$ is just the (complex conjugated) transpose of the matrix of $T$).

Now, if $T^2=T$, we will have $N(T) \perp R(T)$ iff $T^*=T$ (self-adjoint).
Suppose $T^*=T=T^2$; if $a\in N(T) $ and $b\in R(T) $, we have $$\langle a, b\rangle =\langle a, Tb\rangle =\langle a, T^*b\rangle =\langle Ta, b\rangle = 0,$$ meaning $a\perp b$.

Conversely, if $A\perp B$, $V=A\oplus B$, and $T\colon a+b\mapsto b$ is the corresponding orthogonal projection, then, with the decompositions $x=x_A+x_B, \ y=y_A+y_B$, we have $$\langle Tx, y\rangle = \langle x_B, \, y_A+y_B\rangle =\langle x_B, y_B\rangle=\langle x, Ty\rangle,$$ showing $T^*=T$.
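The self-adjoint case can be seen concretely too. A closing sketch of my own (NumPy): the orthogonal projection onto $\mathrm{span}\{(1,1)\}$ in $\mathbb{R}^2$ is both symmetric and idempotent, and its kernel and range are perpendicular.

```python
import numpy as np

# Orthogonal projection onto span{(1, 1)} in R^2: P = vv^T / <v, v>.
P = 0.5 * np.array([[1.0, 1.0],
                    [1.0, 1.0]])
assert np.allclose(P @ P, P)      # idempotent: a projection
assert np.allclose(P, P.T)        # self-adjoint (symmetric)

# Hence N(P) is perpendicular to R(P):
n = np.array([1.0, -1.0])         # spans N(P)
r = np.array([1.0, 1.0])          # spans R(P)
assert np.allclose(P @ n, 0)
assert np.allclose(P @ r, r)
assert np.isclose(n @ r, 0.0)     # N(P) and R(P) are orthogonal
```

Compare this with the oblique projection above: both are idempotent, but only the symmetric one has orthogonal kernel and range.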