if $T$ is normal and a projection in a finite dimensional vector space $V$, then $T$ is an orthogonal projection


Let $T$ be a normal operator ($TT^* = T^*T$) on a finite dimensional vector space $V$ with inner product $\langle v, w\rangle$. Show that if $T$ is a projection ($T^2 = T$), then $T$ is an orthogonal projection ($N(T)^\bot = R(T)$ and $R(T)^\bot = N(T)$),

where $N(T)$ is the kernel of $T$ and $R(T)$ is the range (image) of $T$.

Attempt:

By taking $y \in R(T)$ and $w \in N(T)$, I'm trying to prove that $N(T)^\bot = R(T)$. I must prove $\langle y,w\rangle = 0$, which would show $R(T) \subseteq N(T)^\bot$ (and then a dimension count gives equality).

$\langle y,w\rangle = \langle T(x),w\rangle = \langle x, T^*(w)\rangle$, and then I don't know what to do. Also

$\langle y,w\rangle = \langle T(x),w\rangle = \langle T^2(x), w\rangle$, and then I don't know what to do.

Also, I know that $T$ is an orthogonal projection if and only if $T$ is self-adjoint ($T^* = T$).

Therefore, I could try to prove that $T$ is self-adjoint. But I don't know if this is even true. Any hint?
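For what it's worth, the self-adjointness claim can be sanity-checked numerically. Here is a minimal NumPy sketch (my own illustration, not part of the problem): a matrix $P = QQ^T$ with $Q$ having orthonormal columns is the orthogonal projection onto $\operatorname{col}(Q)$, and it is idempotent, normal, and symmetric; by contrast, an oblique projection is idempotent but not normal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Orthogonal projection onto a random 2-dimensional subspace of R^4:
# with Q having orthonormal columns, P = Q Q^T is idempotent and normal.
A = rng.standard_normal((4, 2))
Q, _ = np.linalg.qr(A)          # Q: 4x2 with orthonormal columns
P = Q @ Q.T

assert np.allclose(P @ P, P)            # idempotent: P^2 = P
assert np.allclose(P @ P.T, P.T @ P)    # normal: P P^* = P^* P
assert np.allclose(P, P.T)              # self-adjoint, as the claim predicts

# By contrast, an oblique (non-orthogonal) projection is idempotent
# but (generically) neither normal nor self-adjoint.
a = rng.standard_normal((4, 2))
b = rng.standard_normal((4, 2))
S = a @ np.linalg.inv(b.T @ a) @ b.T    # projection onto col(a) along col(b)^perp

assert np.allclose(S @ S, S)                 # still idempotent
assert not np.allclose(S @ S.T, S.T @ S)     # but not normal
```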

3 Answers

BEST ANSWER

Since $V$ is finite dimensional, it is sufficient to prove $\mathcal N(T) = \mathcal R(T)^\bot$: taking orthogonal complements then gives $\mathcal N(T)^\bot = \mathcal R(T)$ as well, because $(W^\bot)^\bot = W$ in finite dimensions.

Let $x \in \mathcal R(T)^\bot$. Since $T(T^*(x)) \in \mathcal R(T)$,

$0 = \langle x,\ T(T^*(x))\rangle = \langle x,\ T^*(T(x))\rangle = \langle T(x),\ T(x)\rangle,$

where the second equality uses normality. That is, $\langle T(x),\ T(x)\rangle = 0 \iff T(x) = 0 \iff x \in \mathcal N(T)$.

Therefore $\mathcal R(T)^\bot \subseteq \mathcal N(T)$.

To check $\mathcal N(T) \subseteq \mathcal R(T)^\bot$, run a similar computation, with one extra observation:

Let $x \in \mathcal N(T)$. Then, using normality,

$\langle T^*(x),\ T^*(x)\rangle = \langle x,\ T(T^*(x))\rangle = \langle x,\ T^*(T(x))\rangle = \langle T(x),\ T(x)\rangle = 0,$

so $T^*(x) = 0$. Now for any $T(z) \in \mathcal R(T)$ we have $\langle x,\ T(z)\rangle = \langle T^*(x),\ z\rangle = 0$, hence $x \in \mathcal R(T)^\bot$. (Orthogonality to the single vector $T(T^*(x))$ alone would not be enough; we need $x \perp \mathcal R(T)$ in full.)

Therefore $\mathcal N(T) \subseteq \mathcal R(T)^\bot$.

Then $\mathcal N(T) = \mathcal R(T)^\bot$.

Thus $T$ is an orthogonal projection.
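The conclusion $\mathcal N(T) = \mathcal R(T)^\bot$ can also be observed numerically. A small NumPy sketch (my own illustration, not part of the proof), using a concrete normal projection $P = QQ^T$ with $Q$ having orthonormal columns:

```python
import numpy as np

rng = np.random.default_rng(1)

# A concrete normal projection: P = Q Q^T for Q with orthonormal columns.
Q, _ = np.linalg.qr(rng.standard_normal((5, 2)))
P = Q @ Q.T

# The range of P is spanned by the columns of Q.  Extract an orthonormal
# basis of the kernel from the SVD: the columns of U whose singular value
# is (numerically) zero span range(P)^perp = ker(P), since P is symmetric.
U, s, _ = np.linalg.svd(P)
kernel = U[:, s < 1e-10]

# Every kernel vector is orthogonal to every range vector: N(P) = R(P)^perp.
assert kernel.shape[1] == 3                  # dim ker = 5 - rank = 3
assert np.allclose(kernel.T @ Q, 0.0)
```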


Proof Outline: First of all, note that showing $\mathcal N(T) \perp \mathcal R(T)$ is not sufficient; we also need to know that $\mathcal N(T) + \mathcal R(T) = V$. One way to see this is to note that every element $v$ of $V$ can be written as $$ v = (v - Tv) + Tv, $$ where $Tv \in \mathcal R(T)$ and $v - Tv \in \mathcal N(T)$ because $T(v - Tv) = Tv - T^2v = 0$. To show that the subspaces are orthogonal, proceed as follows. Using the fact that $T$ is normal, show that $\mathcal N(T^*) = \mathcal N(T)$. Similarly, show that $\mathcal R(T) = \mathcal N(T - I) = \mathcal N([T - I]^*)$. From there, your initial approach should work.
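The key fact $\mathcal N(T^*) = \mathcal N(T)$ comes from $\|Tx\| = \|T^*x\|$ for normal $T$. A quick numerical sanity check of that identity (my own sketch, building a random normal non-Hermitian matrix as $UDU^*$):

```python
import numpy as np

rng = np.random.default_rng(2)

# Build a random normal (generally non-Hermitian) matrix T = U D U^* with
# U unitary and D a complex diagonal matrix.
X = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
U, _ = np.linalg.qr(X)                       # unitary factor
D = np.diag(rng.standard_normal(4) + 1j * rng.standard_normal(4))
T = U @ D @ U.conj().T

assert np.allclose(T @ T.conj().T, T.conj().T @ T)   # T is normal

# For normal T, ||T x|| = ||T* x|| for every x, hence N(T) = N(T*).
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
assert np.isclose(np.linalg.norm(T @ x), np.linalg.norm(T.conj().T @ x))
```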


First assume $V$ is a complex vector space. Since $T$ is normal, the matrix $A$ of $T$ is diagonal with respect to some orthonormal basis $v_1, \dots, v_n$ of $V$, say $A = \text{diag}(\lambda_1, \dots, \lambda_n)$. Since $T^2 = T$, it follows that each $\lambda_j$ is $0$ or $1$. So $T$ is the orthogonal projection onto the span of $\{v_j : \lambda_j = 1\}$.
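The diagonalization argument predicts that every eigenvalue of a normal projection is $0$ or $1$. A short NumPy check of this (my own illustration, using an orthogonal projection as a concrete normal idempotent):

```python
import numpy as np

rng = np.random.default_rng(3)

# Any orthogonal projection is normal and idempotent; its eigenvalues
# should all be 0 or 1, matching the diagonalization argument above.
Q, _ = np.linalg.qr(rng.standard_normal((6, 3)))
P = Q @ Q.T
eig = np.linalg.eigvalsh(P)     # P is symmetric, so eigvalsh applies

assert np.allclose(np.sort(eig), [0, 0, 0, 1, 1, 1])
```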

If $V$ is a real vector space, then you can use the characterization of normal operators on a real inner product space instead: with respect to some orthonormal basis, the matrix $A$ is block diagonal with $1 \times 1$ blocks $(\lambda)$ and $2 \times 2$ blocks of the form $\begin{pmatrix} a & -b \\ b & a\end{pmatrix}$ with $b \neq 0$. Such a $2 \times 2$ block acts like the complex number $a + bi$, and $T^2 = T$ would force $(a + bi)^2 = a + bi$, which has no solution with $b \neq 0$. So only $1 \times 1$ blocks with $\lambda \in \{0, 1\}$ occur, and the conclusion follows as in the complex case.
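A quick numerical spot-check (my own sketch, not from the answer) that a block $\begin{pmatrix} a & -b \\ b & a\end{pmatrix}$ with $b \neq 0$ is never idempotent:

```python
import numpy as np

# A 2x2 block [[a, -b], [b, a]] acts like the complex number a + b*i, and
# it is idempotent iff (a + b*i)^2 = a + b*i, which forces b = 0 and
# a in {0, 1}.  Spot-check that blocks with b != 0 are never idempotent.
for a in np.linspace(-2, 2, 9):
    for b in np.linspace(-2, 2, 9):
        if abs(b) < 1e-12:
            continue                     # skip the degenerate b = 0 case
        B = np.array([[a, -b], [b, a]])
        assert not np.allclose(B @ B, B)
```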