Finding a basis $\beta$ to yield a diagonal matrix for a projection $T:V\to V$ on $W$ along $W'$

Is the following proof correct? In particular, I would like your thoughts on the way we prove the linear independence of $\beta$ using $W\cap W' =\{0\}$. (I know it is somewhat strange, but I am trying to stay true to the definition of the direct sum as presented in the text.)

Please first consider the following definitions.

Definition. $V = W\oplus W'$ if and only if $V = W+W'$ and $W\cap W' = \{0\}$.

Definition. A linear map $T:V\to V$ is a projection on $W$ along $W'$ if and only if $V = W\oplus W'$ and $T(x_1+x_2) = x_1$ whenever $x_1\in W$ and $x_2\in W'$.

The following is the proposition we are required to prove.

Proposition. Let $V$ be a finite-dimensional vector space and let $T$ be a projection on $W$ along $W'$, where $W$ and $W'$ are subspaces of $V$. Find an ordered basis $\beta$ for $V$ such that $[T]_\beta$ is a diagonal matrix.

Proof. Let $\alpha = \{v_1,v_2,\dots,v_n\}$ and $\gamma = \{u_1,u_2,\dots,u_m\}$ be bases for $W$ and $W'$, respectively. We define $\beta = \alpha\cup\gamma$ and show it to be a basis for $V$.

Assume that $0 = \sum_{i=1}^{n}c_iv_i+\sum_{i=1}^{m}d_iu_i$ and set $w = \sum_{i=1}^{n}c_iv_i = -\sum_{i=1}^{m}d_iu_i$. Then $w\in W$ and $w\in W'$, and from the hypothesis $V = W\oplus W'$ we have $W\cap W' = \{0\}$, so $w = 0$. Since $\alpha$ is linearly independent, $\sum_{i=1}^{n}c_iv_i = 0$ gives $c_1=c_2=\cdots=c_n = 0$, and the same argument applied to $\gamma$ gives $d_1 = d_2 = \cdots = d_m = 0$, implying the linear independence of $\beta$. In addition, since $V = W+W'$, every vector of $V$ is a sum of a linear combination of $\alpha$ and a linear combination of $\gamma$, so $V = \operatorname{span}(\beta)$.

Finally, since $T$ is a projection on $W$ along $W'$, we have $T(v_i) = v_i$ for each $v_i\in\alpha$ and $T(u_j) = 0$ for each $u_j\in\gamma$. By the definition of the matrix representation of $T$ with respect to $\beta$, the columns of $[T]_\beta$ are the coordinate vectors of the images of the basis vectors, so $[T]_\beta = \operatorname{diag}(1,\dots,1,0,\dots,0)$ with $n$ ones and $m$ zeros, which is diagonal.

$\blacksquare$
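To see the proof in action, here is a small numeric sketch with a hypothetical example in $\mathbb{R}^3$ (the subspaces $W$, $W'$ and their basis vectors `w1`, `u1`, `u2` below are my own choices, not from the question): we build $[T]_\beta = \operatorname{diag}(1,0,0)$ directly from the definition, convert it to the standard basis by a change of coordinates, and verify that the resulting map is idempotent, fixes $W$, and annihilates $W'$.

```python
import numpy as np

# Hypothetical example in R^3: W = span{w1}, W' = span{u1, u2}.
# beta = {w1, u1, u2}; the columns of B are the vectors of beta.
w1 = np.array([1.0, 0.0, 1.0])
u1 = np.array([0.0, 1.0, 0.0])
u2 = np.array([1.0, 0.0, -1.0])
B = np.column_stack([w1, u1, u2])

# [T]_beta: T fixes the basis vector of W and kills those of W'.
T_beta = np.diag([1.0, 0.0, 0.0])

# Matrix of T in the standard basis via change of coordinates.
T_std = B @ T_beta @ np.linalg.inv(B)

# T is idempotent (a projection), T(w1) = w1, and T(u1) = T(u2) = 0.
assert np.allclose(T_std @ T_std, T_std)
assert np.allclose(T_std @ w1, w1)
assert np.allclose(T_std @ u1, 0)
assert np.allclose(T_std @ u2, 0)

# Conjugating back recovers the diagonal matrix [T]_beta.
assert np.allclose(np.linalg.inv(B) @ T_std @ B, T_beta)
```

The same computation works for any complementary pair of subspaces: only the columns of `B` change, while $[T]_\beta$ stays diagonal by construction.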

Best Answer

Your proof seems fine to me. I am quite sure you are right about the linear independence, but since you asked for thoughts, here is an alternative using the following property:

Property. Let $E$ be a vector space and $F,G$ two subspaces of $E$. If $E=F\oplus G$, then for every $x\in E$ there exists a unique pair $(x_F,x_G)\in F\times G$ such that $x=x_F+x_G$.

Then you will easily notice that, in your case, $0_V=0_W+0_{W'}$ is the unique decomposition of the zero vector, hence the result: solving $u=0$ for $u\in V$ boils down to $u_W+u_{W'}=0$, which forces $u_W=0_W$ and $u_{W'}=0_{W'}$, and the only way to express each zero component in the given bases is with all coefficients zero.
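The unique-decomposition property above can also be sketched numerically. This is a hypothetical example of my own (the subspaces and the helper `decompose` are not from the answer, and the setup assumes the combined basis matrix is invertible, which $E=F\oplus G$ guarantees): solving one linear system recovers the unique pair of components, and the zero vector decomposes as zero plus zero.

```python
import numpy as np

# Hypothetical complementary subspaces of R^3: W = span{w1},
# W' = span{u1, u2}. The columns of B form a basis of R^3.
w1 = np.array([1.0, 0.0, 1.0])
u1 = np.array([0.0, 1.0, 0.0])
u2 = np.array([1.0, 0.0, -1.0])
B = np.column_stack([w1, u1, u2])

def decompose(x):
    """Return the unique pair (x_W, x_Wp) with x = x_W + x_Wp."""
    c = np.linalg.solve(B, x)        # coordinates of x in the combined basis
    x_W = c[0] * w1                  # component lying in W
    x_Wp = c[1] * u1 + c[2] * u2     # component lying in W'
    return x_W, x_Wp

x = np.array([2.0, 3.0, 5.0])
x_W, x_Wp = decompose(x)
assert np.allclose(x_W + x_Wp, x)    # the two components reassemble x

# In particular, 0 decomposes as 0_W + 0_{W'}: both components vanish.
z_W, z_Wp = decompose(np.zeros(3))
assert np.allclose(z_W, 0) and np.allclose(z_Wp, 0)
```

Uniqueness is exactly the invertibility of `B`: `np.linalg.solve` returns one and only one coordinate vector, mirroring the $\exists!$ in the property.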