Proving consequences of spectral decomposition of normal operator


$T$ is a normal operator on a finite-dimensional complex inner product space $V$.

How do I use the spectral decomposition $T=\lambda_1T_1+\cdots+\lambda_kT_k$ to show:

a) If $T^n=0$ for some $n$, then $T=0$.

b) A linear operator $U$ commutes with $T$ iff $U$ commutes with each $T_i$.

c) There exists a normal operator $U$ such that $U^2=T$.

d) $T$ is invertible iff every $\lambda_i\neq 0$, and $T$ is a projection iff each $\lambda_i$ is $1$ or $0$.

I completed the first two parts, but I'm stuck on c) and the first half of d). I have difficulty wrapping my head around the concepts, so there wasn't much I could do.

For d) part 1, I tried using $TT^*=T^*T$ to write $$T^*\overline{T^{-1}}=\overline{T^{-1}}T^*=T\overline{T^{-1}}=\overline{T^{-1}}T= \lambda_1T_1\overline{T^{-1}}+\cdots=\lambda_1\overline{T^{-1}}T_1+\cdots.$$ I also know that $T^*$ has eigenvalues $\overline{\lambda_i}$, but I don't know how to apply that.


There are 2 best solutions below


For a normal operator acting on a finite-dimensional space we have \begin{equation*} T=\sum_{j=1}^{k}\lambda _{j}P_{j} \end{equation*} where the $P_{j}$'s are orthogonal projectors with $P_iP_j=0$ for $i\neq j$ and $\sum_j P_j = I$.

If all the $\lambda_j$'s are non-zero, then $T$ has trivial null space: $Tf=0$ gives $\lambda_jP_jf=P_jTf=0$ for every $j$, hence $P_jf=0$ for every $j$, hence $f=\sum_j P_jf=0$. So $T$ is invertible. But if one of them, say $\lambda_1$, is $0$ (so the corresponding term in the sum is not present), then any non-zero $f$ with $f=P_1f$ satisfies $Tf=0$, and $T$ is not invertible. Conversely, if $T$ is invertible it has trivial null space; taking a non-zero $f$ with $f=P_jf$, we get $Tf=\lambda _{j}f\neq 0$, so $\lambda _{j}\neq 0$.

For the second part, if every $\lambda_j$ is $0$ or $1$, then \begin{equation*} T=0\times P_{a}+1\times P_{b}=P_{b} \end{equation*} where $P_a$ and $P_b$ are the mutually orthogonal sums of the $P_j$'s with $\lambda_j=0$ and $\lambda_j=1$ respectively, so $T=P_{b}$ is a projection. Now suppose that $T$ is a projector. Then \begin{eqnarray*} T^{2} &=&T\Rightarrow \\ \sum_{j=1}^{k}\lambda _{j}^{2}P_{j} &=&\sum_{j=1}^{k}\lambda _{j}P_{j} \end{eqnarray*} Hence \begin{equation*} \lambda _{j}^{2}=\lambda _{j} \end{equation*} so each $\lambda _{j}$ is either $1$ or $0$.
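A quick numerical sanity check of both claims, with numpy. The example matrices here are my own, not from the answer; the eigenbasis `Q` plays the role of the projectors $P_j$:

```python
import numpy as np

# A normal (here symmetric) matrix with a zero eigenvalue is not invertible:
# any f in the range of the corresponding projector satisfies Tf = 0.
lams = np.array([0.0, 3.0])
Q = np.linalg.qr(np.array([[1.0, 1.0], [1.0, -1.0]]))[0]  # orthonormal eigenbasis
T = Q @ np.diag(lams) @ Q.T
f = Q[:, 0]                      # unit eigenvector for lam_1 = 0
assert np.allclose(T @ f, 0)     # f is in the null space, so T is singular

# If the eigenvalues are all 0 or 1, T equals the projector P_b and T^2 = T.
lams2 = np.array([0.0, 1.0])
T2 = Q @ np.diag(lams2) @ Q.T
assert np.allclose(T2 @ T2, T2)
```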


I know you said you have no problem with the first parts, but I'll answer them anyway.

a):

Fix $j \in \{1, \dots, k\}$ and a non-zero $x_j$ in the range of $T_j$, so that $T_j(x_j) = x_j$. Since $T^n = 0$, we have $$T^n(x_j) = 0$$ On the other hand, $$T^n(x_j) = \sum^k_{i=1}\lambda^n_iT_i(x_j) = \lambda^n_jT_j(x_j) = \lambda^n_jx_j = 0$$ Remember: for $i \neq j$ we have $T_i T_j = 0$, and for $i = j$ we have $T_i T_j = T_j$.

Since $x_j \neq 0$, this forces $\lambda^n_j = 0 \implies \lambda_j = 0$, and this holds for every $j$.

Then, we have $$T = \lambda_1 T_1 + \cdots + \lambda_k T_k = 0 \times T_1 + \cdots + 0 \times T_k = 0$$
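As a sanity check on why normality is essential here, a small numpy sketch (my own example, not from the answer): a non-normal matrix can be nilpotent without being zero, so the argument above really does use the spectral decomposition.

```python
import numpy as np

# The Jordan block N satisfies N^2 = 0 but N != 0. The argument above
# does not apply to it, and indeed N is NOT normal (N N* != N* N),
# so it has no spectral decomposition into orthogonal projections.
N = np.array([[0.0, 1.0], [0.0, 0.0]])
assert np.allclose(N @ N, 0)               # nilpotent ...
assert not np.allclose(N, 0)               # ... but non-zero
assert not np.allclose(N @ N.T, N.T @ N)   # ... because it is not normal
```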

b)

$\Rightarrow$

We know that each $T_i$ is a polynomial in $T$, so there exists $g_i(x)$ such that $g_i(T) = T_i$. Since $U$ commutes with $T$, it commutes with every polynomial in $T$: $$UT = TU \implies g_i(T)\, U = U\, g_i(T) \implies T_i U = U T_i $$

$\Leftarrow$

This direction is easier; since $U$ commutes with each $T_i$: $$UT = U (\lambda_1 T_1 + \dots + \lambda_k T_k) = \lambda_1 U T_1 + \dots + \lambda_k U T_k = \lambda_1 T_1 U + \dots + \lambda_k T_k U = TU$$
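A numerical illustration of b) with numpy (the matrices are my own hypothetical example): an operator $U$ built as a polynomial in $T$ commutes with $T$, and, as claimed, with every spectral projector of $T$.

```python
import numpy as np

T = np.array([[2.0, 1.0], [1.0, 2.0]])   # normal, eigenvalues 1 and 3
lams, Q = np.linalg.eigh(T)
Ps = [np.outer(Q[:, j], Q[:, j]) for j in range(2)]  # spectral projectors T_i

U = 3 * T @ T - T + 2 * np.eye(2)        # a polynomial in T, so UT = TU
assert np.allclose(U @ T, T @ U)
for P in Ps:                             # ... and U commutes with every T_i
    assert np.allclose(U @ P, P @ U)
```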

c) Take $U = \sqrt{\lambda_1}\, T_1 + \dots + \sqrt{\lambda_k}\, T_k$, choosing any complex square root of each $\lambda_i$. Since $T_i T_j = 0$ for $i \neq j$ and $T_i^2 = T_i$, we get $U^2 = \lambda_1 T_1 + \dots + \lambda_k T_k = T$. Moreover $U$ is normal: $U^* = \overline{\sqrt{\lambda_1}}\, T_1 + \dots + \overline{\sqrt{\lambda_k}}\, T_k$, so $UU^* = |\lambda_1| T_1 + \dots + |\lambda_k| T_k = U^*U$.
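A minimal numpy sketch of c), using a hypothetical normal matrix of my own with a negative and a genuinely complex eigenvalue, where the principal complex square root is applied to each eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # real orthogonal eigenbasis
lams = np.array([2.0, -1.0, 1.0 + 1.0j])           # negative & complex entries
T = Q @ np.diag(lams) @ Q.T                        # a normal matrix

U = Q @ np.diag(np.sqrt(lams)) @ Q.T               # principal square roots
assert np.allclose(U @ U, T)                       # U^2 = T
assert np.allclose(U @ U.conj().T, U.conj().T @ U) # U is normal
```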

d)

d.1)

($\Rightarrow$) Suppose $T$ is invertible, so $\ker(T) = \{0\}$. By contradiction, suppose some $\lambda_j = 0$, and take a non-zero eigenvector $x$ with $T(x) = \lambda_j x = 0$. Then $x \in \ker(T)$, so $x = 0$, a contradiction.

($\Leftarrow$) Conversely, if every $\lambda_i \neq 0$, set $S = \lambda_1^{-1} T_1 + \dots + \lambda_k^{-1} T_k$. Using $T_i T_j = 0$ for $i \neq j$, $T_i^2 = T_i$, and $T_1 + \dots + T_k = I$, we get $$ST = TS = T_1 + \dots + T_k = I,$$ so $T$ is invertible.
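The spectral decomposition also hands you the inverse explicitly when every $\lambda_i \neq 0$. A numpy sketch with a hypothetical example matrix of my own:

```python
import numpy as np

# S = sum_i lam_i^{-1} T_i satisfies ST = TS = I.
T = np.array([[2.0, 1.0], [1.0, 2.0]])      # normal, eigenvalues 1 and 3
lams, Q = np.linalg.eigh(T)
Ts = [np.outer(Q[:, i], Q[:, i]) for i in range(2)]   # spectral projectors

S = sum(P / lam for lam, P in zip(lams, Ts))
assert np.allclose(S @ T, np.eye(2))
assert np.allclose(T @ S, np.eye(2))
```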

d.2)

$\Rightarrow$

Take $T$ a projection onto $W$; then there exists $W'$ such that $V = W \oplus W'$.

Write $x = x_1 + x_2$ with $x_1 \in W$, $x_2 \in W'$, so that $$T(x) = x_1$$

Now, take $\lambda$ eigenvalue of $T$, then $\exists x \neq 0$ such that $$T(x) = \lambda x = \lambda (x_1 + x_2)$$

So, we have that $$x_1 = \lambda (x_1 + x_2) \iff (1 - \lambda) x_1 = \lambda x_2$$ The left side lies in $W$ and the right side in $W'$, so both are $0$: $(1-\lambda)x_1 = 0$ and $\lambda x_2 = 0$. Since $x \neq 0$, at least one of $x_1, x_2$ is non-zero, which forces $\lambda = 1$ or $\lambda = 0$.

$\Leftarrow$

Conversely, suppose every $\lambda_i$ is $0$ or $1$. Dropping the terms with $\lambda_i = 0$, $T$ is the sum of the $T_i$ with $\lambda_i = 1$. Since $T_i T_j = 0$ for $i \neq j$ and $T_i^2 = T_i$, this sum satisfies $T^2 = T$, so $T$ is a projection.
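Finally, the forward direction of d.2) can be checked numerically: an orthogonal projection really does have only $0$ and $1$ as eigenvalues. The subspace below is a hypothetical example of my own, assuming numpy:

```python
import numpy as np

# Orthogonal projection onto W = column space of A (a 2-dim subspace of R^3).
A = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
P = A @ np.linalg.inv(A.T @ A) @ A.T     # standard projection formula
assert np.allclose(P @ P, P)             # P is idempotent

lams = np.linalg.eigvalsh(P)             # ascending eigenvalues
assert np.allclose(lams, [0.0, 1.0, 1.0])  # all eigenvalues are 0 or 1
```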