An invariant subspace of a normal operator $T$ is also an invariant subspace of $T^\ast$.


I am struggling with the following question.

Let $V$ be a finite-dimensional inner product space, let $T:V\rightarrow V$ be a normal linear operator ($T^\ast T = TT^\ast$), and let $W$ be an invariant subspace of $T$ ($T(W)\subseteq W$). Prove that $W$ is also an invariant subspace of $T^\ast$.

The problem I have is characterizing a statement like $T^\ast w \in W$ when everything given is in the language of inner products. I thought of decomposing $T^\ast w = u+v$ with $u \in W$, $v\in W^\perp$, and showing $v=0$ via $\left <T^\ast w, v \right >=0$, but $T$ being normal does not help when there is "only one $T$" inside the inner product. Does applying $T$ to both sides help? Then we could use normality, but I don't see where that leads.

I know the statement is true: using the unitary diagonalization, I can express $T^\ast$ as a polynomial in $T$, and from there it's easy ($W$ is invariant under $p(T)$ for any polynomial $p$, since it is invariant under $T$), but I would like to see a more fundamental solution.
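As a sanity check on the polynomial-in-$T$ argument, here is a minimal numerical sketch (assuming NumPy is available; the matrix `T`, the basis `Q`, and the interpolant `p` are illustrative names, not from the question). For a normal matrix with distinct eigenvalues $\lambda_1,\ldots,\lambda_n$, the Lagrange polynomial interpolating $\lambda_i \mapsto \overline{\lambda_i}$ satisfies $p(T) = T^\ast$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Build a random normal matrix T = Q D Q* with (generically distinct)
# eigenvalues: Q unitary from a QR factorization, D diagonal.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
eigs = rng.standard_normal(n) + 1j * rng.standard_normal(n)
T = Q @ np.diag(eigs) @ Q.conj().T

def p(A):
    """Lagrange polynomial interpolating lambda_i -> conj(lambda_i),
    evaluated at the matrix A (requires distinct eigenvalues)."""
    out = np.zeros((n, n), dtype=complex)
    for i, li in enumerate(eigs):
        term = np.conj(li) * np.eye(n, dtype=complex)
        for j, lj in enumerate(eigs):
            if j != i:
                term = term @ (A - lj * np.eye(n)) / (li - lj)
        out += term
    return out

assert np.allclose(T @ T.conj().T, T.conj().T @ T)  # T is normal
assert np.allclose(p(T), T.conj().T)                # p(T) = T*
```

Since $W$ is invariant under every polynomial in $T$, it is in particular invariant under $p(T) = T^\ast$, which is exactly the diagonalization argument above.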

Best answer

$\newcommand{\Tr}{\mathrm{Tr}}$ $\newcommand{\diag}{\mathrm{diag}}$ Suppose $\dim(W) = k$ with $1 \leq k < n$, where $n = \dim(V)$ (the cases $W = \{0\}$ and $W = V$ are trivial). Extend an orthonormal basis $\{\alpha_1, \ldots, \alpha_k\}$ of $W$ to an orthonormal basis $\{\alpha_1, \ldots, \alpha_n\}$ of $V$. Since $T(W) \subseteq W$, the matrices of $T$ and $T^*$ in this basis have the block forms
\begin{align*}
& T(\alpha_1, \ldots, \alpha_n) = (\alpha_1, \ldots, \alpha_n)\begin{pmatrix} A_{11} & A_{12} \\ 0 & A_{22} \end{pmatrix}; \\
& T^*(\alpha_1, \ldots, \alpha_n) = (\alpha_1, \ldots, \alpha_n)\begin{pmatrix} A_{11}^* & 0 \\ A_{12}^* & A_{22}^* \end{pmatrix}.
\end{align*}

Since $T$ is normal, we have
\begin{align*}
\begin{pmatrix} A_{11} & A_{12} \\ 0 & A_{22} \end{pmatrix} \begin{pmatrix} A_{11}^* & 0 \\ A_{12}^* & A_{22}^* \end{pmatrix} = \begin{pmatrix} A_{11}^* & 0 \\ A_{12}^* & A_{22}^* \end{pmatrix} \begin{pmatrix} A_{11} & A_{12} \\ 0 & A_{22} \end{pmatrix}.
\end{align*}
Comparing the upper-left blocks of this equation gives
$$A_{11}A_{11}^* + A_{12}A_{12}^* = A_{11}^*A_{11}.$$
Taking traces and using $\Tr(A_{11}A_{11}^*) = \Tr(A_{11}^*A_{11})$, we obtain $\Tr(A_{12}A_{12}^*) = 0$. Since $\Tr(A_{12}A_{12}^*) = \sum_{i,j} |(A_{12})_{ij}|^2$, this forces $A_{12} = 0$. Therefore
\begin{align*}
& T(\alpha_1, \ldots, \alpha_n) = (\alpha_1, \ldots, \alpha_n)\diag(A_{11}, A_{22}); \\
& T^*(\alpha_1, \ldots, \alpha_n) = (\alpha_1, \ldots, \alpha_n)\diag(A_{11}^*, A_{22}^*),
\end{align*}
so $W$ is also $T^*$-invariant. This completes the proof.
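The conclusion can also be checked numerically. Below is a minimal sketch (assuming NumPy is available; `T`, `P`, and `Uw` are illustrative names): for a normal matrix $T$ and a $T$-invariant subspace $W$ with orthogonal projection $P$ onto $W$, invariance of $W$ under $T$ reads $TP = PTP$, and the proof above says the same identity must then hold for $T^*$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 5, 2

# Random normal T = Q D Q*; take W = span of the first k columns of Q,
# which is T-invariant (it is spanned by eigenvectors of T).
Q, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
D = np.diag(rng.standard_normal(n) + 1j * rng.standard_normal(n))
T = Q @ D @ Q.conj().T
Uw = Q[:, :k]              # orthonormal basis of W
P = Uw @ Uw.conj().T       # orthogonal projection onto W

# W is invariant under T: applying T then projecting back changes nothing.
assert np.allclose(T @ P, P @ T @ P)
# As the proof shows, W is then invariant under T* as well.
assert np.allclose(T.conj().T @ P, P @ T.conj().T @ P)
```

In the basis used in the proof, $TP = PTP$ is exactly the statement that the lower-left block of the matrix of $T$ vanishes, and the second assertion corresponds to $A_{12} = 0$.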