If $N$ commutes with $T$, it commutes with the conjugate transpose of $T$


Given a normal linear map $ N : V \to V $ and a linear map $ T : V \to V$ with $ TN = NT$, prove that $$T^H N = NT^H.$$

What I tried:

I used the given to derive the following equations: $$ 1. \ TN = NT $$ $$ 2. \ N^HN = NN^H $$ and I verified the claim in the special case $TN=NT=0$,

but I couldn't figure out how to prove it in general. Any help?

Best answer:

Put $n = \dim V$.

Recall that, since $N$ is normal, there are a unitary matrix $U$ and a diagonal matrix $M$ such that $M = U^H N U$ (this is the spectral theorem; see the Wikipedia page on normal matrices). $U$ being unitary means $UU^H = E = U^HU$, where $E$ denotes the identity matrix.
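This unitary diagonalization is easy to check numerically. A small NumPy sketch (the construction of $N$ as $UMU^H$ from a random unitary $U$ and a chosen diagonal $M$ is illustrative, not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4  # illustrative dimension

# Build a random unitary U: the Q factor of a complex Gaussian matrix is unitary.
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
U, _ = np.linalg.qr(A)
E = np.eye(n)
assert np.allclose(U @ U.conj().T, E) and np.allclose(U.conj().T @ U, E)

# Pick a diagonal M and set N = U M U^H; every such N is normal, and
# conversely every normal N arises this way (spectral theorem).
M = np.diag(rng.standard_normal(n) + 1j * rng.standard_normal(n))
N = U @ M @ U.conj().T

assert np.allclose(N.conj().T @ N, N @ N.conj().T)  # N^H N = N N^H
assert np.allclose(U.conj().T @ N @ U, M)           # M = U^H N U
```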

Also, we can assume that $M$ has the form $$ (i) \qquad \qquad \qquad M = \operatorname{diag}(\delta_1, \ldots, \delta_1,\ \delta_2, \ldots, \delta_2,\ \ldots,\ \delta_m, \ldots, \delta_m), $$ where $\operatorname{diag}(\cdots)$ denotes a diagonal matrix with diagonal entries $(\cdots)$ in the given order and $\delta_1, \ldots, \delta_m$ are pairwise distinct. In other words, we have arranged equal diagonal entries of $M$ into blocks; in particular, $m \leq n$. (For a rigorous proof of this, we must use the fact that permutation matrices are unitary, but I'll gloss over this here. :-) Moreover, let's write $\mu_1$ for the number of $\delta_1$s in (i), $\mu_2$ for the number of $\delta_2$s in (i), and so on.

Put $S = U^H T U$. Then $NT = TN$ implies $SM = MS$: indeed, $$ SM = (U^H T U)(U^H N U) = U^H TN\, U = U^H NT\, U = (U^H N U)(U^H T U) = MS. $$
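This step can also be sanity-checked numerically. In the sketch below, $T$ is taken to be a polynomial in $N$, which is just one convenient way to manufacture a $T$ that commutes with $N$ (an assumption of this check, not of the proof):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# As before: N = U M U^H with U unitary and M diagonal.
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
U, _ = np.linalg.qr(A)
M = np.diag(rng.standard_normal(n) + 1j * rng.standard_normal(n))
N = U @ M @ U.conj().T

# A polynomial in N always commutes with N (one easy source of examples).
T = 2 * (N @ N) + 3 * N + np.eye(n)

S = U.conj().T @ T @ U
assert np.allclose(T @ N, N @ T)  # TN = NT ...
assert np.allclose(S @ M, M @ S)  # ... hence SM = MS
```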

Now, let's consider the matrices $S$ and $M$. For $j \in \{ 1, \ldots, n \}$, denote by $e_j$ the column vector with $1$ in the $j$th coordinate and $0$ elsewhere. Since $M$ is diagonal, $e_j$ is an eigenvector of $M$ with eigenvalue, say, $\lambda_j$. Of course, $\lambda_j$ is one of the $\delta_k$s from (i) above.

For $k \in \{1, \ldots, m\}$, let $W_{\delta_k}$ be the subspace of $V$ spanned by the set $\{ e_j \mid \lambda_j = \delta_k \}$. Then, the following properties are not so difficult to check (please do that).

(ii) For $k \in \{1, \ldots, m\}$: $W_{\delta_k} \neq 0$.

(iii) $V = \oplus_{k=1}^m W_{\delta_k}.$

(iv) For $k \in \{1, \ldots, m\}$: $W_{\delta_k}$ is the eigenspace of $M$ for the eigenvalue $\delta_k$.

(Please check these. (ii) and (iii) are "obvious", (iv) requires a little bit of work).

Now we make use of $SM = MS$. Pick a $w \in W_{\delta_k}$. Then $$ M(Sw) = (MS)w = (SM)w = S(Mw) = S(\delta_k w) = \delta_k (Sw). $$ So, $Sw$ is either $0$ or an eigenvector of $M$ for the eigenvalue $\delta_k$. Using (iv) above, we conclude $Sw \in W_{\delta_k}$, or, in symbols, $$ (v) \qquad \qquad \qquad \qquad \qquad S(W_{\delta_k}) \subseteq W_{\delta_k}. $$ Now, from (i), (iii), (v), and the definition of $W_{\delta_k}$, we conclude that $S$ must be a block diagonal matrix, where the first block has dimension $\mu_1$, the second block has dimension $\mu_2$, and so on.
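The entrywise form of this argument is also easy to test: $(SM - MS)_{ij} = S_{ij}(\lambda_j - \lambda_i)$, so commuting with $M$ is exactly the block-diagonal condition. A NumPy illustration (the eigenvalues and block sizes below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)

lam = np.array([1 + 2j, 1 + 2j, 3 - 1j, 3 - 1j])  # two blocks: mu_1 = mu_2 = 2
M = np.diag(lam)

# mask[i, j] is True exactly when lambda_i == lambda_j, i.e. inside the blocks.
mask = lam[:, None] == lam[None, :]

S_full = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
S_block = S_full * mask  # keep only the within-block entries

# (SM - MS)_{ij} = S_{ij} (lambda_j - lambda_i): the block-diagonal part
# commutes with M, while a generic full matrix does not.
assert np.allclose(S_block @ M, M @ S_block)
assert not np.allclose(S_full @ M, M @ S_full)
```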

But then $S^H$ is also a block diagonal matrix with blocks of the same dimensions (conjugate-transposing each diagonal block leaves it in place).

But then, since by (i) $M$ is a scalar multiple of the identity on each block, it's easy to check by direct calculation that we have $$ S^HM = MS^H. $$

This implies $$ (vi) \qquad \qquad \qquad (US^HU^H)(UMU^H) = US^HMU^H = UMS^HU^H = (UMU^H)(US^HU^H). $$

Using the definitions of $M$ and $S$, we find $$ (vii) \qquad \qquad \qquad UMU^H = N $$ and $$ (viii) \qquad \qquad \qquad US^HU^H = (USU^H)^H = T^H. $$

Putting (vi), (vii), and (viii) together, we have $$ T^H N = N T^H, $$ as desired.
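Finally, the whole statement (a finite-dimensional case of Fuglede's theorem) can be tested end to end: build $N = UMU^H$ with repeated eigenvalues and $T = USU^H$ with a matching block-diagonal $S$, so that $TN = NT$ holds by construction, and check $T^H N = N T^H$. The specific eigenvalues and block sizes are arbitrary choices for this sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

# Random unitary U via QR of a complex Gaussian matrix.
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
U, _ = np.linalg.qr(A)

# M with two repeated eigenvalues (blocks of size 2), as in (i).
M = np.diag(np.array([1 + 2j, 1 + 2j, 3 - 1j, 3 - 1j]))

# S block diagonal with matching 2x2 blocks, so SM = MS.
S = np.zeros((n, n), dtype=complex)
S[:2, :2] = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
S[2:, 2:] = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))

N = U @ M @ U.conj().T  # normal by construction
T = U @ S @ U.conj().T  # commutes with N, but need not be normal itself

assert np.allclose(T @ N, N @ T)                    # TN = NT
assert np.allclose(T.conj().T @ N, N @ T.conj().T)  # T^H N = N T^H
```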