When do similarity transformations preserve Hermitianity?


Let $H$ be a Hermitian matrix ($H=H^\dagger = (H^T)^*$) and $S$ be an invertible matrix. Denote $\tilde{H} = S^{-1}HS$.

Questions:

(a) What conditions on $S$ ensure that the matrix $\tilde{H}$ is Hermitian for arbitrary $H$?

(b) If $H$ is fixed, what joint conditions on both $H$ and $S$ ensure that the matrix $\tilde{H}$ is Hermitian?

In both cases, there is at least one clear sufficient condition. If $S$ is a multiple of a unitary matrix ($S=\lambda U$, $\lambda\in\mathbb{C}\setminus\{0\}$, $UU^\dagger = I$), then $\tilde{H}$ is always Hermitian. I conjecture this is also necessary in case (a), but not in case (b) (where $H=0$ gives an obvious counterexample). However, I haven't been able to prove this.

As a bonus question, do the answers to these questions change in any meaningful way if $H$ is instead a self-adjoint operator acting on an infinite-dimensional Hilbert space?

Apologies if this has been answered somewhere else, I haven't been able to find it. Thank you in advance!

Accepted answer:

Note that $\tilde H$ is Hermitian if and only if $$ (S^{-1}HS)^\dagger = S^{-1}HS \iff\\ S^\dagger H S^{-\dagger} = S^{-1}HS \iff\\ SS^\dagger H = HSS^\dagger. $$ Regarding question (a): writing $M = SS^\dagger$, we see that $S^{-1}HS$ is Hermitian for arbitrary Hermitian $H$ if and only if $HM = MH$ holds for all Hermitian matrices $H$. Because every matrix $A$ can be written in the form $A = H + iK$ for Hermitian matrices $H,K$, we can conclude that $MA = AM$ for arbitrary matrices $A$.
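The equivalence $\tilde H = \tilde H^\dagger \iff SS^\dagger H = HSS^\dagger$ is easy to check numerically. A quick NumPy sanity check (variable names are my own; the random matrices are just illustrative instances):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# A random invertible S and a random Hermitian H.
S = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
H = (A + A.conj().T) / 2                        # Hermitian by construction

H_tilde = np.linalg.inv(S) @ H @ S
M = S @ S.conj().T

# tilde(H) is Hermitian  <=>  M = S S^dagger commutes with H.
lhs = bool(np.allclose(H_tilde, H_tilde.conj().T))
rhs = bool(np.allclose(M @ H, H @ M))
print(lhs == rhs)                               # True: the two conditions agree

# Sanity check of the sufficient condition: S a multiple of a unitary.
Q, _ = np.linalg.qr(S)                          # Q is unitary
S2 = 2.0 * Q
H_tilde2 = np.linalg.inv(S2) @ H @ S2
print(np.allclose(H_tilde2, H_tilde2.conj().T))  # True
```

For a generic random $S$ both conditions fail, and for $S$ a scalar multiple of a unitary both hold, consistent with the algebra above.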

It is well known that if $AM = MA$ holds for all matrices $A$, then $M$ must be a multiple of the identity. That is, we have $SS^\dagger = \lambda I$ for some $\lambda \in \Bbb C$; because $S$ is invertible, $SS^\dagger$ is positive definite, which implies that $\lambda > 0$. With that, we can conclude that $\frac 1{\sqrt{\lambda}}S$ is a unitary matrix.
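To make the necessity concrete, here is a small hand-picked counterexample (my own choice of matrices) showing that an invertible $S$ which is not a multiple of a unitary fails for some Hermitian $H$:

```python
import numpy as np

# S is invertible but S S^dagger = diag(1, 4) is not a multiple of I.
S = np.diag([1.0, 2.0])
H = np.array([[0.0, 1.0],
              [1.0, 0.0]])                     # Hermitian

H_tilde = np.linalg.inv(S) @ H @ S
print(H_tilde)
# [[0.  2. ]
#  [0.5 0. ]]  -- not Hermitian, as the argument predicts
print(np.allclose(H_tilde, H_tilde.conj().T))  # False
```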

Regarding (b): if $H$ is fixed, we can apply a unitary change of basis so that, without loss of generality, $H$ is a real diagonal matrix. In particular, we may suppose that $$ H = \pmatrix{\lambda_1 I_{k_1} \\ & \ddots \\ && \lambda_m I_{k_m}}, $$ where $I_k$ denotes the identity matrix of size $k$ and $\lambda_1,\dots,\lambda_m \in \Bbb R$ are distinct. Because $M = SS^\dagger$ commutes with $H$, we can show that $M$ must have the block-diagonal form $$ M = SS^\dagger = \pmatrix{A_1 & \\ & \ddots\\ & & A_{m}}, $$ where each $A_j$ is square of size $k_j \times k_j$. If we divide $S$ into block rows, $$ S = \pmatrix{S_1 \\ \vdots \\ S_m}, $$ then the equation above tells us that $S_jS_l^\dagger = 0$ holds for all pairs $j \neq l$. Note that if $k_j = 1$ for $j = 1,\dots,m$, then this merely tells us that the rows of $S$ are mutually orthogonal; it does not guarantee that the rows have the same length (which would make $S$ a multiple of a unitary matrix).
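For instance (a construction of my own illustrating the case with all $k_j = 1$): take $S = DV$ with $V$ unitary and $D$ a positive diagonal matrix with distinct entries. The rows of $S$ are mutually orthogonal but have different lengths, so $S$ is not a multiple of a unitary, yet $S^{-1}HS = V^\dagger D^{-1} H D V = V^\dagger H V$ is Hermitian because $D$ commutes with the diagonal $H$:

```python
import numpy as np

rng = np.random.default_rng(1)
H = np.diag([1.0, 2.0, 3.0])     # fixed H with distinct eigenvalues (all k_j = 1)

# S = D V: mutually orthogonal rows of *different* lengths, so S is
# invertible but not a multiple of a unitary matrix.
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
V, _ = np.linalg.qr(A)           # a unitary matrix
D = np.diag([1.0, 5.0, 0.1])
S = D @ V

H_tilde = np.linalg.inv(S) @ H @ S
print(np.allclose(H_tilde, H_tilde.conj().T))  # True: Hermitian despite S not unitary
```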

The answer to (a) does not meaningfully change in the infinite-dimensional setting. The answer to (b) probably does, because we can no longer "diagonalize" operators in the same way.