Given matrices H, K that are Hermitian, with K=AH, H nonnegative, is A necessarily normal?


Given Hermitian matrices $H$ and $K$ related by $AH=K$, $H$ nonnegative, is it necessarily true that $A$ is normal? $H$ may be assumed to be positive if that helps.

Edit: Additional answer when $K$ and $H$ are both positive definite Hermitian.

Since $K=AH$ is Hermitian, $AH=K=K^*=HA^*$, so we have $AH=HA^*$.

$A=KH^{-1}$ is invertible and has no eigenvalue on the negative real axis. In fact all of the eigenvalues of $A$ must be positive real, since $H^{-1/2}AH^{1/2}=H^{-1/2}KH^{-1/2}$ is positive definite. Therefore we can define a principal square root $A^{1/2}$ (via the Schur decomposition $A=SUS^*$, with $S$ unitary and $U$ upper triangular). Similarly we obtain $(A^*)^{1/2}$.

Now we may form

$K_1=A^{1/2}K(A^*)^{-1/2}=A^{1/2}H(A^*)^{1/2}$

which is Hermitian positive definite, since $(A^*)^{1/2}=(A^{1/2})^*$ for the principal root, so $K_1=A^{1/2}H(A^{1/2})^*$ is congruent to $H$.
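As a numerical sanity check of this construction (a NumPy sketch; the particular $H$ and $K$ below are just an arbitrary positive definite pair, not from the question), one can verify that $A=KH^{-1}$ has positive real eigenvalues and that $K_1$ comes out Hermitian positive definite:

```python
import numpy as np

# Hypothetical positive definite pair (illustration only); A = K H^{-1} solves AH = K.
H = np.array([[1.0, 0.0], [0.0, 2.0]])
K = np.array([[2.0, 1.0], [1.0, 1.0]])
A = K @ np.linalg.inv(H)

# A is similar to H^{-1/2} K H^{-1/2} > 0, so its eigenvalues are positive real.
w = np.linalg.eigvals(A)
assert np.all(np.abs(w.imag) < 1e-12) and np.all(w.real > 0)

def principal_sqrt(M):
    # Principal square root of a diagonalizable matrix with positive eigenvalues
    # (the text uses a Schur-based construction; an eigendecomposition suffices here).
    vals, vecs = np.linalg.eig(M)
    return (vecs @ np.diag(np.sqrt(vals)) @ np.linalg.inv(vecs)).real

A_half = principal_sqrt(A)
Astar_half = A_half.conj().T          # (A^*)^{1/2} = (A^{1/2})^* for the principal root

# K_1 = A^{1/2} K (A^*)^{-1/2} = A^{1/2} H (A^*)^{1/2}, Hermitian positive definite
K1 = A_half @ K @ np.linalg.inv(Astar_half)
assert np.allclose(K1, A_half @ H @ Astar_half)
assert np.allclose(K1, K1.conj().T)
assert np.all(np.linalg.eigvalsh(K1) > 0)
```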

Any two Hermitian positive definite matrices $H_1$ and $H_2$ are congruent via a congruence

$H_2=TH_1T^*=TH_1T$

with $T$ Hermitian. (Proof below.)

Thus we have $K_1=THT^*=THT$ with $T$ Hermitian and positive. Combining, we obtain

$THT=A^{1/2}H(A^*)^{1/2}$

Let $S=A^{-{1/2}}T$. Then

$SHS^*=H$

Let $H=LL^*$ be the unique Cholesky decomposition of $H$. Then $(SL)(SL)^*=LL^*$, and if $SL$ is again a lower triangular factor with positive diagonal, uniqueness gives $SL=L$. Since $L$ is invertible, we must then have $S=I$, hence $T=A^{1/2}$ and finally $A=T^2$.

Thus $A=T^2$ with $T$ Hermitian, so $A$ is itself Hermitian and in particular normal.

The above argument also proves uniqueness of the matrix $T$ in the Lemma below.

Lemma:

Let $A$ and $B$ be positive definite Hermitian. Then there exists a Hermitian positive definite $T$ with $B=TAT$.

Proof

$B=\Omega_B^2$, where $\Omega_B$ denotes the positive square root of $B$.

Let

$C=\Omega_BA\Omega_B = \Omega_C^2$, with $\Omega_C$ the positive square root of $C$.

Then

$I=\Omega_C^{-1}C\Omega_C^{-1}$

$B=\Omega_B\Omega_C^{-1}\Omega_BA\Omega_B\Omega_C^{-1}\Omega_B = TAT$, with $T=\Omega_B\Omega_C^{-1}\Omega_B$ Hermitian and positive definite. $\square$
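The lemma, and the formula $T=\Omega_B\Omega_C^{-1}\Omega_B$, is easy to test numerically. A NumPy sketch with an arbitrary positive definite pair (the matrices are illustrative, not from the question):

```python
import numpy as np

def psd_sqrt(M):
    # Positive square root of a Hermitian positive definite matrix
    vals, vecs = np.linalg.eigh(M)
    return vecs @ np.diag(np.sqrt(vals)) @ vecs.conj().T

# Hypothetical positive definite pair; any such pair works
A = np.array([[2.0, 1.0], [1.0, 3.0]])
B = np.array([[4.0, -1.0], [-1.0, 1.0]])

Ob = psd_sqrt(B)                      # Omega_B
Oc = psd_sqrt(Ob @ A @ Ob)            # Omega_C, where C = Omega_B A Omega_B
T = Ob @ np.linalg.inv(Oc) @ Ob       # T = Omega_B Omega_C^{-1} Omega_B

assert np.allclose(T, T.conj().T)           # T is Hermitian
assert np.all(np.linalg.eigvalsh(T) > 0)    # T is positive definite
assert np.allclose(T @ A @ T, B)            # B = T A T
```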

**************** OLD ANSWER *********

Clearly, if $A$ is a real function of $H$ defined on the spectrum of $H$, then $AH$ is Hermitian and the equation $AH=K$ can hold. It also holds if $A=0$, $A=I$, or $A$ is any real scalar matrix.
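For instance, taking $A=e^H$ for a random Hermitian $H$ (any real function of $H$ would do), a quick NumPy check confirms that $K=AH$ is Hermitian and that $A$ is normal:

```python
import numpy as np

# A random real symmetric (Hermitian) H, and A = exp(H), a real function of H.
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 3))
H = X + X.T

vals, vecs = np.linalg.eigh(H)
A = vecs @ np.diag(np.exp(vals)) @ vecs.T   # A = f(H) with f = exp

K = A @ H
assert np.allclose(K, K.T)                  # K = AH is Hermitian
assert np.allclose(A @ A.T, A.T @ A)        # and A is normal (here even Hermitian)
```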

One approach I have tried is, starting with $AH=HA^*$, to deduce that $(Hw,A^*v)=(A^*w,Hv)$ for all vectors $v$ and $w$. Let $v_1$ be an eigenvector of $H$ with eigenvalue $\lambda$. Then $AHv_1=\lambda Av_1=HA^*v_1$. Then

$(A^*v_1,HA^*v_1)=\lambda (A^*v_1,Av_1)=(Hv_1,A^2v_1)$

Rewriting this as quadratic forms with kernel $H$, we get

$(v_1,AA^*v_1)_H=(v_1,A^2v_1)_H$

for all eigenvectors of $H$. Evidently, the subscript $H$ on the inner product can be dropped.

Similarly, with $v_2$ a second eigenvector of $H$, of eigenvalue $\lambda_2$ (writing $\lambda_1$ for the eigenvalue of $v_1$),

$(A^*v_2,HA^*v_1)=\lambda_1(A^*v_2,Av_1)=(\lambda_1/\lambda_2)(Hv_2,A^2v_1)$

Rewriting this as quadratic forms with kernel $H$, we get

$(v_2,AA^*v_1)_H=(\lambda_1/\lambda_2)(v_2,A^2v_1)_H$

These equations are suggestive, but I don't think it is enough.

Another available fact is that, for admissible functions $f(X)$, $X$ a matrix, we have

$f(A)H=Hf(A^*)$.
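For a real polynomial $f$ this follows by induction from $A^kH=H(A^*)^k$. A quick NumPy check on a small (hypothetical) pair satisfying $AH=HA^*$:

```python
import numpy as np

# A small pair with AH = HA^* (H positive definite, A not normal)
A = np.array([[0.0, 0.5], [1.0, 0.0]])
H = np.diag([1.0, 2.0])
assert np.allclose(A @ H, H @ A.T)          # AH = HA^*

# For a real polynomial f, f(A)H = H f(A^*), by induction from A^k H = H (A^*)^k
fA  = A @ A + 2 * A + 3 * np.eye(2)         # f(z) = z^2 + 2z + 3
fAs = A.T @ A.T + 2 * A.T + 3 * np.eye(2)   # f(A^*)
assert np.allclose(fA @ H, H @ fAs)
```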

So, if the function $f(z)=z^*$ (complex conjugate) is defined for $A$ we can repeat the above with $A$ replaced by $A^*$.

This leads to similarity relations between $A$ and $A^*$.

$HAH^{-1}=A^*$

$HA^*H^{-1}=A$

$H^2AH^{-2}=A$

This gives the Sylvester equation

$H^2A-AH^2=0$, which of course has many solutions, given by

$A_{i}=v_iv_i^*$, where the $v_i$ are eigenvectors of $H$.

Any linear combination of these solutions is simply a function of the matrix $H$, which I already know is a valid candidate for $A$. The question is whether this is necessarily the case.

I appreciate any advice.

3 Answers

Answer 1

Example...
$$ K = \begin{bmatrix} 1 & 0\\0 &0 \end{bmatrix},\quad H = \begin{bmatrix} 0 & 1\\1 &0 \end{bmatrix},\quad A = \begin{bmatrix} 0 & 1\\0 &0 \end{bmatrix} . $$ Then $K,H$ are Hermitian and $K=AH$, but $A$ is not normal. Here $H$ is not positive.


Example where $H$ is positive (but $K$ is not):
$$ H = \begin{bmatrix} 1 & 0\\0 &2 \end{bmatrix},\quad K = \begin{bmatrix} 0 & 1\\1 &0 \end{bmatrix},\quad A = \begin{bmatrix} 0 & 1/2\\1 &0 \end{bmatrix} . $$
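Both counterexamples are easy to verify in NumPy:

```python
import numpy as np

def is_normal(M):
    return np.allclose(M @ M.conj().T, M.conj().T @ M)

# First example: H Hermitian but not positive
K = np.array([[1.0, 0.0], [0.0, 0.0]])
H = np.array([[0.0, 1.0], [1.0, 0.0]])
A = np.array([[0.0, 1.0], [0.0, 0.0]])
assert np.allclose(A @ H, K) and np.allclose(K, K.T)
assert not is_normal(A)

# Second example: H positive definite, K Hermitian but indefinite
H2 = np.array([[1.0, 0.0], [0.0, 2.0]])
K2 = np.array([[0.0, 1.0], [1.0, 0.0]])
A2 = np.array([[0.0, 0.5], [1.0, 0.0]])
assert np.allclose(A2 @ H2, K2)
assert np.all(np.linalg.eigvalsh(H2) > 0)   # H2 is positive definite
assert not is_normal(A2)
```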

Answer 2

In general, you can take $a,b> 0$ distinct. Fix $r,q> 0$ such that $ar=bq$; since $a\neq b$, this forces $r\neq q$. Let $$A=\begin{pmatrix}p\ & q \\ r & s\end{pmatrix},$$ $$H=\begin{pmatrix} a & 0 \\ 0 & b\end{pmatrix},$$ and let $$K=AH=\begin{pmatrix}ap & bq \\ ar & bs \end{pmatrix}.$$ Then $K$ is Hermitian because we chose $ar=bq$. But $A$ is not normal, because the $(1,1)$ entry of $AA^*$ is $p^2+q^2$, while the $(1,1)$ entry of $A^*A$ is $p^2+r^2$.

For concreteness, $$A=\begin{pmatrix} 0 & 2\\ 1& 0\end{pmatrix},$$ $$H=\begin{pmatrix} 2 & 0 \\ 0 & 1 \end{pmatrix},$$ $$K=\begin{pmatrix}0 & 2 \\ 2 & 0\end{pmatrix},$$ $$A^*A=\begin{pmatrix} 1& 0 \\ 0 & 4\end{pmatrix},$$ and $$AA^*=\begin{pmatrix}4 & 0 \\ 0 & 1\end{pmatrix}.$$
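Checking these concrete matrices in NumPy:

```python
import numpy as np

A = np.array([[0.0, 2.0], [1.0, 0.0]])
H = np.array([[2.0, 0.0], [0.0, 1.0]])
K = A @ H
assert np.allclose(K, np.array([[0.0, 2.0], [2.0, 0.0]]))  # K = AH is Hermitian
assert np.allclose(A.T @ A, np.diag([1.0, 4.0]))           # A^* A
assert np.allclose(A @ A.T, np.diag([4.0, 1.0]))           # A A^*, so A is not normal
```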

In this example, $H$ scales the $x$-axis by $2$ and leaves the $y$-axis alone (Hermitian). $A$ doubles the $y$-coordinate and then swaps $x$ and $y$. So the composition $AH$ doubles both coordinates and then swaps them (also Hermitian). $A^*$ doubles the $x$-coordinate and then swaps. Both $AA^*$ and $A^*A$ therefore swap twice, returning each axis to itself; but one doubles $x$ before the swaps and $y$ between them (so the original $x$ ends up doubled twice), while the other has the roles of $x$ and $y$ reversed.

Answer 3

Proposed (partial) answer.

Let $H$ have distinct positive eigenvalues, with orthonormal eigenvectors $v_i$. Then a basis for the space of matrices can be taken to be the set of outer products $v_iv_j^*$.

The null space of the operator $L(A)=H^2A - AH^2$ is spanned by the diagonal basis elements $B_{ii}=v_iv_i^*$. I believe that the other elements cannot be combined to create a null vector, so the only null vectors (i.e., matrices $A$ being sought) are linear combinations of the "diagonal" basis elements, and hence functions of $H$.

If a linear combination of the non-diagonal basis vectors $B_{ij}=v_iv_j^*$, $i\neq j$, say

$\sum_{i\neq j} c_{ij}B_{ij},$

were a null vector of the operator $L$, this would contradict the linear independence of the basis vectors, since $L(B_{ij})=(\lambda_i^2-\lambda_j^2)B_{ij}$ with $\lambda_i^2\neq\lambda_j^2$.
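The dimension count behind this can be checked numerically: vectorizing, $L(A)=H^2A-AH^2$ becomes the matrix $I\otimes H^2-(H^2)^T\otimes I$, and for an $H$ with distinct eigenvalues its null space should be exactly $n$-dimensional. A NumPy sketch for $n=3$:

```python
import numpy as np

# H with distinct positive eigenvalues; vectorize L(A) = H^2 A - A H^2.
# With column-major vec, vec(H^2 A - A H^2) = (I ⊗ H^2 - (H^2)^T ⊗ I) vec(A).
n = 3
H = np.diag([1.0, 2.0, 3.0])
H2 = H @ H
L = np.kron(np.eye(n), H2) - np.kron(H2.T, np.eye(n))

null_dim = n * n - np.linalg.matrix_rank(L)
assert null_dim == n   # one null direction per eigenvector: A is diagonal in H's eigenbasis
```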

This partial answer relies on the relation $A^*H=HA$ which requires the conjugation function to be defined for $A$.