Simultaneous Diagonalization of Symmetric Positive Semidefinite Matrices


I'm trying to find a proof of a supposedly simple property of symmetric positive (semi)definite matrices concerning simultaneous diagonalization.

Let $A$ and $B$ be real symmetric matrices, with $A$ positive definite and $B$ positive semidefinite. Call $C=A^{-1}B$.

I know from e.g. Theorem 4.5.15 in Horn–Johnson's Matrix Analysis that there exists a non-singular matrix $S$ such that $SAS^T$ and $SBS^T$ are diagonal if and only if $C$ is diagonalizable, i.e. there exists a non-singular matrix $R$ such that $R^{-1}CR$ is diagonal. In the problem I'm working on I can assume that $C$ is diagonalizable, so I'm sure that the matrix $S$ indeed exists.

Moreover, from Theorem 12.19 in Laub's Matrix Analysis for Scientists and Engineers I know that there exists a non-singular matrix $S$ such that $S^TAS=1$ and $S^TBS=D$, with $D$ a diagonal matrix whose diagonal elements are the eigenvalues of $C$.
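Laub's statement can be checked numerically. A minimal sketch using SciPy's generalized symmetric eigensolver (the matrices $A$ and $B$ below are arbitrary illustrative choices, not from the question):

```python
import numpy as np
from scipy.linalg import eigh

# Illustrative matrices: A symmetric positive definite,
# B symmetric positive semidefinite (rank 1).
A = np.array([[2.0, 1.0], [1.0, 2.0]])
B = np.array([[1.0, 1.0], [1.0, 1.0]])

# eigh(B, A) solves B v = lambda A v; the columns of S are normalized
# so that S^T A S = I, and w contains the eigenvalues of C = A^{-1} B.
w, S = eigh(B, A)

print(np.round(S.T @ A @ S, 10))   # identity
print(np.round(S.T @ B @ S, 10))   # diag(w)
print(np.sort(np.linalg.eigvals(np.linalg.solve(A, B)).real))  # same w
```

Here $S^TAS=1$ and $S^TBS=D$ hold by construction of the generalized eigensolver, and the diagonal of $D$ matches the spectrum of $C=A^{-1}B$.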

I specify that these theorems hold under weaker hypotheses than mine: Horn–Johnson's deals with symmetric $A$ and $B$, while Laub's additionally assumes $A$ to be positive definite, with no remarks on $B$ beyond symmetry.

Now, is it possible to prove that, in the situation I'm working on, $S=R$? Maybe it is a stupid question, but I cannot prove this result in general. The only thing I can derive is that $R^{-1}SDS^{-1}R=D$, but I have examples that make me think that $S=R$ is a general property, while I'm not able to show why. I suspect this can be proven if one includes the positive semidefiniteness of $B$ in the hypotheses of, e.g., Laub's theorem.

Thank you all in advance for your help.

Best answer:

Your guess that $S=R$ is correct whenever $A$ is positive definite and $B$ is symmetric.

Eigendecomposition of $A$: $A = Q \Lambda Q^T$ with $Q$ orthogonal and $\Lambda>0$ diagonal.

We obtain $P^T A P = 1$, i.e. $A^{-1} = P\,P^T$, with $P:=Q\Lambda^{-\frac12}$.

The matrix $P^T B P$ is symmetric. Therefore it admits an eigendecomposition $D = V^T P^T B P V$ with $D$ diagonal and $V$ orthogonal.

We have $1 = V^T \underbrace{P^T A P}_{=1} V$ as required.

Thus $S:= PV = Q\Lambda^{-\frac12}V$ is the sought transformation matrix for the first property.
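The construction above can be sketched numerically with plain NumPy (the matrices $A$ and $B$ are arbitrary illustrative choices):

```python
import numpy as np

# Illustrative matrices: A symmetric positive definite,
# B symmetric positive semidefinite.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
B = np.array([[1.0, 1.0], [1.0, 1.0]])

lam, Q = np.linalg.eigh(A)               # A = Q diag(lam) Q^T
P = Q @ np.diag(lam ** -0.5)             # P^T A P = I
D_vals, V = np.linalg.eigh(P.T @ B @ P)  # P^T B P = V diag(D_vals) V^T
S = P @ V                                # S = Q Lam^{-1/2} V

print(np.round(S.T @ A @ S, 10))         # identity
print(np.round(S.T @ B @ S, 10))         # diagonal D
```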

Now we test whether we can use $R:=S$ for $R^{-1} A^{-1} B R \overset{?}{=} D$.

\begin{align*} S^{-1}A^{-1}BS &= \underbrace{V^T \Lambda^{\frac12} Q^T}_{S^{-1}} \underbrace{Q\Lambda^{-1} Q^T}_{A^{-1}}\, B\, \underbrace{Q \Lambda^{-\frac12} V}_{S}\\ &= V^T \underbrace{\Lambda^{-\frac12}Q^T}_{P^{T}} B \underbrace{Q \Lambda^{-\frac12}}_{P} V\\ &= D \end{align*}
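As a final numerical check (again with arbitrary illustrative $A$ and $B$), the same $S$ built from the recipe above also diagonalizes $C=A^{-1}B$, i.e. $R:=S$ works:

```python
import numpy as np

# Illustrative matrices: A symmetric positive definite,
# B symmetric positive semidefinite.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
B = np.array([[1.0, 1.0], [1.0, 1.0]])

lam, Q = np.linalg.eigh(A)               # A = Q diag(lam) Q^T
P = Q @ np.diag(lam ** -0.5)             # P^T A P = I
D_vals, V = np.linalg.eigh(P.T @ B @ P)
S = P @ V                                # S^T A S = I, S^T B S = diag(D_vals)

C = np.linalg.solve(A, B)                # C = A^{-1} B
print(np.round(np.linalg.solve(S, C @ S), 10))  # S^{-1} C S = diag(D_vals)
```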