How unique is the orthogonal diagonalization of a real symmetric matrix, if we don't change the diagonal matrix of eigenvalues (no permutation)?


Let $A$ be a real symmetric $n\times n$ matrix, so it is orthogonally diagonalizable, i.e. there is $P\in O(n)$ such that $P^{-1}AP=P^{T}AP=D$ (diagonal).

I'm asking myself: how unique can $P$ be? Here are my thoughts, but any theorem stating such degree of uniqueness will be appreciated.

  1. If we take $A=\lambda I_n,\lambda \in \mathbb{R}$ then any $P\in O(n)$ orthogonally diagonalizes $A.$

  2. In this related question, the author gave an example of two different orthogonal matrices,

$I_2= \begin{pmatrix} 1 & 0 \\ 0 & 1 \\ \end{pmatrix} $ and $ \begin{pmatrix} 1 & 0 \\ 0 & -1 \\ \end{pmatrix}, $

both of which are orthogonal and satisfy $P^{-1}AP=P^{T}AP=D$ for the same symmetric matrix $A$ and the same diagonal matrix $D.$

  3. Suppose there are two such (diagonalizing) orthogonal matrices $P,Q\in O(n),$ so that $P^{-1}AP=P^{T}AP=Q^{-1}AQ=Q^{T}AQ=D.$ What can we say about the relation between $P$ and $Q$? Recall how diagonalizing matrices (for any diagonalizable matrix $A$, not just symmetric ones) are constructed: we place the $n$ eigenvectors side by side as columns. So in the general (non-symmetric) case, we can multiply any such eigenvector by a non-zero scalar to obtain yet another diagonalizing matrix. But if the diagonalizing matrix is required to be orthogonal, we cannot multiply by an arbitrary non-zero scalar, since that would not preserve the unit norm of the eigenvector: the scalar must be $\pm 1.$

  4. Since $ \begin{pmatrix} 1 & 0 \\ 0 & -1 \\ \end{pmatrix} $ commutes with any $2\times 2$ diagonal matrix, it is clear that if $P$ is a $2\times 2$ orthogonal diagonalizing matrix with $P^{-1}AP=P^{T}AP=D,$ then so are

$ P\begin{pmatrix} 1 & 0 \\ 0 & -1 \\ \end{pmatrix} $ and

$ P\begin{pmatrix} -1 & 0 \\ 0 & 1 \\ \end{pmatrix}, $

since for such a sign matrix $S$ we have $(PS)^{-1}A(PS)=S^{-1}DS=D.$ For the $n\times n$ case, changing the sign of any column of $P$ likewise produces yet another orthogonal diagonalizing matrix.
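The sign-flip observation can be checked numerically. Below is a minimal sketch using NumPy; the symmetric matrix $A$ is an arbitrary example chosen for illustration, not taken from the question.

```python
import numpy as np

# An arbitrary 2x2 symmetric matrix (hypothetical example).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# Orthogonal diagonalization: columns of P are unit eigenvectors.
eigvals, P = np.linalg.eigh(A)
D = np.diag(eigvals)

# Flip the sign of the second column of P.
S = np.diag([1.0, -1.0])
P2 = P @ S

assert np.allclose(P.T @ A @ P, D)        # P diagonalizes A
assert np.allclose(P2.T @ A @ P2, D)      # so does P2, with the same D
assert np.allclose(P2.T @ P2, np.eye(2))  # and P2 is still orthogonal
```

The key step is that $S$ is diagonal with entries $\pm 1$, so $S^{T}DS=D$.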

So based on the above, my guess is: if two different diagonalizing orthogonal matrices $P,Q$ satisfy $P^{-1}AP=P^{T}AP=Q^{-1}AQ=Q^{T}AQ=D,$ then $P$ and $Q$ must differ from each other by at most $n$ sign changes in columns, at least in most cases (vague, I know!). But case 1 above (when $A=\lambda I_n$) contradicts this. So what can we say about the relation between $P$ and $Q$?

Answer:
If $A$ has no repeated eigenvalues, i.e. if all eigenvalues of $A$ are simple, then each eigenspace of $A$ is one-dimensional, and for each eigenvalue there are only two choices of unit eigenvector. It follows that there are $2^n$ choices of orthonormal eigenbasis when the order of the eigenvalues is prescribed, and $2^nn!$ choices of orthonormal eigenbasis otherwise.

In other words, if the arrangement of diagonal elements in $D$ is fixed, there are $2^n$ orthogonal matrices $Q$ such that $A=QDQ^T$, and any two such diagonalisations $Q_1DQ_1^T=Q_2DQ_2^T$ are related by $Q_2=Q_1S$ for some diagonal matrix $S$ whose diagonal elements are $\pm1$.
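This relation $Q_2=Q_1S$ can be verified numerically. The sketch below uses a hypothetical symmetric tridiagonal matrix (such matrices with non-zero off-diagonal entries always have simple eigenvalues), and constructs a second diagonaliser by flipping column signs.

```python
import numpy as np

# Hypothetical symmetric matrix with simple (distinct) eigenvalues.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 0.0]])

eigvals, Q1 = np.linalg.eigh(A)  # columns of Q1 are unit eigenvectors
D = np.diag(eigvals)

# A second diagonaliser: flip the signs of the last two columns.
S = np.diag([1.0, -1.0, -1.0])
Q2 = Q1 @ S

assert np.allclose(Q1 @ D @ Q1.T, A)  # both are diagonalisations of A
assert np.allclose(Q2 @ D @ Q2.T, A)  # with the same fixed D
assert np.allclose(Q1.T @ Q2, S)      # Q1^T Q2 recovers the +-1 sign matrix
```

Here $Q_1^TQ_2=S$ exhibits exactly the diagonal $\pm1$ matrix from the statement above.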

If the order of diagonal elements in $D$ is not fixed, $A$ has $2^nn!$ orthogonal diagonalisations, and any two such diagonalisations $Q_1D_1Q_1^T=Q_2D_2Q_2^T$ are related by $Q_2=Q_1SP$ and $D_2=P^TD_1P$ for some permutation matrix $P$ and for some diagonal matrix $S$ whose diagonal elements are $\pm1$.
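The permuted case can be checked the same way. In this sketch (again with a hypothetical matrix of simple eigenvalues), we pick an arbitrary permutation matrix $P$ and sign matrix $S$, and verify that $Q_2=Q_1SP$ together with $D_2=P^TD_1P$ is again an orthogonal diagonalisation of $A$.

```python
import numpy as np

# Hypothetical symmetric matrix with simple eigenvalues.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 0.0]])

eigvals, Q1 = np.linalg.eigh(A)
D1 = np.diag(eigvals)

P = np.eye(3)[:, [2, 0, 1]]    # permutation matrix (columns of I reordered)
S = np.diag([-1.0, 1.0, -1.0]) # sign matrix with +-1 diagonal entries

Q2 = Q1 @ S @ P
D2 = P.T @ D1 @ P              # same eigenvalues, reordered on the diagonal

assert np.allclose(Q2 @ D2 @ Q2.T, A)     # still a diagonalisation of A
assert np.allclose(Q2.T @ Q2, np.eye(3))  # Q2 is orthogonal
```

The check works because $SP(P^TD_1P)P^TS^T = SD_1S = D_1$, so all the sign and permutation factors cancel.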

If, instead, $A$ has a repeated eigenvalue, it has infinitely many orthogonal diagonalisations. For instance, if $A=\operatorname{diag}(5,5,-7)$, then $QAQ^T$ is an orthogonal diagonalisation of $A$ for every orthogonal matrix $Q$ of the form $\begin{pmatrix} U_{2\times2} & \\ & 1 \end{pmatrix}$ with $U\in O(2)$.
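This example is easy to verify numerically: any rotation $U$ of the two-dimensional eigenspace of the repeated eigenvalue $5$ yields a new orthogonal diagonaliser. The rotation angle below is arbitrary.

```python
import numpy as np

# A has the repeated eigenvalue 5 (its eigenspace is two-dimensional).
A = np.diag([5.0, 5.0, -7.0])

theta = 0.7  # arbitrary rotation angle
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Block-diagonal Q = diag(U, 1), acting only on the repeated eigenspace.
Q = np.block([[U, np.zeros((2, 1))],
              [np.zeros((1, 2)), np.ones((1, 1))]])

assert np.allclose(Q.T @ Q, np.eye(3))  # Q is orthogonal
assert np.allclose(Q @ A @ Q.T, A)      # Q A Q^T is still the same diagonal matrix
```

The check succeeds because $U(5I_2)U^T=5I_2$: any orthogonal change of basis within an eigenspace leaves the diagonal form unchanged.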