Diagonalize real symmetric matrix


I believe that for any real symmetric matrix $A$, I should always be able to write the eigendecomposition

$$A = P\Lambda P^T \tag{1}$$

regardless of the rank of $A$ (see Real symmetric matrix is never defective and Real symmetric matrix decomposition).

But for the matrix

$$A = \begin{bmatrix} 0.375 & 0 & -0.125 & 0 & -0.25 & 0 \\ 0 & 0.5 & 0 & -0.25 & 0 & -0.25 \\ -0.125 & 0 & 0.375 & 0 & -0.25 & 0 \\ 0 & -0.25 & 0 & 0.5 & 0 & -0.25 \\ -0.25 & 0 & -0.25 & 0 & 0.5 & 0 \\ 0 & -0.25 & 0 & -0.25 & 0 & 0.5 \end{bmatrix}$$

(it is clear that $\det(A)=0$, and in fact $\mathrm{rank}(A)=4$), the Matlab call [P,D] = eig(A) gives $A = PDP^{-1}$ with a $P$ that is not orthogonal, i.e. $P^T \neq P^{-1}$,

where $$P = \begin{bmatrix} 0.5774 & -0.7071 & -0.4082 & -0.0939 & -0.1850 & -0.0074 \\ -0.0000 & -0.0000 & 0.0000 & 0.7945 & -0.5469 & -0.0473 \\ 0.5774 & 0.7071 & -0.4082 & -0.0939 & -0.1850 & -0.0074 \\ 0 & 0 & 0 & -0.4084 & -0.5469 & -0.6822 \\ 0.5774 & 0.0000 & 0.8165 & 0.1878 & -0.1850 & 0.0149 \\ 0 & 0 & 0 & -0.3861 & -0.5469 & 0.7294 \end{bmatrix}$$

$$D = \begin{bmatrix} 0 & 0 & 0 & 0 & 0 & 0 \\ 0 & 0.5 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0.75 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0.75 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 & 0.75 \end{bmatrix}$$

My question: am I misunderstanding the decomposition of a real symmetric matrix in (1), or is this a bug in Matlab's eig() function?


BEST ANSWER

To me, this is a numerical issue rather than a bug.

The matrix $A$ in your question can certainly be diagonalized in the form $A = PDP^T$.

But if we introduce a little floating-point noise, e.g. by forming $$B = \big(A(I+AA^T)\big)(I+AA^T)^{-1},$$ which equals $A$ in exact arithmetic, then [P,D] = eig(A) and [p,d] = eig(B) produce two different answers, and only the perturbed matrix satisfies the claimed decomposition, i.e. $B = p\,d\,p^T$.

Tested on Matlab R2017a win64.
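The same behavior is easy to probe outside Matlab. Here is a sketch in Python/NumPy (my choice of tool, not the asker's) that builds the $A$ from the question and uses numpy.linalg.eigh, the routine intended for symmetric matrices, which does return an orthonormal eigenvector matrix even with repeated eigenvalues:

```python
import numpy as np

# The singular 6x6 symmetric matrix from the question (rank 4).
A = np.array([
    [ 0.375,  0.0,  -0.125,  0.0,  -0.25,  0.0 ],
    [ 0.0,    0.5,   0.0,   -0.25,  0.0,  -0.25],
    [-0.125,  0.0,   0.375,  0.0,  -0.25,  0.0 ],
    [ 0.0,   -0.25,  0.0,    0.5,   0.0,  -0.25],
    [-0.25,   0.0,  -0.25,   0.0,   0.5,   0.0 ],
    [ 0.0,   -0.25,  0.0,   -0.25,  0.0,   0.5 ],
])

# eigh exploits symmetry and always returns orthonormal eigenvectors,
# even within repeated eigenspaces.
w, Q = np.linalg.eigh(A)

print(np.allclose(Q.T @ Q, np.eye(6)))       # Q is orthogonal
print(np.allclose(Q @ np.diag(w) @ Q.T, A))  # A = Q Lambda Q^T
```

Matlab's eig also has a symmetric path internally, but as the output above shows, the safe fix in NumPy is simply to call the symmetric-aware routine.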

ANSWER

For a symmetric matrix, eigenvectors corresponding to different eigenvalues are orthogonal.

But in this case you have repeated eigenvalues, and Matlab has made no effort to give you orthonormal bases for the repeated eigenspaces (as far as I can tell, it doesn't claim to). You can see this by noticing that the first and fifth columns (corresponding to the eigenvalue $0$) are not orthogonal, and neither are the third and fourth columns, which correspond to $0.75$.

Depending on the software, the returned eigenvectors need not even be normalized.

To obtain an orthogonal $P$ you would have to run Gram-Schmidt on the first and fifth columns, and on the third, fourth, and sixth columns, normalizing each column along the way if your software has not already done so.
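One way to carry out that repair programmatically, sketched here in Python/NumPy rather than Matlab (the function name and tolerance are my own choices): group the columns of $P$ by eigenvalue and orthonormalize each group with a thin QR factorization, which is Gram-Schmidt up to signs.

```python
import numpy as np

def orthonormalize_eigenvectors(P, eigvals, tol=1e-8):
    """Orthonormalize the columns of P within each eigenspace.

    Columns whose eigenvalues agree (up to tol) span one eigenspace;
    a thin QR factorization of each group is Gram-Schmidt up to signs.
    """
    Q = np.array(P, dtype=float)
    eigvals = np.asarray(eigvals, dtype=float)
    done = np.zeros(len(eigvals), dtype=bool)
    for i in range(len(eigvals)):
        if done[i]:
            continue
        group = np.where(np.abs(eigvals - eigvals[i]) < tol)[0]
        Q[:, group], _ = np.linalg.qr(Q[:, group])
        done[group] = True
    return Q

# Demo with a symmetric matrix that has the repeated eigenvalue 2 and
# linearly independent but non-orthogonal, non-normalized eigenvectors.
A = np.diag([1.0, 2.0, 2.0])
P = np.array([[5.0, 0.0,  0.0],
              [0.0, 7.0, -1.0],
              [0.0, 8.0,  7.0 / 8.0]])
Q = orthonormalize_eigenvectors(P, [1.0, 2.0, 2.0])
print(np.allclose(Q.T @ Q, np.eye(3)))  # True: Q is orthogonal
```

Applied to the $P$ in the question with eigenvalues $(0, 0.5, 0.75, 0.75, 0, 0.75)$, this orthonormalizes columns $\{1,5\}$ and $\{3,4,6\}$ while leaving the second column alone. Cross-eigenspace orthogonality is automatic for a symmetric matrix, so only the within-group repair is needed.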

Edit: here is a simple example of the same situation with easier numbers. Consider $$ A=\begin{bmatrix} 1&0&0\\0&2&0\\0&0&2\end{bmatrix}. $$ The eigenvalues are $1$ and $2$, the latter with multiplicity two. Here are three linearly independent eigenvectors for this matrix: $$ \begin{bmatrix} 5\\0\\0\end{bmatrix},\ \ \begin{bmatrix} 0\\7\\8\end{bmatrix},\ \ \begin{bmatrix} 0\\ -1\\7/8\end{bmatrix}. $$ But the matrix $$ P=\begin{bmatrix} 5&0&0\\ 0&7&-1\\ 0&8&7/8\end{bmatrix}, $$ analogous to what your software might give you, is far from orthogonal; yet we still have $P^{-1}AP=A$ (in this case $\Lambda=A$, since $A$ is already diagonal).

We obtain an orthogonal matrix if we normalize the first column (so the $5$ becomes a $1$) and do Gram-Schmidt on the second and third, to get $$ Q=\begin{bmatrix} 1&0&0\\0&7/\sqrt{113}&-8/\sqrt{113}\\ 0&8/\sqrt{113}&7/\sqrt{113}\end{bmatrix}. $$ This $Q$ is orthogonal and still satisfies $Q^{-1}AQ=\Lambda=A$. Of course, instead of Gram-Schmidt we could have taken a nicer basis of the eigenspace of $2$ and obtained $$ R=\begin{bmatrix} 1&0&0\\0&1&0\\0&0&1\end{bmatrix}, $$ which trivially satisfies $R^{-1}AR=A$.
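The arithmetic above is easy to double-check by machine; here is a quick verification sketch in Python/NumPy (my own addition, no code accompanied the original example):

```python
import numpy as np

A = np.diag([1.0, 2.0, 2.0])
s = np.sqrt(113.0)

# Q from the example: first column normalized, second and third
# obtained by Gram-Schmidt on [0, 7, 8] and [0, -1, 7/8].
Q = np.array([[1.0, 0.0,    0.0  ],
              [0.0, 7 / s, -8 / s],
              [0.0, 8 / s,  7 / s]])

print(np.allclose(Q.T @ Q, np.eye(3)))  # True: Q is orthogonal
print(np.allclose(Q.T @ A @ Q, A))      # True: Q^{-1} A Q = Lambda = A here
```

Note that here Gram-Schmidt only had to normalize: the vectors $(0,7,8)$ and $(0,-1,7/8)$ are already orthogonal, since $0\cdot 0 + 7\cdot(-1) + 8\cdot(7/8) = 0$.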