Application of Matrix Diagonalization


I'm reading a book about inverse analysis and trying to figure out how the authors do the inversion.

Assume that matrix $C$ is $$ C ~=~ \begin{bmatrix} 88.53 & -33.60 & -5.33 \\ -33.60 & 15.44 & 2.67 \\ -5.33 & 2.67 & 0.48 \end{bmatrix} $$ and at some point authors diagonalize this matrix to calculate matrix $P$ using $$ C^{-1} ~=~ P^{\rm t} L P $$ where $L$ is a diagonal matrix of positive eigenvalues and the columns of $P$ are orthonormal eigenvectors.

The above equation for diagonalizing the inverse of $C$ is a bit different from what is usually used, and therefore I cannot calculate $P$ correctly (i.e., get the same $P$ as the book). It would be great if somebody could show me the way to calculate $P$. $$ P ~=~ \begin{bmatrix} 0.93 & 0.36 & -0.03 \\ -0.36 & 0.90 & -0.23 \\ -0.06 & 0.23 & 0.97 \end{bmatrix} $$



Have you heard of the Jordan Normal Form?

For your matrix:

$$C=\left(\begin{matrix} 88.53 & -33.60 &-5.33\\ -33.60 & 15.44 & 2.67\\ -5.33 & 2.67 & 0.48 \end{matrix}\right)$$

We would find the eigenvalues and the eigenvectors and then diagonalize it such that:

$$C = P J P^{-1}$$

For this matrix, we have:

$\lambda_1 = 101.976, v_1 = (0.929997, -0.362899, -0.0583849)$

$\lambda_2 = 2.47102 , v_2 = (0.366307, 0.901908, 0.228868)$

$\lambda_3 = 0.00312608 , v_3 = (-0.0303981, -0.234233, 0.971705)$

Using these values, we can now diagonalize the matrix $C$.

$$C=\left(\begin{matrix} 88.53 & -33.60 &-5.33\\ -33.60 & 15.44 & 2.67\\ -5.33 & 2.67 & 0.48 \end{matrix}\right) = P J P^{-1}$$

Where: $$P = \left(\begin{matrix} 0.929997 & 0.366307 & -0.0303981 \\ -0.362899 & 0.901908 & -0.234233 \\ -0.0583849 & 0.228868 & 0.971705 \end{matrix}\right)$$

$$J = \left(\begin{matrix} 101.976 & 0 & 0 \\ 0 & 2.47102 & 0 \\ 0 & 0 & 0.00312608 \end{matrix}\right)$$

$$P^{-1} = \left(\begin{matrix} 0.929997 & -0.362899 & -0.0583849 \\ 0.366307 & 0.901908 & 0.228868 \\ -0.0303981 & -0.234233 & 0.971705 \end{matrix}\right)$$

Notice that the columns of $P$ are the eigenvectors $[v_1 \mid v_2 \mid v_3]$.

Notice that $J$ is the diagonal matrix of the eigenvalues $\lambda_1, \lambda_2, \lambda_3$.

Note that $P^{-1}$ is just the inverse of $P$; here, since the columns of $P$ are orthonormal, $P^{-1} = P^{\rm t}$.

Lastly, you should understand that this works only when the matrix is diagonalizable.
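The decomposition above can be checked numerically. Here is a minimal sketch using NumPy (the function names are NumPy's, not from the book); `eigh` is the appropriate routine for a symmetric matrix, since it returns real eigenvalues and orthonormal eigenvectors:

```python
import numpy as np

# The symmetric matrix C from the question
C = np.array([
    [88.53, -33.60, -5.33],
    [-33.60, 15.44,  2.67],
    [-5.33,   2.67,  0.48],
])

# eigh returns real eigenvalues in ascending order and
# orthonormal eigenvectors (as columns of P).
eigvals, P = np.linalg.eigh(C)

# Reorder to descending eigenvalues to match the listing above.
order = np.argsort(eigvals)[::-1]
eigvals = eigvals[order]
P = P[:, order]

J = np.diag(eigvals)

# Verify C = P J P^{-1}, which equals P J P^T since P is orthogonal.
print(np.allclose(C, P @ J @ P.T))  # True
```

Note that each eigenvector is only determined up to sign, so the columns of `P` may differ from the book's by a factor of $-1$; the reconstruction of $C$ is unaffected.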

Regards


The spectral theorem ensures that, since $C$ is symmetric, it has 3 real eigenvalues and its eigenspaces are orthogonal. Let $\lambda_1,\lambda_2,\lambda_3$ be the eigenvalues and $\vec v_1,\vec v_2,\vec v_3$ be orthonormal eigenvectors (whose existence is guaranteed by the spectral theorem; note that the $\lambda_i$ need not be distinct, since you can always orthonormalize a basis with the Gram-Schmidt process). The $\vec v_i$ are such that $C\vec v_i=\lambda_i\vec v_i$ ($i=1,2,3$): in matrix notation this means $$ C \begin{bmatrix} ~\\ \vec v_1 & \vec v_2 & \vec v_3 \\~ \end{bmatrix} ~=~ \begin{bmatrix} ~\\ \vec v_1 & \vec v_2 & \vec v_3 \\~ \end{bmatrix} \begin{bmatrix} \lambda_1\\ & \lambda_2\\ && \lambda_3 \end{bmatrix} $$ where each $\vec v_i\in\mathbb R^{3\times 1}$ is intended as a column vector. Therefore, setting $$ D = \begin{bmatrix} \lambda_1\\ & \lambda_2\\ && \lambda_3 \end{bmatrix} \quad\text{and}\quad P = \begin{bmatrix} ~\\ \vec v_1 & \vec v_2 & \vec v_3 \\~ \end{bmatrix} $$ you have that $C=PDP^{-1}$. Since the columns of $P$ are orthonormal, it follows that $P^{-1}=P^{\rm t}$, the transpose of $P$; therefore $$ C=PDP^{\rm t} $$ Now, if you want to compute any power of $C$ (or just any integer power, if you work with real matrices and the $\lambda_i$ are not all non-negative), you have $$ C^n = PD^nP^{\rm t} $$ (since $P^{\rm t}P=\rm I$, the identity matrix). Choosing $n=-1$ you have $$ C^{-1} ~=~ P \begin{bmatrix} \lambda_1^{-1}\\ & \lambda_2^{-1}\\ && \lambda_3^{-1} \end{bmatrix} P^{\rm t} $$ so that $L=D^{-1}$.

To sum up, $P$ is the matrix whose $i$-th column is the eigenvector $\vec v_i$ with eigenvalue $\lambda_i$, where $\vec v_1,\vec v_2,\vec v_3$ are orthonormal.

(In your case, since you require $C^{-1}=P^{\rm t}LP$ rather than $PLP^{\rm t}$, $P$ is the matrix whose $i$-th row is the eigenvector $\vec v_i$, i.e. the transpose of the $P$ above.)
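The book's convention can be verified numerically as well. A minimal NumPy sketch (the names `V`, `P`, `L` are mine, chosen to mirror the notation above): with rows of $P$ as eigenvectors and $L = D^{-1}$, the product $P^{\rm t}LP$ recovers $C^{-1}$.

```python
import numpy as np

C = np.array([
    [88.53, -33.60, -5.33],
    [-33.60, 15.44,  2.67],
    [-5.33,   2.67,  0.48],
])

# Columns of V are orthonormal eigenvectors of the symmetric matrix C.
eigvals, V = np.linalg.eigh(C)

# Book's convention: rows of P are the eigenvectors, so P = V^T,
# and L is the diagonal matrix of reciprocal eigenvalues (L = D^{-1}).
P = V.T
L = np.diag(1.0 / eigvals)

# Verify C^{-1} = P^T L P against a direct inverse.
print(np.allclose(np.linalg.inv(C), P.T @ L @ P))  # True
```

This works because $C = V D V^{\rm t}$ gives $C^{-1} = V D^{-1} V^{\rm t} = P^{\rm t} L P$ with $P = V^{\rm t}$.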