Two questions about diagonalization

  1. Let A = $\begin{bmatrix}1 & 1 & 4\\0 & 3 & -4\\0&0&-1\end{bmatrix}$. Is the matrix A diagonalizable? If so find a matrix P that diagonalizes A. Can you write A as a linear combination of rank 1 matrices formed from its eigenvectors? Determine the eigendecomposition $A = PΛP^{-1}$.

  2. A = $\begin{bmatrix}3 & 1 & 0\\1 & 2 &1\\0&1&3\end{bmatrix}$. Is the matrix A diagonalizable? If so find a matrix P that orthogonally diagonalizes A. Can you write A as a linear combination of rank 1 matrices formed from its eigenvectors? (Note that A is real symmetric so that you do not have to compute the inverse of P).

My solution attempts:

  1. det(λI - A) = $\begin{vmatrix}λ - 1 & -1 & -4\\0 & λ - 3 &4\\0&0&λ + 1\end{vmatrix}$ = (λ -1) (λ -3) (λ + 1) = 0

$λ_{1}$ = 1, $λ_{2}$ = 3, $λ_{3}$ = -1

$(λ_{1}I - A)x_{1} = 0$

$\begin{bmatrix}0 & -1 & -4\\0 & -2 &4\\0&0&2\end{bmatrix}$ $\begin{bmatrix}x_{11}\\x_{12}\\x_{13}\end{bmatrix}$ = $\begin{bmatrix}0\\0 \\0\end{bmatrix}$

$\begin{bmatrix}x_{11}\\x_{12}\\x_{13}\end{bmatrix}$ = $x_{11} \begin{bmatrix}1\\0 \\0\end{bmatrix}$

It follows that $\begin{bmatrix}1\\0 \\0\end{bmatrix}$ is a basis for the eigenspace of A corresponding to $λ_{1}$ = 1.

In a similar way, I found the other two basis vectors as $\begin{bmatrix}1/2\\1 \\0\end{bmatrix}$ and $\begin{bmatrix}-5/2\\1 \\1\end{bmatrix}$, so $P = \begin{bmatrix}1 & 1/2 & -5/2\\0 & 1 & 1\\0&0&1\end{bmatrix}$ and $P^{-1} = \begin{bmatrix}1 & -1/2 & 3\\0 & 1 & -1\\0&0&1\end{bmatrix}$. Multiplying A by $P^{-1}$ on the left and by $P$ on the right, i.e. forming $P^{-1}AP$, gives the diagonal matrix Λ similar to A, and I found $Λ = \begin{bmatrix}1 & 0 & 0\\0 & 3 & 0\\0&0&-1\end{bmatrix}$. Then I said that multiplying both sides of $P^{-1}AP = Λ$ by $P$ on the left and by $P^{-1}$ on the right gives $PP^{-1}APP^{-1} = PΛP^{-1} = A$, and that this is the eigendecomposition of A. Here is my first question: did I understand and carry out correctly what the asker wanted by saying "Determine the eigendecomposition $A = PΛP^{-1}$"?
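As a sanity check (this NumPy snippet is mine, not part of the hand computation), the decomposition can be verified numerically; the columns of $P$ below are the eigenvectors $(1,0,0)$, $(1/2,1,0)$, $(-5/2,1,1)$, and the diagonal of $Λ$ must consist of the eigenvalues $1, 3, -1$:

```python
import numpy as np

A = np.array([[1.0, 1.0, 4.0],
              [0.0, 3.0, -4.0],
              [0.0, 0.0, -1.0]])

# Columns of P are the eigenvectors for eigenvalues 1, 3, -1 (in that order).
P = np.array([[1.0, 0.5, -2.5],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
Lam = np.diag([1.0, 3.0, -1.0])

# P^{-1} A P should equal the diagonal matrix Lambda ...
assert np.allclose(np.linalg.inv(P) @ A @ P, Lam)
# ... and P Lambda P^{-1} should recover A.
assert np.allclose(P @ Lam @ np.linalg.inv(P), A)
```

Both assertions pass, confirming $A = PΛP^{-1}$ for this choice of $P$.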

  2. I started in the same manner and found the eigenvalues $λ_{1} = 3$, $λ_{2} = 4$, $λ_{3} = 1$ with eigenvectors

    $\begin{bmatrix}x_{11}\\x_{12}\\x_{13}\end{bmatrix}$ = $x_{13} \begin{bmatrix}-1\\0 \\1\end{bmatrix}$,

    $\begin{bmatrix}x_{21}\\x_{22}\\x_{23}\end{bmatrix}$ = $x_{21} \begin{bmatrix}1\\1 \\1\end{bmatrix}$ and

    $\begin{bmatrix}x_{31}\\x_{32}\\x_{33}\end{bmatrix}$ = $x_{33} \begin{bmatrix}1\\-2 \\1\end{bmatrix}$.

Since these three eigenvectors are linearly independent (in fact mutually orthogonal, as expected for a symmetric matrix), A is diagonalizable.
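For reference, here is a quick NumPy check of this matrix (again my addition, not part of the assignment). `np.linalg.eigh` is specialized for real symmetric matrices and returns orthonormal eigenvectors directly:

```python
import numpy as np

A = np.array([[3.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 3.0]])

# eigh handles symmetric matrices: eigenvalues come back in ascending
# order and the eigenvectors (columns of U) are orthonormal.
vals, U = np.linalg.eigh(A)
assert np.allclose(vals, [1.0, 3.0, 4.0])

# U is orthogonal (U^T U = I), so U^{-1} = U^T and A = U diag(vals) U^T.
assert np.allclose(U.T @ U, np.eye(3))
assert np.allclose(U @ np.diag(vals) @ U.T, A)
```

The check confirms eigenvalues $1, 3, 4$ and an orthogonal diagonalization $A = UDU^T$.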

Here is my second question: what exactly does the sentence "Can you write A as a linear combination of rank 1 matrices formed from its eigenvectors?" mean?

Best answer:

A complete answer would be quite involved here; maybe I'll have the time later. However, I just wanted to address that last question about the combination of rank-1 matrices:

Note that the second matrix $A$ is symmetric, which means that we'll be able to diagonalize it orthogonally. That is, we can write $A = UDU^T$ where $D$ is diagonal and $U$ is orthogonal so that $U^{-1} = U^T$ and, equivalently, the columns of $U$ are an orthonormal basis consisting of eigenvectors of $A$.

So, we can write $$ A = \pmatrix{u_1&u_2&u_3}\pmatrix{\lambda_1\\&\lambda_2\\&&\lambda_3} \pmatrix{u_1^T\\u_2^T\\u_3^T} $$ with columns $u_i \in \Bbb R^n$ forming an orthonormal basis and $\lambda_i \in \Bbb R$. By block matrix multiplication, we can rewrite this as $$ A = \lambda_1 u_1 u_1^T + \lambda_2 u_2 u_2^T + \lambda_3 u_3 u_3^T. $$ That is, we can rewrite $A$ as a linear combination of the rank-$1$ matrices formed from its eigenvectors.
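This expansion is easy to verify numerically. A sketch with NumPy (my addition; the eigenvalue order and the signs of the $u_i$ may differ from a hand computation, but the sum is the same):

```python
import numpy as np

A = np.array([[3.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 3.0]])

vals, U = np.linalg.eigh(A)  # orthonormal eigenvectors in the columns of U

# Rebuild A as the sum of the rank-1 matrices lambda_i * u_i u_i^T.
S = sum(lam * np.outer(u, u) for lam, u in zip(vals, U.T))
assert np.allclose(S, A)

# Each outer product u_i u_i^T is indeed rank 1.
assert all(np.linalg.matrix_rank(np.outer(u, u)) == 1 for u in U.T)
```

Note that each term $\lambda_i u_i u_i^T$ is the (scaled) orthogonal projection onto the span of $u_i$, which is why the decomposition is often called the spectral decomposition.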