Eigenvalues from Symmetric Matrix


I found this problem on a past final exam.

Consider the real matrix

$ A = \left[ {\begin{array}{ccc} 1 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 1 \\ \end{array} } \right] $

(a) Explain what property the matrix $A$ has that ensures it can be diagonalized without the help of complex matrices.

(b) Find a real matrix $S$ and a diagonal matrix $D$ such that

$A=SDS^T$

For part (a), I noticed that the matrix is symmetric, but I am not sure how to find its eigenvalues.

For part (b), how do I get the eigenvalues of $A$?


BEST ANSWER

Okay, I figured it out; here's the thought process:

(1) Since $A$ is symmetric and all of its diagonal entries equal $1$, we can try the eigenvalue $\lambda = 1$, which makes every diagonal entry of $A - \lambda I$ zero: $ A - I = \left[ {\begin{array}{ccc} 0 & 0 & 1 \\ 0 & 0 & 0 \\ 1 & 0 & 0 \\ \end{array} } \right] $

This matrix is singular, and solving $(A - I)\mathbf{v} = \mathbf{0}$ gives the corresponding eigenvector $\left[ {\begin{array}{c} 0 \\ 1 \\ 0 \\ \end{array} } \right] $

(2) We can also try $\lambda = 0$, for which $A - \lambda I$ is just $ A = \left[ {\begin{array}{ccc} 1 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 1 \\ \end{array} } \right] $ Its first and third columns are identical, so it is singular, and solving $A\mathbf{v} = \mathbf{0}$ gives the eigenvector $\left[ {\begin{array}{c} -1 \\ 0 \\ 1 \\ \end{array} } \right] $

(3) Since we have found the eigenvalues $1$ and $0$, and the eigenvalues sum to the trace of the matrix, which is $1 + 1 + 1 = 3$, the remaining eigenvalue is $3 - 1 - 0 = 2$.

In conclusion, the three eigenvalues are $1$, $0$, and $2$.
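These eigenpairs are easy to verify numerically. The following NumPy sketch (my addition, not part of the exam) just checks that $A\mathbf{v} = \lambda\mathbf{v}$ for each pair and that the eigenvalues sum to the trace:

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0]])

# Check each candidate eigenpair directly: A v should equal lambda * v.
assert np.allclose(A @ np.array([0.0, 1.0, 0.0]), 1 * np.array([0.0, 1.0, 0.0]))
assert np.allclose(A @ np.array([-1.0, 0.0, 1.0]), 0 * np.array([-1.0, 0.0, 1.0]))
assert np.allclose(A @ np.array([1.0, 0.0, 1.0]), 2 * np.array([1.0, 0.0, 1.0]))

# The trace argument: the eigenvalues must sum to tr(A) = 3.
print(np.trace(A))  # 3.0
```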

SECOND ANSWER

Recall that the columns of a transformation matrix are the images of the basis and that when you right-multiply a matrix by a vector, the result is a linear combination of the columns of the matrix with coefficients given by the components of the vector.

The second column of $A$ is $(0,1,0)^T$, so that standard basis vector gets mapped to itself: it is an eigenvector of $1$. The sum of the first and third columns is $(2,0,2)^T=2(1,0,1)^T$, so $(1,0,1)$ is an eigenvector of $2$. Since the sum of the eigenvalues is equal to the trace, you get the third eigenvalue for free: it’s $1+1+1-1-2=0$, but then, we already knew that $0$ is an eigenvalue because the matrix has two identical columns, therefore has a nontrivial null space. You can either compute a basis for this null space to find an eigenvector of $0$ or notice that because the first and third columns are identical, their difference, i.e., the product of the matrix with $(1,0,-1)^T$, is $0$.
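The column-combination observations above can be checked mechanically; here is a small NumPy sketch (the variable names are mine, not from the answer):

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0]])

# A @ e2 picks out the second column, which is e2 itself: eigenvalue 1.
e2 = np.array([0.0, 1.0, 0.0])
print(A @ e2)                      # [0. 1. 0.]

# A @ (1,0,1)^T sums the first and third columns: twice the input, eigenvalue 2.
v = np.array([1.0, 0.0, 1.0])
print(A @ v)                       # [2. 0. 2.]

# A @ (1,0,-1)^T subtracts two identical columns: the zero vector, eigenvalue 0.
w = np.array([1.0, 0.0, -1.0])
print(A @ w)                       # [0. 0. 0.]
```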

Since the problem asks for an orthogonal diagonalization ($SDS^T$ rather than $SDS^{-1}$), you'll need to orthonormalize the above eigenvectors. They're already mutually orthogonal (eigenvectors of a real symmetric matrix corresponding to distinct eigenvalues always are), so all you need to do is normalize them.
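Putting the pieces together, here is a NumPy sketch (my addition) of the full orthogonal diagonalization; `np.linalg.eigh` is built for real symmetric matrices and returns an orthonormal eigenvector matrix directly:

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0]])

# For a real symmetric matrix, eigh returns real eigenvalues in
# ascending order and an orthogonal matrix S whose columns are the
# corresponding orthonormal eigenvectors.
eigvals, S = np.linalg.eigh(A)
D = np.diag(eigvals)

print(np.allclose(eigvals, [0.0, 1.0, 2.0]))  # True
print(np.allclose(S.T @ S, np.eye(3)))        # True: S is orthogonal
print(np.allclose(A, S @ D @ S.T))            # True: A = S D S^T
```

By hand, normalizing the eigenvectors gives, for example, $S = \left[ {\begin{array}{ccc} 1/\sqrt{2} & 0 & 1/\sqrt{2} \\ 0 & 1 & 0 \\ -1/\sqrt{2} & 0 & 1/\sqrt{2} \\ \end{array} } \right]$ with $D = \operatorname{diag}(0, 1, 2)$.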