Eigenvalues for $4\times 4$ matrix


Show that $0,2,4$ are the eigenvalues for the matrix $A$: $$A=\pmatrix{ 2 & -1 & -1 & 0 \\ -1 & 3 & -1 & -1 \\ -1 & -1 & 3 & -1 \\ 0 & -1 & -1 & 2 \\ }$$ and conclude that $0,2,4$ are the only eigenvalues for $A$.


I know that you can find the eigenvalues by finding the $\det(A-\lambda \cdot I)$, but it seems to me that the computation will be rather difficult to compute as it is a $4 \times 4$ matrix.

My question: is there an easier method to calculate the eigenvalues of $A$? And if I have to conclude that these are the only eigenvalues, is there a theorem that argues how many eigenvalues a matrix can have?


There are 7 answers below.

BEST ANSWER

$A$ has zero row sums. Therefore the all-one vector $\mathbf e$ is an eigenvector of $A$ for the eigenvalue $0$. Since $A$ is also real symmetric, $\mathbf e$ can be extended to an orthogonal eigenbasis $\{\mathbf u,\mathbf v,\mathbf w,\mathbf e\}$ of $A$. But this is also an eigenbasis of $A+\mathbf e\mathbf e^\top$. Hence the spectrum of $A$ is $\{a,b,c,0\}$ if and only if the spectrum of $A+\mathbf e\mathbf e^\top$ is $\{a,b,c,\|\mathbf e\|^2\}=\{a,b,c,4\}$. It is easy to see that four eigenvalues of $$ A+\mathbf e\mathbf e^\top=\pmatrix{3&0&0&1\\ 0&4&0&0\\ 0&0&4&0\\ 1&0&0&3} $$ are $2,4,4,4$. Therefore the eigenvalues of $A$ are $2,4,4,0$.
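The rank-one-update argument above is easy to sanity-check by hand or in a few lines of plain Python. This is a minimal sketch, not part of the original answer; the names `A`, `e`, `B`, and `matvec` are ad hoc:

```python
# Verify the accepted answer's computation in plain Python (no libraries).
A = [[2, -1, -1, 0], [-1, 3, -1, -1], [-1, -1, 3, -1], [0, -1, -1, 2]]
e = [1, 1, 1, 1]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

# Zero row sums: A e = 0, so e is an eigenvector for eigenvalue 0.
print(matvec(A, e))  # [0, 0, 0, 0]

# B = A + e e^T, the rank-one update from the answer.
B = [[A[i][j] + e[i] * e[j] for j in range(4)] for i in range(4)]
print(B)  # [[3, 0, 0, 1], [0, 4, 0, 0], [0, 0, 4, 0], [1, 0, 0, 3]]

# Eigenvectors of B can be read off from its near-diagonal shape:
print(matvec(B, [1, 0, 0, -1]))  # [2, 0, 0, -2] -> eigenvalue 2
print(matvec(B, [1, 0, 0, 1]))   # [4, 0, 0, 4]  -> eigenvalue 4
```

Together with the two standard basis vectors $(0,1,0,0)$ and $(0,0,1,0)$ (eigenvalue $4$), this confirms the spectrum $\{2,4,4,4\}$ of $B$, hence $\{2,4,4,0\}$ for $A$.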

ANSWER
  1. $\lambda=2$ is obviously an eigenvalue: look at the first and the last columns of $A-2I$.
  2. $\lambda=0$ is also an eigenvalue: sum up all the columns of $A$ (they add to the zero vector, so the columns are dependent).
  3. Try a similar trick for $\lambda=4$.
  4. The last eigenvalue: the sum of all eigenvalues is the trace, so once you have three of them, the last comes for free. (Credit: comment by Lord Shark the Unknown.)

EDIT: If you want to compute the eigenvalues directly, you can do the following: \begin{align} \det(\lambda I-A)&=\begin{vmatrix}\lambda-2 & 1 & 1 & 0\\1 & \lambda-3 & 1 & 1\\1 & 1 & \lambda-3 & 1\\0 & 1 & 1 & \lambda-2\end{vmatrix}\stackrel{(1)}{=} \begin{vmatrix}\lambda & 1 & 1 & 0\\\lambda & \lambda-3 & 1 & 1\\\lambda & 1 & \lambda-3 & 1\\\lambda & 1 & 1 & \lambda-2\end{vmatrix}\\ &\stackrel{(2)}{=}\lambda\begin{vmatrix}1 & 1 & 1 & 0\\1 & \lambda-3 & 1 & 1\\1 & 1 & \lambda-3 & 1\\1 & 1 & 1 & \lambda-2\end{vmatrix}\stackrel{(3)}{=} \lambda\begin{vmatrix}1 & 1 & 1 & 0\\0 & \lambda-4 & 0 & 1\\0 & 0 & \lambda-4 & 1\\0 & 0 & 0 & \lambda-2\end{vmatrix}. \end{align} Explanations:

(1): add all columns to the first one,

(2): factor out $\lambda$ in the first column,

(3): eliminate in the first column.
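The resulting characteristic polynomial is $\lambda(\lambda-4)^2(\lambda-2)$, which expands to $\lambda^4-10\lambda^3+32\lambda^2-32\lambda$. As a cross-check, here is a small exact-arithmetic determinant in plain Python (not from the answer) that compares $\det(\lambda I-A)$ with this factorization at five sample points; two degree-$4$ monic polynomials agreeing at five points are equal:

```python
from fractions import Fraction

A = [[2, -1, -1, 0], [-1, 3, -1, -1], [-1, -1, 3, -1], [0, -1, -1, 2]]

def det(M):
    # Naive cofactor expansion along the first row (fine for 4x4).
    if len(M) == 1:
        return M[0][0]
    total = 0
    for j in range(len(M)):
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total

def char_poly(lam):
    lam = Fraction(lam)
    return det([[lam * (i == j) - A[i][j] for j in range(4)] for i in range(4)])

# Compare with lam (lam - 2)(lam - 4)^2 at five points.
for lam in [-1, 1, 3, 5, 7]:
    assert char_poly(lam) == lam * (lam - 2) * (lam - 4) ** 2
print("det(lam I - A) = lam (lam - 2)(lam - 4)^2")
```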

ANSWER

It helps to notice that this matrix comes from a graph: it is the so-called Laplacian of a graph. Notice the positions of the $-1$'s. If you take four points and join the $i$th and $j$th whenever there is a $-1$ at position $(i,j)$, you get a graph. It is fairly symmetric: its edges are the sides of a square plus one diagonal. Now consider different ways to assign a number to each vertex of the square. From one such distribution of numbers you can produce another as follows: at each vertex, the new number is $(\text{old number}) \times (\text{degree of the vertex}) - (\text{sum of the numbers at the adjacent vertices})$.

It may happen that the new distribution of numbers is proportional to the old one. So the old distribution is an eigenvector, and the constant of proportionality is an eigenvalue.

Suppose that you start with a distribution with all numbers equal (say $1$). Then the new distribution is $0$ everywhere, so we have found an eigenvector with eigenvalue $0$.

Say we place $1$ at the ends of the (joined) diagonal and $-1$ at the other two points. Check that we get an eigenvector for eigenvalue $4$.


Say we place $1$ and $-1$ at the ends of the diagonal and $0$ at the other two points. Check that we get an eigenvector with eigenvalue $4$.

Say we place $0$ at the ends of the diagonal and $-1$, $1$ at the other two points. Check that we get an eigenvector with eigenvalue $2$.

Note that this is a small example where it can be done by hand.
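The update rule described above can be sketched directly on the square-with-one-diagonal graph. This is an illustrative pure-Python sketch (vertex numbering and names are my own; the diagonal joins vertices 1 and 2, matching the $-1$ pattern of $A$):

```python
# The graph behind A: a square (0-1, 0-2, 1-3, 2-3) plus one diagonal (1-2).
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
neighbors = {v: [] for v in range(4)}
for a, b in edges:
    neighbors[a].append(b)
    neighbors[b].append(a)

def update(x):
    # New value at v = (old value) * (degree of v) - (sum over neighbors):
    # this is exactly the Laplacian A applied to x.
    return [len(neighbors[v]) * x[v] - sum(x[u] for u in neighbors[v])
            for v in range(4)]

print(update([1, 1, 1, 1]))    # [0, 0, 0, 0]    -> eigenvalue 0
print(update([-1, 1, 1, -1]))  # [-4, 4, 4, -4]  -> eigenvalue 4
print(update([0, 1, -1, 0]))   # [0, 4, -4, 0]   -> eigenvalue 4
print(update([1, 0, 0, -1]))   # [2, 0, 0, -2]   -> eigenvalue 2
```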

ANSWER

Perhaps there is no easier way of explaining how to find the eigenvalues of $A$. In the end, it all comes down to how familiar you are with matrices that have special properties. I'm using the very same insight as orangeskid, but explained in a different way.

Consider this matrix: $$\color{red}{L=\left[\begin{array}{r} 1&-1\\ -1&1 \end{array}\right]}$$ which has eigenvalues $\color{red}{0}$ and $\color{red}{2}$. Now consider this bigger matrix $$M=\left[\begin{array}{r|rr|r} \color{red}{1}&0&0&\color{red}{-1}\\ \hline 0&\color{blue}{0}&\color{blue}{0}&0\\ 0&\color{blue}{0}&\color{blue}{0}&0\\ \hline \color{red}{-1}&0&0&\color{red}{1} \end{array}\right]$$ which is blockwise similar to the latter, and has eigenvalues $\color{blue}{0}$, $\color{blue}{0}$, $\color{red}{0}$ and $\color{red}{2}$ (the former two because of the $\color{blue}{2\times2}$ $\color{blue}{\text{zero matrix in the middle}}$; the latter two because of reusing $\color{red}{\text{the entries of the smaller matrix in the same fashion}}$.)

Now consider this matrix $$\color{purple}{K=\left[\begin{array}{r} 3&-1&-1&-1\\ -1&3&-1&-1\\ -1&-1&3&-1\\ -1&-1&-1&3\\ \end{array}\right]}$$ which is of the form $\color{purple}{4 \, I_4 -\boldsymbol{1}^{\rm T}_4 \boldsymbol{1}_4}$ and has eigenvalues $\color{purple}{0}$, $\color{purple}{4}$, $\color{purple}{4}$ and $\color{purple}{4}$. Now note that all matrices $L$, $M$, $A$ and $K$ are Laplacian matrices corresponding to undirected, unweighted simple graphs, and since $$M+A=K,$$ there is a result for Laplacian matrices that relates the sorted eigenvalues of $M$ and $A$ with those of $K$. For starters, the first eigenvalue of $A$ is $\bbox[yellow]{0}$. Then, the other three eigenvalues of $A$ are computed in the following fashion:

Take the eigenvalues of $K$ and remove a zero. Take the decreasingly sorted eigenvalues of $M$ and remove a zero. Then their differences give the eigenvalues of $A$ other than the first zero eigenvalue we mentioned.

Thus, $$\color{purple}{4}-\color{red}{2}=\bbox[yellow]{2}, \qquad\qquad \color{purple}{4}-\color{red}{0}=\bbox[yellow]{4}, \qquad\qquad \text{and } \color{purple}{4}-\color{blue}{0}=\bbox[yellow]{4},$$ are the other three eigenvalues of $A$.
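At minimum, the decomposition $M+A=K$ itself is easy to verify. A quick pure-Python check (matrix names taken from the answer, helper names my own):

```python
# Check that M + A = K entrywise, and spot-check K's eigenvalues 0 and 4.
A = [[2, -1, -1, 0], [-1, 3, -1, -1], [-1, -1, 3, -1], [0, -1, -1, 2]]
M = [[1, 0, 0, -1], [0, 0, 0, 0], [0, 0, 0, 0], [-1, 0, 0, 1]]
K = [[3, -1, -1, -1], [-1, 3, -1, -1], [-1, -1, 3, -1], [-1, -1, -1, 3]]

assert all(M[i][j] + A[i][j] == K[i][j] for i in range(4) for j in range(4))

def matvec(P, v):
    return [sum(row[j] * v[j] for j in range(4)) for row in P]

# K = 4 I - J (J = all-ones): the all-ones vector gives 0, any vector
# with zero entry sum gives 4.
print(matvec(K, [1, 1, 1, 1]))   # [0, 0, 0, 0]
print(matvec(K, [1, -1, 0, 0]))  # [4, -4, 0, 0]
```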

ANSWER

You would be correct that the determinant method of finding the eigenvalues is potentially difficult, since the characteristic polynomial has degree $4$ (though in this case, where the roots are rational, they can be found easily).

But note that finding the eigenvalues is NOT what you've been asked to do. You are given three of them and have only to verify that they are indeed eigenvalues: simply evaluate the characteristic polynomial at each of the three values and show that the result is $0$.

To show that they are the only eigenvalues, divide the characteristic polynomial by $x$, then the quotient by $x-2$, and finally by $x-4$. You will find that the remaining factor is again $x-4$.
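The factor-by-factor division can be carried out by synthetic division. A short illustrative sketch in plain Python, using the characteristic polynomial $x^4-10x^3+32x^2-32x$ of $A$ (coefficients listed highest degree first; function name is my own):

```python
def synthetic_div(coeffs, r):
    """Divide the polynomial by (x - r); return (quotient coeffs, remainder)."""
    out = [coeffs[0]]
    for c in coeffs[1:]:
        out.append(c + r * out[-1])
    return out[:-1], out[-1]

p = [1, -10, 32, -32, 0]     # x^4 - 10x^3 + 32x^2 - 32x
q, r0 = synthetic_div(p, 0)  # divide by x
q, r2 = synthetic_div(q, 2)  # then by x - 2
q, r4 = synthetic_div(q, 4)  # then by x - 4
print(q, r0, r2, r4)         # [1, -4] 0 0 0  -> leftover factor x - 4
```

All three remainders are $0$ (confirming $0$, $2$, $4$ are roots), and the leftover quotient $[1,-4]$ is the polynomial $x-4$, so $4$ is a double root and there is no fourth eigenvalue.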

ANSWER

Let $ \lambda_1 \le \lambda_2 \le \lambda_3 \le \lambda_4 $ be the eigenvalues of this matrix.

Since this is a Laplacian matrix, the smallest eigenvalue is $\lambda_1 = 0$.

The second-smallest eigenvalue of a Laplacian matrix is the algebraic connectivity of the graph. This is bounded above by the vertex connectivity of the graph, which here is $2$, so $\lambda_2 \le 2$.

The trace is $10$; since $\lambda_1 = 0$ and $\lambda_2 \le 2$, we get $8 \le \lambda_3 + \lambda_4 \le 2\lambda_4$, so $\lambda_4 \ge 4$.

From Brouwer and Haemers (2011) Proposition 3.9.3, $\lambda_4 \le 1 + \underset{x}{\rm{max}}\;d_x=4$.

Combining the last two bounds, $\lambda_3 = \lambda_4 = 4$, and the trace then forces $\lambda_2 = 10 - 8 = 2$.

ANSWER

You can easily guess some eigenvectors: $$ \begin{pmatrix} 2 & -1 & -1 & 0 \\ -1 & 3 & -1 & -1 \\ -1 & -1 & 3 & -1 \\ 0 & -1 & -1 & 2 \end{pmatrix} \begin{pmatrix} 1\\1\\1\\1 \end{pmatrix} = \begin{pmatrix} 0\\0\\0\\0 \end{pmatrix} \\ \begin{pmatrix} 2 & -1 & -1 & 0 \\ -1 & 3 & -1 & -1 \\ -1 & -1 & 3 & -1 \\ 0 & -1 & -1 & 2 \end{pmatrix} \begin{pmatrix} 1\\0\\0\\-1 \end{pmatrix} = \begin{pmatrix} 2\\0\\0\\-2 \end{pmatrix} \\ \begin{pmatrix} 2 & -1 & -1 & 0 \\ -1 & 3 & -1 & -1 \\ -1 & -1 & 3 & -1 \\ 0 & -1 & -1 & 2 \end{pmatrix} \begin{pmatrix} 1\\-1\\-1\\1 \end{pmatrix} = \begin{pmatrix} 4\\-4\\-4\\4 \end{pmatrix} \\ \begin{pmatrix} 2 & -1 & -1 & 0 \\ -1 & 3 & -1 & -1 \\ -1 & -1 & 3 & -1 \\ 0 & -1 & -1 & 2 \end{pmatrix} \begin{pmatrix} 0\\1\\-1\\0 \end{pmatrix} = \begin{pmatrix} 0\\4\\-4\\0 \end{pmatrix} $$ The corresponding eigenvalues are $0,2,4,4$. It is not hard to check that the latter two vectors are linearly independent, so $4$ has multiplicity two and these four values account for all the eigenvalues of the matrix.
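The four matrix-vector products above take seconds to verify by hand, or with a few lines of plain Python (a sketch; the helper name is my own):

```python
# Check the four guessed eigenvectors by direct multiplication.
A = [[2, -1, -1, 0], [-1, 3, -1, -1], [-1, -1, 3, -1], [0, -1, -1, 2]]

def matvec(M, v):
    return [sum(row[j] * v[j] for j in range(4)) for row in M]

pairs = [([1, 1, 1, 1], 0), ([1, 0, 0, -1], 2),
         ([1, -1, -1, 1], 4), ([0, 1, -1, 0], 4)]
for v, lam in pairs:
    assert matvec(A, v) == [lam * x for x in v]
print("all four are eigenvectors, with eigenvalues 0, 2, 4, 4")
```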