Eigenvalues of (some) $ 4 \times 4 $ symmetric matrices


$$A=\pmatrix{ 0 & 3 & 2 & 0 \\ 3 & 0 & 0 & 2 \\ 2 & 0 & 0 & 3 \\ 0 & 2 & 3 & 0 \\ }$$

Is there a quicker way to compute eigenvalues of this matrix other than to do it the long way? And what are the strategies for similar matrices?

There are 3 answers below.

BEST ANSWER

Observe \begin{align} M= \begin{pmatrix} A & B\\ B & A \end{pmatrix} \end{align} where $A=\begin{pmatrix}0 & 3\\ 3 & 0 \end{pmatrix}$, $B = 2I_2$, and $A$ and $B$ commute. Then we see that \begin{align} \det\left(M-\lambda I_4 \right) = \det \left((A-\lambda I_2)^2-B^2\right) \end{align} where we used the determinant formula for block matrices with commuting blocks. Note that \begin{align} (A-\lambda I_2)^2= \begin{pmatrix} \lambda^2+9 & -6\lambda\\ -6\lambda & \lambda^2+9 \end{pmatrix}, \qquad (A-\lambda I_2)^2-B^2= \begin{pmatrix} \lambda^2+5 & -6\lambda\\ -6\lambda & \lambda^2+5 \end{pmatrix}, \end{align} which means \begin{align} \det (M-\lambda I_4) = (\lambda^2+5)^2-36\lambda^2 = (\lambda^2-1^2)\cdot(\lambda^2-5^2). \end{align} Hence the eigenvalues are $\pm 1$ and $\pm 5$.
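As a sanity check, the factorization can be verified numerically with a plain-Python sketch (no linear-algebra library assumed): expand $\det(M-\lambda I_4)$ by Laplace expansion and compare it with $(\lambda^2-1)(\lambda^2-25)$.

```python
# Recursive Laplace expansion along the first row.
def det(m):
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

M = [[0, 3, 2, 0],
     [3, 0, 0, 2],
     [2, 0, 0, 3],
     [0, 2, 3, 0]]

def char_poly(lam):
    # det(M - lam * I)
    shifted = [[M[i][j] - (lam if i == j else 0) for j in range(4)] for i in range(4)]
    return det(shifted)

# Roots predicted by the factorization (lam^2 - 1)(lam^2 - 25):
for lam in (1, -1, 5, -5):
    assert char_poly(lam) == 0
# ... and the two polynomials agree at every integer point we check.
for lam in range(-6, 7):
    assert char_poly(lam) == (lam ** 2 - 1) * (lam ** 2 - 25)
```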

ANSWER

I don't have a general strategy.

But here, all the rows sum to the same value, $5$, so $(1,1,1,1)$ is an eigenvector for $\lambda=5$. Similarly, the alternating sums of the rows are $-1,1,-1,1$, matching the sign pattern of $(1,-1,1,-1)$ up to the factor $-1$, so $(1,-1,1,-1)$ is an eigenvector for $\lambda=-1$.

With similar ideas we see that $(1,1,-1,-1)$ is an eigenvector for $\lambda=1$.

If we don't have the imagination to find the last eigenvector/eigenvalue, we may notice that the trace is zero, so the eigenvalues sum to zero: $5-1+1+\lambda=0$, giving $\lambda=-5$. The eigenvector is $(1,-1,-1,1)$.
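The four guessed eigenpairs can be checked mechanically; here is a plain-Python sketch (no eigenvalue routine assumed) that multiplies the matrix by each sign-pattern vector:

```python
M = [[0, 3, 2, 0],
     [3, 0, 0, 2],
     [2, 0, 0, 3],
     [0, 2, 3, 0]]

def matvec(m, v):
    return [sum(m[i][j] * v[j] for j in range(len(v))) for i in range(len(m))]

# Each sign-pattern vector paired with its claimed eigenvalue.
pairs = [
    ([1,  1,  1,  1],  5),   # every row sums to 5
    ([1, -1,  1, -1], -1),   # alternating row sums
    ([1,  1, -1, -1],  1),
    ([1, -1, -1,  1], -5),   # forced by trace(M) = 0
]
for v, lam in pairs:
    assert matvec(M, v) == [lam * x for x in v]
```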

ANSWER

We can look at the matrices $A=\begin{bmatrix}0&1&0&0\\1&0&0&0\\0&0&0&1\\0&0&1&0\end{bmatrix}$ and $B=\begin{bmatrix}0&0&1&0\\0&0&0&1\\1&0&0&0\\0&1&0&0\end{bmatrix}$. Each of these can be split into two $\begin{bmatrix}0&1\\1&0\end{bmatrix}$ reflection blocks, so they have eigenvalues $1$ and $-1$ each with multiplicity two. Specifically, the eigenvectors are $\begin{bmatrix}1\\1\\0\\0\end{bmatrix}$ and $\begin{bmatrix}0\\0\\1\\1\end{bmatrix}$ for $A$ and the eigenvalue $1$, $\begin{bmatrix}1\\-1\\0\\0\end{bmatrix}$ and $\begin{bmatrix}0\\0\\1\\-1\end{bmatrix}$ for $A$ and the eigenvalue $-1$, $\begin{bmatrix}1\\0\\1\\0\end{bmatrix}$ and $\begin{bmatrix}0\\1\\0\\1\end{bmatrix}$ for $B$ and the eigenvalue $1$, and finally $\begin{bmatrix}1\\0\\-1\\0\end{bmatrix}$ and $\begin{bmatrix}0\\1\\0\\-1\end{bmatrix}$ for $B$ and the eigenvalue $-1$. Linear combinations of each pair will also work, of course, such as $\begin{bmatrix}1\\1\\1\\1\end{bmatrix}$ for the eigenvalue $1$ in both $A$ and $B$, or $\begin{bmatrix}1\\-1\\-1\\1\end{bmatrix}$ for the eigenvalue $-1$ in both $A$ and $B$.

The matrix we care about is a linear combination $M=3A+2B$, so the two common eigenvectors $\begin{bmatrix}1\\1\\1\\1\end{bmatrix}$ and $\begin{bmatrix}1\\-1\\-1\\1\end{bmatrix}$ we found are still eigenvectors, with eigenvalues $3+2=5$ and $-3-2=-5$ respectively. We need two more - and it turns out we can get more out of our eigenvector lists. $\begin{bmatrix}1\\1\\-1\\-1\end{bmatrix}$ is an eigenvector for $A$ with eigenvalue $1$, and an eigenvector for $B$ with eigenvalue $-1$, so it's an eigenvector for $M$ with eigenvalue $3-2=1$. Similarly, $\begin{bmatrix}1\\-1\\1\\-1\end{bmatrix}$ is an eigenvector for $A$ with eigenvalue $-1$ and for $B$ with eigenvalue $1$, so it's an eigenvector for $M$ with eigenvalue $-3+2=-1$.

We have found four linearly independent eigenvectors, with four different eigenvalues. As $M$ is a $4\times 4$ matrix, that's everything.
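The decomposition argument can be replayed in a short plain-Python sketch (no libraries assumed): build $M=3A+2B$ from the two permutation matrices and confirm that each simultaneous eigenvector picks up the eigenvalue $3a+2b$.

```python
A = [[0, 1, 0, 0], [1, 0, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]]  # swaps 1<->2, 3<->4
B = [[0, 0, 1, 0], [0, 0, 0, 1], [1, 0, 0, 0], [0, 1, 0, 0]]  # swaps 1<->3, 2<->4

def matvec(m, v):
    return [sum(m[i][j] * v[j] for j in range(4)) for i in range(4)]

# M = 3A + 2B is exactly the matrix from the question.
M = [[3 * A[i][j] + 2 * B[i][j] for j in range(4)] for i in range(4)]
assert M == [[0, 3, 2, 0], [3, 0, 0, 2], [2, 0, 0, 3], [0, 2, 3, 0]]

# Simultaneous eigenvectors: eigenvalue a for A and b for B gives 3a + 2b for M.
cases = [
    ([1,  1,  1,  1],  1,  1),   # 3 + 2 =  5
    ([1, -1, -1,  1], -1, -1),   # -3 - 2 = -5
    ([1,  1, -1, -1],  1, -1),   # 3 - 2 =  1
    ([1, -1,  1, -1], -1,  1),   # -3 + 2 = -1
]
for v, a, b in cases:
    assert matvec(A, v) == [a * x for x in v]
    assert matvec(B, v) == [b * x for x in v]
    assert matvec(M, v) == [(3 * a + 2 * b) * x for x in v]
```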

The strategy implied here won't work reliably in general, but when a matrix can be decomposed into pieces with known eigenvectors, it can do a lot.