The eigenvalues of $A=E-I$, where $E$ is a square matrix made up entirely of $1$'s and where $I$ is the appropriate identity matrix.


Let $A=E-I$, where $E$ is a square matrix made up entirely of $1$'s and where $I$ is the appropriate identity matrix. The following regarding $A$ is stated in my notes, but I am not sure how to show it and convince myself that it is true.

"$E$ has a zero eigenvalue of algebraic multiplicity $n-1$ (why?), so A has an eigenvalue of $-1$ with algebraic multiplicity $n-1$. Since $A\mathbf e=(n-1)\mathbf e$, the remaining eigenvalue is $n-1$."

  1. How would you show that $E$ has an eigenvalue zero of algebraic multiplicity $n-1$? How does this then imply that $A$ has an eigenvalue of $-1$ with algebraic multiplicity $n-1$?

  2. Where does the last statement "$A\mathbf e=(n-1)\mathbf e$" come from? Clearly, from this it can be deduced that the remaining eigenvalue, $\lambda=n-1$, but where did the statement come from in the first place?

Accepted answer:

The second identity $A\mathbf e = (n-1)\mathbf e$ is a simple computation (do it!): each entry of $A\mathbf e$ is the sum of a row of $A$, and every row of $A$ contains exactly $n-1$ ones.
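As a quick sanity check, here is a short plain-Python computation (no external libraries) of $A\mathbf e$ for the illustrative case $n=5$:

```python
# Sketch: verify A e = (n - 1) e numerically for a small n (here n = 5).
n = 5
E = [[1] * n for _ in range(n)]             # all-ones matrix
A = [[E[i][j] - (1 if i == j else 0) for j in range(n)] for i in range(n)]
e = [1] * n                                 # the all-ones vector
Ae = [sum(A[i][j] * e[j] for j in range(n)) for i in range(n)]
print(Ae)                                   # each entry is n - 1 = 4
assert Ae == [(n - 1) * x for x in e]
```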

For the first question, note that $v_j = e_1 - e_j$ (for $j=2,\dots,n$) lies in the kernel of $E$, where $e_i$ denotes the $i$th standard basis vector: each entry of $Ev_j$ is the sum of the entries of $v_j$, which is zero. These $n-1$ vectors are linearly independent, so the kernel of $E$ has dimension at least $n-1$. (The kernel dimension is the geometric multiplicity of the eigenvalue $0$; here it equals the algebraic multiplicity because $E$ is symmetric.) If the dimension were $n$, the kernel would be everything, i.e. $E=0$. Since this is clearly not the case, the dimension is exactly $n-1$.
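The kernel claim can also be checked numerically. The sketch below (plain Python, $n=5$) confirms that each $v_j = e_1 - e_j$ is annihilated by $E$:

```python
# Sketch: check that v_j = e_1 - e_j lies in the kernel of E for j = 2, ..., n.
n = 5
E = [[1] * n for _ in range(n)]
for j in range(1, n):                       # 0-based j = 1..n-1 means e_2, ..., e_n
    v = [0] * n
    v[0], v[j] = 1, -1                      # v = e_1 - e_{j+1}
    Ev = [sum(E[i][k] * v[k] for k in range(n)) for i in range(n)]
    assert Ev == [0] * n                    # v is in the kernel of E
print("all", n - 1, "vectors lie in ker E")
```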

To see what this implies about the algebraic multiplicity of the eigenvalue $-1$ of $A$, think about what happens when you apply $A$ to an eigenvector of $E$ (and vice versa): if $Ev = \lambda v$, then $Av = (E-I)v = (\lambda - 1)v$, so every eigenvalue of $E$ shifts down by $1$ with the same eigenvector and the same multiplicity.
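To illustrate the shift concretely, here is a minimal check (plain Python, $n=4$) that a kernel vector of $E$ is an eigenvector of $A$ with eigenvalue $-1$:

```python
# Sketch: the same vector is an eigenvector of both E and A = E - I,
# with the eigenvalue shifted down by 1 (here n = 4, v = e_1 - e_2).
n = 4
E = [[1] * n for _ in range(n)]
A = [[E[i][j] - (i == j) for j in range(n)] for i in range(n)]
v = [1, -1, 0, 0]
Ev = [sum(E[i][k] * v[k] for k in range(n)) for i in range(n)]
Av = [sum(A[i][k] * v[k] for k in range(n)) for i in range(n)]
assert Ev == [0] * n            # E v = 0 * v
assert Av == [-x for x in v]    # A v = (0 - 1) v = -v
```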

Another answer:

$E$ is the outer product $\mathbf e\mathbf e^T$ of the all-ones vector $\mathbf e$ with itself, so it has rank $1$: every column is a multiple of the single vector $\mathbf e$. By the rank–nullity theorem, the nullspace then has dimension $n-1$, accounting for the eigenvalue $0$ with multiplicity $n-1$. Subtracting $I$ lowers every eigenvalue by $1$, so the eigenvalue $0$ of $E$ becomes $-1$ for $A$ (and the remaining eigenvalue $n$ becomes $n-1$).
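The rank-1 structure is easy to see in code. This sketch (plain Python, $n=4$) builds $E$ as the outer product $\mathbf e\mathbf e^T$ and confirms every column is a multiple of $\mathbf e$:

```python
# Sketch: E = e e^T, so every column of E is a multiple of e, hence rank E = 1.
n = 4
e = [1] * n
E = [[e[i] * e[j] for j in range(n)] for i in range(n)]     # outer product e e^T
cols = [[E[i][j] for i in range(n)] for j in range(n)]
assert all(col == e for col in cols)        # one independent column => rank 1
print("rank(E) = 1 for n =", n)
```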

Another answer:

Here is a matrix $P$ that I made up some time ago. The columns are eigenvectors of your matrix $E.$ The columns are also eigenvectors of your matrix $E-I.$ Also, the columns are perpendicular to each other. The outcome is that $P^T A P$ is diagonal. Note that $P$ is not orthogonal. $$ P = \left( \begin{array}{rrrrrrrrrr} 1 & -1 & -1 & -1 & -1 & -1 & -1 & -1 & -1 & -1 \\ 1 & 1 & -1 & -1 & -1 & -1 & -1 & -1 & -1 & -1 \\ 1 & 0 & 2 & -1 & -1 & -1 & -1 & -1 & -1 & -1 \\ 1 & 0 & 0 & 3 & -1 & -1 & -1 & -1 & -1 & -1 \\ 1 & 0 & 0 & 0 & 4 & -1 & -1 & -1 & -1 & -1 \\ 1 & 0 & 0 & 0 & 0 & 5 & -1 & -1 & -1 & -1 \\ 1 & 0 & 0 & 0 & 0 & 0 & 6 & -1 & -1 & -1 \\ 1 & 0 & 0 & 0 & 0 & 0 & 0 & 7 & -1 & -1 \\ 1 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 8 & -1 \\ 1 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 9 \end{array} \right). $$

The columns of $P$ are of varying lengths; for the 10 by 10 case depicted, the lengths are $\sqrt{10}, \sqrt{2}, \sqrt{6}, \sqrt{12}, \dots, \sqrt{90}$ (column $j \geq 2$ has length $\sqrt{j(j-1)}$). All that is necessary to make an orthogonal matrix $Q$ out of this is to divide each column by its length.
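As a check of the construction above, the following sketch (plain Python) rebuilds the $10\times 10$ matrix $P$, verifies that $P^TAP$ is diagonal, and confirms that normalizing the columns yields an orthogonal $Q$:

```python
# Sketch: rebuild the 10 x 10 matrix P above, check that P^T A P is diagonal,
# and that dividing each column by its length gives an orthogonal Q.
from math import isclose, sqrt

n = 10
E = [[1] * n for _ in range(n)]
A = [[E[i][j] - (i == j) for j in range(n)] for i in range(n)]

# Column 1 is all ones; column j (1-based) has -1 in rows 1..j-1 and j-1 in row j.
P = [[0] * n for _ in range(n)]
for i in range(n):
    P[i][0] = 1
for j in range(1, n):                  # 0-based column index
    for i in range(j):
        P[i][j] = -1
    P[j][j] = j

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(X):
    return [[X[j][i] for j in range(n)] for i in range(n)]

D = matmul(matmul(transpose(P), A), P)                     # exact integer arithmetic
assert all(D[i][j] == 0 for i in range(n) for j in range(n) if i != j)

lengths = [sqrt(sum(P[i][j] ** 2 for i in range(n))) for j in range(n)]
Q = [[P[i][j] / lengths[j] for j in range(n)] for i in range(n)]
QtQ = matmul(transpose(Q), Q)
assert all(isclose(QtQ[i][j], float(i == j), abs_tol=1e-9)
           for i in range(n) for j in range(n))            # Q^T Q = I
print("P^T A P is diagonal; Q is orthogonal")
```

The diagonal entries of $P^TAP$ are the eigenvalues scaled by the squared column lengths, e.g. the first is $(n-1)\cdot n = 90$.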