Is every vector whose elements sum to zero an eigenvector of a symmetric matrix with equal row and column sums?


Suppose there is a symmetric matrix in which every row (and hence every column) has the same sum of elements. One example of such a matrix is:

\begin{equation*} A = \begin{bmatrix} a & b & b \\ b & a & b \\ b & b & a \end{bmatrix} \end{equation*}

I have noticed that every vector whose elements sum to zero appears to be an eigenvector of any matrix satisfying the above criteria. How can this be proved, and how can one determine the eigenvalue corresponding to such an eigenvector?

Thank you!


On BEST ANSWER

The claim is false. Maybe these examples are clearer than Gae. S's. Let $$A=\pmatrix{0&0&1&0\\0&0&0&1\\1&0&0&0\\0&1&0&0},\quad B=\pmatrix{1&0&1&0\\0&1&0&1\\1&0&1&0\\0&1&0&1}, \quad\text{ and }\quad v=\pmatrix{1\\-1\\0\\0}.$$ It is easy to check that the matrices $A$ and $B$ are symmetric with all row and column sums equal and that the elements of vector $v$ sum up to zero. But, contrary to what the OP asserts, $v$ is not an eigenvector of $A$ or of $B$.
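These counterexamples are easy to verify numerically. A minimal sketch in plain Python (no external libraries; `matvec` and `is_parallel` are helper names introduced here, not from the answer):

```python
def matvec(M, v):
    """Multiply matrix M (a list of rows) by vector v."""
    return [sum(m_ij * v_j for m_ij, v_j in zip(row, v)) for row in M]

def is_parallel(u, w):
    """u and w are linearly dependent iff every 2x2 minor of [u; w] vanishes."""
    n = len(u)
    return all(u[i] * w[j] - u[j] * w[i] == 0
               for i in range(n) for j in range(n))

A = [[0, 0, 1, 0],
     [0, 0, 0, 1],
     [1, 0, 0, 0],
     [0, 1, 0, 0]]
B = [[1, 0, 1, 0],
     [0, 1, 0, 1],
     [1, 0, 1, 0],
     [0, 1, 0, 1]]
v = [1, -1, 0, 0]

print(matvec(A, v))                  # [0, 0, 1, -1] -- not a multiple of v
print(matvec(B, v))                  # [1, -1, 1, -1] -- not a multiple of v
print(is_parallel(v, matvec(A, v)))  # False: v is not an eigenvector of A
print(is_parallel(v, matvec(B, v)))  # False: v is not an eigenvector of B
```

Both matrices are symmetric with all row and column sums equal ($1$ for $A$, $2$ for $B$), yet $Av$ and $Bv$ are not scalar multiples of $v$.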

In general, if $A$ is symmetric and has all row sums equal to $s$, then the vector of all $1$s is an eigenvector of $A$, with eigenvalue $s$. If $v$ is an eigenvector corresponding to a different eigenvalue, then the sum of the entries in $v$ vanishes.
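This general fact is easy to illustrate with the circulant matrix from the question. A quick sketch, with the values $a=5$, $b=2$ chosen arbitrarily:

```python
a, b = 5, 2  # arbitrary values for the matrix from the question
A = [[a, b, b],
     [b, a, b],
     [b, b, a]]
ones = [1, 1, 1]
s = sum(A[0])  # common row sum, here a + 2b = 9

# A @ ones equals s * ones: the all-ones vector is an eigenvector
# with eigenvalue equal to the row sum s.
A_ones = [sum(m * x for m, x in zip(row, ones)) for row in A]
print(A_ones)                           # [9, 9, 9]
print(A_ones == [s * x for x in ones])  # True

# An eigenvector for a *different* eigenvalue, e.g. v = (1, -1, 0)
# with eigenvalue a - b = 3: its entries sum to zero, as stated above.
v = [1, -1, 0]
Av = [sum(m * x for m, x in zip(row, v)) for row in A]
print(Av)      # [3, -3, 0] = (a - b) * v
print(sum(v))  # 0
```

The second part follows because eigenvectors of a symmetric matrix for distinct eigenvalues are orthogonal, and orthogonality to the all-ones vector is exactly the condition that the entries sum to zero.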

On

$\newcommand{gae}[1]{\newcommand{#1}{\operatorname{#1}}}\gae{rk}$The claim is false for $n\ge 3$. For $n=3$, the space $V$ of symmetric $3\times 3$ matrices with constant row sums has dimension $1+\frac{3(3-1)}2=4$, and specifically the map $$\begin{pmatrix}\lambda\\ a\\ b\\ c\end{pmatrix}\mapsto \begin{pmatrix}a&b&\lambda-(a+b)\\ b&c&\lambda-(b+c)\\ \lambda-(a+b)&\lambda-(b+c)&a+2b-\lambda+c\end{pmatrix}$$

is an isomorphism $\Phi:\Bbb R^4\to V$. Your claim is that for all $a$, $b$, $c$, $\lambda$, $x$ and $y$ it should hold that the two vectors $$\begin{pmatrix}x\\ y\\ -x-y\end{pmatrix},\quad \begin{pmatrix}a&b&\lambda-(a+b)\\ b&c&\lambda-(b+c)\\ \lambda-(a+b)&\lambda-(b+c)&a+2b-\lambda+c\end{pmatrix}\begin{pmatrix}x\\ y\\ -x-y\end{pmatrix}$$

are linearly dependent. Since two column vectors $v,w$ are linearly dependent if and only if $\rk\begin{pmatrix}v^T\\ w^T\end{pmatrix}<2$, the assertion is equivalent to $$\rk\begin{pmatrix}x&y&-x-y\\ ax+by-(\lambda-a-b)(x+y)&bx+cy-(\lambda-b-c)(x+y)&\text{stuff...}\end{pmatrix}<2$$

for all $x$, $y$, $a$, $b$, $c$ and $\lambda$. However, the determinant of the submatrix formed by the first two columns is already not the zero polynomial: it equals $$-2 a x y - a y^2 + 2 b x^2 - 2 b y^2 + c x^2 + 2 c x y - \lambda x^2 + \lambda y^2,$$

which means that the rank of that matrix is $2$ for some $(x,y,a,b,c,\lambda)$.

For $n\ge4$ we can embed a counterexample $A$ for $n=3$, with row sums $\lambda$, into the block-diagonal matrix $\begin{pmatrix}A&0\\ 0&\lambda I\end{pmatrix}$, which is again symmetric with all row and column sums equal to $\lambda$, and then test it against the vector $(x,y,-x-y,0,\cdots,0)^T$.
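Following this recipe, here is a sketch of a $4\times4$ counterexample built from the parametrization $\Phi$ above. The parameter values $a=1$, $b=c=\lambda=0$ and the test point $x=0$, $y=1$ are arbitrary choices at which the determinant polynomial above is nonzero:

```python
# Arbitrary parameters: Phi(lambda, a, b, c) from the answer's parametrization.
a, b, c, lam = 1, 0, 0, 0
A3 = [[a,              b,              lam - (a + b)],
      [b,              c,              lam - (b + c)],
      [lam - (a + b),  lam - (b + c),  a + 2*b - lam + c]]

# Embed A3 into an n x n block-diagonal matrix diag(A3, lam * I).
n = 4
M = [[0] * n for _ in range(n)]
for i in range(3):
    for j in range(3):
        M[i][j] = A3[i][j]
for i in range(3, n):
    M[i][i] = lam

print([sum(row) for row in M])  # all row sums equal lam = 0

x, y = 0, 1
v = [x, y, -x - y] + [0] * (n - 3)
Mv = [sum(m * t for m, t in zip(row, v)) for row in M]
print(v, Mv)  # [0, 1, -1, 0] vs [1, 0, -1, 0]

# v is an eigenvector iff all 2x2 minors of [v; Mv] vanish.
parallel = all(v[i] * Mv[j] - v[j] * Mv[i] == 0
               for i in range(n) for j in range(n))
print(parallel)  # False: v is not an eigenvector of M
```

So $v$ sums to zero and $M$ is symmetric with equal row and column sums, yet $Mv$ is not proportional to $v$, confirming the claim fails for $n=4$ as well.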