Suppose there is a symmetric matrix in which the sum of the elements in every row (and hence, by symmetry, in every column) is the same. One example of such a matrix is:
\begin{equation*} A = \begin{bmatrix} a & b & b \\ b & a & b \\ b & b & a \end{bmatrix} \end{equation*}
I have noticed that every vector whose elements sum to zero is an eigenvector of a matrix satisfying the above criteria. How can this be proved, and how can one determine the eigenvalue corresponding to such an eigenvector?
Thank you!
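For what it's worth, the observation can be checked numerically on the $3\times3$ example above (a quick numpy sketch; the values of $a$ and $b$ are arbitrary, and the eigenvalue appears to be $a-b$):

```python
import numpy as np

# Arbitrary illustrative values for a and b.
a, b = 5.0, 2.0
A = np.array([[a, b, b],
              [b, a, b],
              [b, b, a]])

# Any vector whose entries sum to zero:
v = np.array([1.0, 2.0, -3.0])

# For this particular A, v is an eigenvector with eigenvalue a - b:
print(np.allclose(A @ v, (a - b) * v))  # True
```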
The claim is false. Maybe these examples are clearer than Gae. S's. Let $$A=\pmatrix{0&0&1&0\\0&0&0&1\\1&0&0&0\\0&1&0&0},\quad B=\pmatrix{1&0&1&0\\0&1&0&1\\1&0&1&0\\0&1&0&1}, \quad\text{ and }\quad v=\pmatrix{1\\-1\\0\\0}.$$ It is easy to check that the matrices $A$ and $B$ are symmetric with all row and column sums equal and that the elements of vector $v$ sum up to zero. But, contrary to what the OP asserts, $v$ is not an eigenvector of $A$ or of $B$.
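The failure of the claim on these two matrices is easy to verify numerically as well (a minimal numpy check):

```python
import numpy as np

# The counterexamples A and B from above, and the zero-sum vector v.
A = np.array([[0, 0, 1, 0],
              [0, 0, 0, 1],
              [1, 0, 0, 0],
              [0, 1, 0, 0]])
B = np.array([[1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1]])
v = np.array([1, -1, 0, 0])

print(A @ v)  # [ 0  0  1 -1] -- not a scalar multiple of v
print(B @ v)  # [ 1 -1  1 -1] -- not a scalar multiple of v
```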
In general, if $A$ is symmetric and all of its row sums equal $s$, then the all-ones vector $\mathbf 1$ is an eigenvector of $A$ with eigenvalue $s$. If $v$ is an eigenvector corresponding to an eigenvalue $\lambda\neq s$, then the entries of $v$ sum to zero: since $A^\top=A$, we have $s\,(\mathbf 1^\top v)=(A\mathbf 1)^\top v=\mathbf 1^\top(Av)=\lambda\,(\mathbf 1^\top v)$, and $\lambda\neq s$ forces $\mathbf 1^\top v=0$.
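Both facts can be illustrated with a small numpy experiment (the $5\times5$ size, the random seed, and the target sum $s=3$ are arbitrary choices for the sketch):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random symmetric matrix, then adjust its diagonal so that
# every row (and hence every column) sums to the same value s.
M = rng.standard_normal((5, 5))
A = (M + M.T) / 2
s = 3.0
A += np.diag(s - A.sum(axis=1))

ones = np.ones(5)
print(np.allclose(A @ ones, s * ones))  # True: all-ones vector, eigenvalue s

# Eigenvectors for eigenvalues different from s are orthogonal to the
# all-ones vector, i.e. their entries sum to zero.
w, V = np.linalg.eigh(A)
for eigval, vec in zip(w, V.T):
    if not np.isclose(eigval, s):
        print(eigval, np.isclose(vec.sum(), 0.0))
```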