I want to prove the following: every symmetric matrix whose off-diagonal entries all equal $1/(n-1)$, where $n$ is the size of the matrix, and whose diagonal entries are 0, has a characteristic polynomial with a root at $x=1$. In other words, every such matrix has an eigenvalue of 1.
For example Matrix 1:
$$\begin{array}{ccc} 0 & \frac{1}{2} & \frac{1}{2} \\ \frac{1}{2} & 0 & \frac{1}{2} \\ \frac{1}{2} & \frac{1}{2} & 0 \\ \end{array}$$
has the characteristic polynomial $f(x)=-x^3+\frac{3 x}{4}+\frac{1}{4}$, which has a root at $x=1$.
Matrix 2:
$$\begin{array}{cccc} 0 & \frac{1}{3} & \frac{1}{3} & \frac{1}{3} \\ \frac{1}{3} & 0 & \frac{1}{3} & \frac{1}{3} \\ \frac{1}{3} & \frac{1}{3} & 0 & \frac{1}{3} \\ \frac{1}{3} & \frac{1}{3} & \frac{1}{3} & 0 \\ \end{array}$$
has the characteristic polynomial $f(x)=x^4-\frac{2 x^2}{3}-\frac{8 x}{27}-\frac{1}{27}$, which also has a root at $x=1$.
Matrix 3:
$$\begin{array}{ccccc} 0 & \frac{1}{4} & \frac{1}{4} & \frac{1}{4} & \frac{1}{4} \\ \frac{1}{4} & 0 & \frac{1}{4} & \frac{1}{4} & \frac{1}{4} \\ \frac{1}{4} & \frac{1}{4} & 0 & \frac{1}{4} & \frac{1}{4} \\ \frac{1}{4} & \frac{1}{4} & \frac{1}{4} & 0 & \frac{1}{4} \\ \frac{1}{4} & \frac{1}{4} & \frac{1}{4} & \frac{1}{4} & 0 \\ \end{array}$$
has the characteristic polynomial $f(x)=\frac{4 + 60 x + 320 x^2 + 640 x^3 - 1024 x^5}{1024}$, which also has a root at $x=1$.
I want to show that this is true for any such $n \times n$ matrix, i.e. for all $n$.
Looking for some tips and tricks on how to approach this.
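For reference, here is a quick numerical sanity check with numpy (just the definition above, nothing more) confirming that $x=1$ is a root of the characteristic polynomial for several values of $n$:

```python
import numpy as np

# Build the n x n matrix with 0 on the diagonal and 1/(n-1) elsewhere,
# and check that 1 is an eigenvalue, i.e. that det(A - I) = 0.
for n in range(3, 11):
    A = (np.ones((n, n)) - np.eye(n)) / (n - 1)
    print(n, np.isclose(np.linalg.det(A - np.eye(n)), 0.0))
```

This prints `True` for each $n$ tested, which of course is only evidence, not a proof.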
Let $A_n$ be the $n\times n$ matrix such that the off-diagonal entries are $1/(n-1)$ and the diagonal entries are 0. Then $A_n$ can be written as
$$A_n=\dfrac{1}{n-1}(\mathbf{1}_n\mathbf{1}_n^T-I_{n})$$
where $\mathbf{1}_n$ is the vector of ones of dimension $n$. This matrix is a perturbation of the full-rank matrix $I_n$ by the rank-one matrix $\mathbf{1}_n\mathbf{1}_n^T$. Let $S\in\mathbb{R}^{n\times(n-1)},\tilde S\in\mathbb{R}^{n\times 1}$ be such that $\mathbf{1}_n^TS=0$, $S^TS=I$, $\tilde{S}^T\tilde{S}=1$, and $S^T\tilde{S}=0$.
We can see that the matrix $P=\begin{bmatrix} S & \tilde S \end{bmatrix}$ is invertible and orthogonal; i.e. $P^{-1}=P^T$.
Therefore, we can perform a basis change on $A_n$. Since $S^T\mathbf{1}_n=0$, the off-diagonal blocks of $P^T(\mathbf{1}_n\mathbf{1}_n^T)P$ vanish, and the matrix $A_n$ is similar to $$P^TA_nP=\dfrac{1}{n-1}\begin{bmatrix} -I_{n-1} & 0\\0 & \tilde S^T(\mathbf{1}_n\mathbf{1}_n^T)\tilde S-1 \end{bmatrix}.$$
Noting that $\tilde{S}=\mathbf{1}_n/\sqrt{n}$ is a valid choice of $\tilde S$, so that $\tilde S^T\mathbf{1}_n=\sqrt{n}$, we get $\tilde S^T(\mathbf{1}_n\mathbf{1}_n^T)\tilde S-1=n-1.$
This yields that $A_n$ is similar to $$\dfrac{1}{n-1}\begin{bmatrix} -I_{n-1} & 0\\0 & n-1 \end{bmatrix},$$
or, equivalently, to $$\begin{bmatrix} -\dfrac{1}{n-1}I_{n-1} & 0\\0 & 1 \end{bmatrix}.$$
This means that the matrix $A_n$ has one eigenvalue at 1 with multiplicity 1 and one eigenvalue at $-1/(n-1)$ with multiplicity $n-1$.
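This spectrum is easy to verify numerically; a minimal sketch with numpy (using the default floating-point tolerances):

```python
import numpy as np

# Verify the claimed spectrum: eigenvalue 1 with multiplicity 1 and
# eigenvalue -1/(n-1) with multiplicity n-1.  A_n is symmetric, so
# eigvalsh returns its real eigenvalues in ascending order.
for n in (3, 4, 5, 10):
    A = (np.ones((n, n)) - np.eye(n)) / (n - 1)
    eig = np.linalg.eigvalsh(A)
    expected = np.array([-1.0 / (n - 1)] * (n - 1) + [1.0])
    assert np.allclose(eig, expected)
```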
Edit. Example in the case $n=3$. Then, we have that
$$A_3=\dfrac{1}{2}(\mathbf{1}_3\mathbf{1}_3^T-I_{3}).$$
In this case we can compute $S$ and $\tilde S$ as
$$S=\begin{bmatrix}\dfrac{-\sqrt{3}}{3} & \dfrac{-\sqrt{3}}{3} \\ \dfrac{\sqrt{3}}{6}+\dfrac{1}{2} & \dfrac{\sqrt{3}}{6}-\dfrac{1}{2} \\ \dfrac{\sqrt{3}}{6}-\dfrac{1}{2} & \dfrac{\sqrt{3}}{6}+\dfrac{1}{2} \end{bmatrix},\ \tilde S=\dfrac{\sqrt{3}}{3}\begin{bmatrix}1\\1\\1 \end{bmatrix}.$$
It is quite tedious to do by hand, but there are numerical methods that can do this for you.
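For instance, one numerical route is a complete QR factorisation of $\mathbf{1}_n$; a sketch with numpy (the column signs may differ from the hand-computed $S$ above, which does not affect the similarity):

```python
import numpy as np

n = 3
# A complete QR factorisation of the ones vector gives an orthogonal
# matrix whose first column spans 1_n; the remaining n-1 columns are an
# orthonormal basis of its orthogonal complement, i.e. a valid S.
Q, _ = np.linalg.qr(np.ones((n, 1)), mode='complete')
S_tilde, S = Q[:, :1], Q[:, 1:]

A = (np.ones((n, n)) - np.eye(n)) / (n - 1)
P = np.hstack([S, S_tilde])      # P = [S  tilde S], as in the answer
D = P.T @ A @ P                  # should be diag(-1/2, -1/2, 1)
```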
We then obtain that
$$P^TA_3P=\begin{bmatrix} -\dfrac{1}{2}I_{2} & 0\\0 & 1\end{bmatrix}.$$