The title is quite explicit, but let me give some context. I'm referring to spin-glass theory in classical statistical mechanics, in particular to the Sherrington-Kirkpatrick model. When studying the stability of the Replica Symmetric (RS) solution, one must compute the eigenvalues of a Hessian matrix that looks like the following: $$\begin{pmatrix} P & Q & Q & Q & Q & R \\ Q & P & Q & Q & R & Q \\ Q & Q & P & R & Q & Q \\ Q & Q & R & P & Q & Q \\ Q & R & Q & Q & P & Q \\ R & Q & Q & Q & Q & P \end{pmatrix}$$ However, the size is not fixed and can increase, so the structure of the matrix becomes much more complicated and cannot simply be written down with a single general expression. What happens is that the sum of each row (or column) is always the same value, which implies that the vector with all equal components is an eigenvector, with the common row sum as the corresponding eigenvalue. Similar reasoning can be used to obtain the other two distinct eigenvalues (for instance, another eigenvector has components of the form $v_a+v_b$ with $\sum_{a}v_a=0$, but I don't understand why).
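For concreteness, here is a small sketch of how I build these matrices numerically. I index rows and columns by replica pairs $(a,b)$ with $a<b$, and set the entry to $P$ when the pairs coincide, $Q$ when they share exactly one replica index, and $R$ when they are disjoint (this convention reproduces the $6\times 6$ example above for $n=4$). The check confirms the constant row sum and the all-ones eigenvector:

```python
import itertools
import numpy as np

def rs_hessian(n, P, Q, R):
    """Matrix indexed by replica pairs (a,b), a<b: entry is P when the
    pairs coincide, Q when they share exactly one replica index, and R
    when they are disjoint (reproduces the 6x6 example for n=4)."""
    pairs = list(itertools.combinations(range(n), 2))
    m = len(pairs)
    M = np.empty((m, m))
    for i, p in enumerate(pairs):
        for j, q in enumerate(pairs):
            shared = len(set(p) & set(q))
            M[i, j] = (P, Q, R)[2 - shared]  # 2 shared -> P, 1 -> Q, 0 -> R
    return M

# Arbitrary test values for P, Q, R
M = rs_hessian(4, P=3.0, Q=0.7, R=0.2)
row_sums = M.sum(axis=1)
ones = np.ones(M.shape[0])
print(np.allclose(row_sums, row_sums[0]))         # True: constant row sum
print(np.allclose(M @ ones, row_sums[0] * ones))  # True: all-ones eigenvector
```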
What I am asking is whether there is a way to prove that a symmetric matrix of this structure, with only three distinct entries, always has only three distinct eigenvalues. Ideally, I would also like a way to compute the eigenvalues and eigenvectors directly, without explicitly diagonalizing the matrix (which is an impossible task for a matrix of general size, as I need).
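For what it's worth, a quick numerical experiment supports the claim. Assuming the pair-indexed construction ($P$ when the two pairs coincide, $Q$ when they share one replica index, $R$ when they are disjoint, with arbitrary test values for $P,Q,R$), counting the distinct eigenvalues for several sizes gives exactly three every time:

```python
import itertools
import numpy as np

# Count the distinct eigenvalues of the pair-indexed matrix for several n.
# Entries: P when the pairs coincide, Q when they share exactly one replica
# index, R when they are disjoint. P, Q, R are arbitrary test values.
P, Q, R = 3.0, 0.7, 0.2
results = {}
for n in range(4, 9):
    pairs = list(itertools.combinations(range(n), 2))
    M = np.array([[(P, Q, R)[2 - len(set(p) & set(q))] for q in pairs]
                  for p in pairs])
    eig = np.linalg.eigvalsh(M)
    # Round to absorb floating-point noise before counting distinct values
    results[n] = len(np.unique(np.round(eig, 8)))

print(results)  # 3 distinct eigenvalues for every n from 4 to 8
```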