Let $\Sigma$ be the covariance matrix for some correlation coefficient $\rho>0$: $$\Sigma=\begin{bmatrix}1 & \rho &\rho&\rho&\rho\\\rho&1&\rho&\rho&\rho\\\rho&\rho&1&\rho&\rho\\\rho&\rho&\rho&1&\rho\\\rho&\rho&\rho&\rho&1\end{bmatrix}$$ Find the eigenvalues and eigenvectors of this covariance matrix.
I'm looking for some trick to find the eigenvalues and eigenvectors, because expanding $$\det(\Sigma-\lambda I)=0$$ directly would be very complicated.
Is there any property I can use in this case?
You can see in it the usual matrix $$J=\begin{bmatrix}1&1&1&1&1\\1&1&1&1&1\\1&1&1&1&1\\1&1&1&1&1\\1&1&1&1&1\end{bmatrix},$$ where $J$ is the matrix full of $1$'s. This matrix $J$ is an interesting one: what can be said of $J^2$? From this remark you can find the reduction of $J$, in particular its eigenvalues and eigenvectors.
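To make the hint about $J^2$ concrete (here $n=5$): every entry of $J^2$ is a sum of $n$ ones, so $$J^2=nJ=5J.$$ Hence any eigenvalue $\lambda$ of $J$ satisfies $\lambda^2=5\lambda$, i.e. $\lambda\in\{0,5\}$. The all-ones vector $\mathbf{1}$ gives $J\mathbf{1}=5\,\mathbf{1}$, and every vector whose coordinates sum to zero lies in the kernel of $J$, a subspace of dimension $4$.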
Then, what is the relation between $\rho J$ and your initial matrix? You see that $$\Sigma=(1-\rho)I_n+\rho J,$$ and since $(1-\rho)I_n$ is a scalar matrix, it is the same in every basis, so finding a basis of diagonalization of $J$ is enough to give you the complete reduction of $\Sigma$.
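Putting the pieces together, $\Sigma$ should have eigenvalue $1+(n-1)\rho$ on the span of $\mathbf{1}$ and eigenvalue $1-\rho$ (with multiplicity $n-1$) on its orthogonal complement. A quick numerical sanity check with NumPy (the value $\rho=0.3$ is an arbitrary choice):

```python
import numpy as np

rho = 0.3  # arbitrary test value of the correlation coefficient
n = 5

# Sigma = (1 - rho) I + rho J, with J the all-ones matrix
Sigma = (1 - rho) * np.eye(n) + rho * np.ones((n, n))

# Eigenvalues: 1 - rho with multiplicity n-1, and 1 + (n-1) rho once
eigvals = np.sort(np.linalg.eigvalsh(Sigma))
expected = np.sort(np.array([1 - rho] * (n - 1) + [1 + (n - 1) * rho]))
assert np.allclose(eigvals, expected)

# The all-ones vector is the eigenvector for the largest eigenvalue
ones = np.ones(n)
assert np.allclose(Sigma @ ones, (1 + (n - 1) * rho) * ones)
```

Any basis of the hyperplane $\{x : \sum_i x_i = 0\}$ completes $\mathbf{1}$ to a full eigenbasis, since that hyperplane is exactly the kernel of $J$.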