This is a mathematics question originating from a problem in physics, so I will give the physics problem, too, to clarify what I mean.
Physics problem
Consider $N$ identical masses joined by identical springs to form a circle. This problem is clearly symmetric under rotation by $\frac{2 \pi}{N}$, because all springs and masses are identical. One can write the equations of motion for the masses as a vector ODE $$\ddot{x} = A \cdot x .$$ The solution is a superposition of oscillations with frequencies $\omega_k = \sqrt{-\lambda_k}$, where the $\lambda_k$ are the (non-positive) eigenvalues of the matrix $A$.
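A minimal numerical sketch of this setup (assuming unit masses and unit spring constants, which are not specified in the problem): the matrix $A$ is then circulant, with $-2$ on the diagonal and $1$ for each neighbour, and its eigenvalues can be checked against the known closed form $\lambda_k = -4\sin^2(\pi k/N)$.

```python
import numpy as np

# Assumed parameters: N masses, unit mass and unit spring constant.
N = 6
A = np.zeros((N, N))
for i in range(N):
    A[i, i] = -2.0           # restoring force from the two adjacent springs
    A[i, (i - 1) % N] = 1.0  # left neighbour (indices wrap around the circle)
    A[i, (i + 1) % N] = 1.0  # right neighbour

# The eigenvalues of A are non-positive; the oscillation frequencies
# of the normal modes are omega_k = sqrt(-lambda_k).
eigvals = np.linalg.eigvalsh(A)
freqs = np.sqrt(-eigvals.clip(max=0.0))
```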
Mathematics problem
We want to find the eigenvalues of $A$ using the symmetry described above. Let $$S(x_1, \dots, x_N) = (x_2, \dots, x_N, x_1)$$ be the matrix describing the symmetry (a cyclic shift). Because it is a symmetry, we have $S \cdot A = A \cdot S$ and therefore $$ S \cdot x = \lambda x \implies S \cdot (A \cdot x) = A \cdot (S \cdot x) = \lambda \cdot (A \cdot x),$$ i.e. $A$ maps each eigenspace of $S$ into itself.
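The commutation relation can be checked numerically. A short sketch (again assuming the unit-constant mass-spring matrix, so that $A = -2I + S + S^T$): $S$ is the permutation matrix of the cyclic shift, it commutes with $A$, and its eigenvalues are the $N$-th roots of unity.

```python
import numpy as np

N = 6
# Cyclic shift: (S x)_i = x_{i+1} with indices mod N,
# matching S(x_1, ..., x_N) = (x_2, ..., x_N, x_1).
S = np.zeros((N, N))
for i in range(N):
    S[i, (i + 1) % N] = 1.0

# The mass-spring matrix (unit constants assumed) is circulant: A = -2I + S + S^T.
A = -2.0 * np.eye(N) + S + S.T

# S and A commute, so A preserves the eigenspaces of S.
assert np.allclose(S @ A, A @ S)

# The eigenvalues of S are the N-th roots of unity exp(2*pi*i*k/N).
lam = np.linalg.eigvals(S)
```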
Up to this point, this makes sense to me. However, the book now states:
Therefore, it is sufficient to solve the eigenvalue problem of A in every eigenspace of S (reduction of dimension)
This, I don't understand. What is meant by "solving the eigenvalue problem of a matrix in the eigenspace of another matrix" and how does this simplify the initial problem of finding the eigenvalues of A?
Because of the symmetry you can assemble the coordinates of $x$ into a polynomial $$ x(t,z)=x_0(t)+x_1(t)z+x_2(t)z^2+\dots+x_{n-1}(t)z^{n-1} $$ and the first row of the matrix into $$ a(z)=a_{0,0}+a_{0,1}z^{n-1}+a_{0,2}z^{n-2}+\dots+a_{0,n-1}z, $$ so that your equation then reads $$ \ddot x(t,z)=a(z)\cdot x(t,z)\pmod{z^n-1}. $$

Evaluated at the roots $z=\xi_k=\exp\left(i\frac{2\pi k}n\right)$ of $0=z^n-1$, this reduces to the scalar equations $$ \ddot x(t,\xi_k)=a(\xi_k)\cdot x(t,\xi_k). $$

This evaluation at the roots of unity corresponds to the inverse Discrete Fourier Transform. To get the coefficients of $x$ back, you then only need to apply the forward transform to the solutions, $$ x_j(t)=\frac1n\sum_{k=0}^{n-1}{\bar \xi_k}^{\,j}\cdot x(t,\xi_k). $$
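A numerical sketch of this diagonalization (using the unit-constant mass-spring matrix as the example circulant; this specific $A$ is an assumption for illustration): evaluating the first-row polynomial $a(z)$ at the roots of unity reproduces the eigenvalues of $A$, and agrees with the DFT of the first row.

```python
import numpy as np

n = 6
first_row = np.zeros(n)
first_row[0], first_row[1], first_row[-1] = -2.0, 1.0, 1.0  # mass-spring circulant

# a(xi_k) with a(z) = a_{0,0} + a_{0,1} z^{n-1} + ... + a_{0,n-1} z:
k = np.arange(n)
xi = np.exp(2j * np.pi * k / n)
a_vals = sum(first_row[j] * xi ** ((n - j) % n) for j in range(n))

# With this exponent pattern, a(xi_k) equals the DFT of the first row.
assert np.allclose(a_vals, np.fft.fft(first_row))

# Build the full circulant matrix A: each row is a cyclic shift of the first.
A = np.array([np.roll(first_row, i) for i in range(n)])

# The values a(xi_k) are exactly the eigenvalues of A (real here, since A is symmetric).
assert np.allclose(sorted(a_vals.real), sorted(np.linalg.eigvalsh(A)))
```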