Given a normal matrix $M$ (equivalently, a unitarily diagonalizable matrix), can we determine whether it has degenerate eigenvalues without explicitly calculating all the eigenvalues and eigenvectors?
An example that came to my mind is when $M$ is the square of a real skew-symmetric matrix, since such a matrix always has purely imaginary eigenvalues in pairs $\pm i \lambda_j$, so its square has each eigenvalue $-\lambda_j^2$ with multiplicity at least two. Similarly, a matrix that is the square of a matrix whose eigenvalues come in pairs of opposite sign, $\lambda_1, -\lambda_1, \dots$, is such a case. However, these checks require some sort of decomposition of the matrix $M$, which is another problem!
Another way is to calculate the characteristic polynomial $\det(M-\lambda I)$, factor it, and check whether any root is repeated. But this amounts to calculating all the eigenvalues already.
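As a side note, one can at least test for a repeated root of the characteristic polynomial without factoring it: $p$ has a repeated root exactly when $\gcd(p, p')$ is non-constant. A minimal sketch of this idea using SymPy (the function name `has_degenerate_eigenvalue` is just an illustration):

```python
# Sketch: detect a repeated eigenvalue from the characteristic polynomial
# alone, without finding any roots, via gcd(p, p').
import sympy as sp

def has_degenerate_eigenvalue(M):
    lam = sp.symbols('lam')
    p = M.charpoly(lam).as_expr()
    # A polynomial has a repeated root iff it shares a root with its
    # derivative, i.e. gcd(p, p') has positive degree.
    g = sp.gcd(p, sp.diff(p, lam))
    return bool(sp.degree(g, lam) > 0)

A = sp.Matrix([[2, 0, 0], [0, 2, 0], [0, 0, 3]])   # eigenvalues 2, 2, 3
B = sp.Matrix([[1, 0, 0], [0, 2, 0], [0, 0, 3]])   # eigenvalues 1, 2, 3
print(has_degenerate_eigenvalue(A))  # True
print(has_degenerate_eigenvalue(B))  # False
```

Of course, computing the characteristic polynomial symbolically is itself expensive for large $n$.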
Do we have other (simple) criteria or ways to determine the degeneracy of eigenvalues of a matrix $M$?
I can think of three criteria, although the question of whether or not they are "simple" might be subject to debate.
The first criterion is as follows: a normal $n\times n$ matrix $A$ has degenerate eigenvalues if and only if the matrices $$ I, A, A^2, \ldots , A^{n-1} $$ are linearly dependent. The reason is that, since $A$ is diagonalizable, its minimal polynomial has one linear factor per *distinct* eigenvalue, so a repeated eigenvalue forces its degree to be strictly less than $n$.
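This criterion can be checked numerically by stacking the vectorized powers as rows and computing a rank. A minimal sketch with NumPy (the function name `powers_dependent` and the tolerance are my own choices):

```python
# Sketch (assuming A is normal): stack vec(I), vec(A), ..., vec(A^{n-1})
# as rows and test their linear independence via the rank of the stack.
import numpy as np

def powers_dependent(A, tol=1e-9):
    n = A.shape[0]
    P = np.eye(n, dtype=complex)
    rows = []
    for _ in range(n):
        rows.append(P.ravel())
        P = P @ A
    S = np.array(rows)
    # Dependent powers <=> deg(minimal polynomial) < n <=> (for normal A)
    # a repeated eigenvalue.
    return np.linalg.matrix_rank(S, tol=tol) < n

A = np.diag([2.0, 2.0, 3.0])     # degenerate spectrum
B = np.diag([1.0, 2.0, 3.0])     # simple spectrum
print(powers_dependent(A))  # True
print(powers_dependent(B))  # False
```

In floating point the rank decision depends on the tolerance, so nearly-degenerate eigenvalues and exactly degenerate ones may be indistinguishable.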
Another, slightly more algebraic criterion is: a normal $n\times n$ matrix $A$ has degenerate eigenvalues if and only if there exist two matrices $B$ and $C$, both of which commute with $A$, but which do not commute with each other.
The reason is that, viewing things from the point of view of a basis consisting of eigenvectors of $A$, ordered in such a way that the first two basis vectors correspond to the same eigenvalue, one has that $$ A=\pmatrix { \lambda & 0 & 0 & \ldots & 0 \cr 0 & \lambda & 0 & \ldots & 0 \cr 0 & 0 & \lambda _3 & \ldots & 0 \cr \vdots & \vdots & \vdots & \ddots & \vdots \cr 0 & 0 & 0 & \ldots & \lambda _n \cr}, $$ and any matrix of the form $$ \pmatrix { a & b & 0 & \ldots & 0 \cr c & d & 0 & \ldots & 0 \cr 0 & 0 & 0 & \ldots & 0 \cr \vdots & \vdots & \vdots & \ddots & \vdots \cr 0 & 0 & 0 & \ldots & 0 \cr} $$ commutes with $A$. So one can easily choose two non-commuting matrices $B$ and $C$ of that form.
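To illustrate this construction concretely in a $4\times 4$ example (the eigenvalues and block entries below are arbitrary choices of mine): take $B$ and $C$ supported on the top-left $2\times 2$ block corresponding to the repeated eigenvalue, with the block entries chosen not to commute.

```python
# Illustration of the second criterion, in the eigenbasis: any matrix
# supported on the 2x2 block of the repeated eigenvalue commutes with A,
# and two such matrices need not commute with each other.
import numpy as np

lam, lam3, lam4 = 5.0, 1.0, 2.0
A = np.diag([lam, lam, lam3, lam4])

def embed(block):
    # Place a 2x2 block in the top-left corner of a 4x4 zero matrix.
    M = np.zeros((4, 4))
    M[:2, :2] = block
    return M

B = embed(np.array([[0.0, 1.0], [0.0, 0.0]]))
C = embed(np.array([[0.0, 0.0], [1.0, 0.0]]))

comm = lambda X, Y: X @ Y - Y @ X
print(np.allclose(comm(A, B), 0))  # True:  B commutes with A
print(np.allclose(comm(A, C), 0))  # True:  C commutes with A
print(np.allclose(comm(B, C), 0))  # False: B does not commute with C
```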
A slight reformulation of the second criterion above is as follows: a normal $n\times n$ matrix $A$ has degenerate eigenvalues if and only if the number of independent solutions of the linear equation $$ AX-XA=0, $$ where the unknown $X$ is an $n\times n$ matrix, is strictly bigger than $n$.
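This version is directly computable: using $\operatorname{vec}(AX - XA) = (I\otimes A - A^{\mathsf T}\otimes I)\operatorname{vec}(X)$, the number of independent solutions is the nullity of an $n^2\times n^2$ matrix. A sketch (the function name `commutant_dimension` is hypothetical):

```python
# Reformulation as a linear system: vec(AX - XA) = (I⊗A - A^T⊗I) vec(X),
# so the solution space's dimension is the nullity of that n^2 x n^2 matrix.
import numpy as np

def commutant_dimension(A, tol=1e-9):
    n = A.shape[0]
    L = np.kron(np.eye(n), A) - np.kron(A.T, np.eye(n))
    return n * n - np.linalg.matrix_rank(L, tol=tol)

A = np.diag([2.0, 2.0, 3.0])   # repeated eigenvalue: dimension 2^2 + 1 = 5
B = np.diag([1.0, 2.0, 3.0])   # simple spectrum: dimension 3
print(commutant_dimension(A) > 3)  # True  -> degenerate
print(commutant_dimension(B) > 3)  # False -> non-degenerate
```

In general, if the eigenvalue multiplicities are $m_1, \dots, m_k$, the dimension of the solution space is $\sum_i m_i^2$, which equals $n$ exactly when every $m_i = 1$.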