Spectral graph theory problem


Given a directed graph $G$ in which every vertex has outdegree $k$ and which contains no directed 2-cycles, show that the adjacency matrix of $G$ must have a nonreal eigenvalue.
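As a quick numerical sanity check (my own sketch, not part of the original question), one can test the claim on a small $k=2$ example: the circulant digraph below, where vertex $v$ points to $v+1$ and $v+2 \pmod 7$, has outdegree $2$ at every vertex, no directed 2-cycles, and visibly nonreal eigenvalues.

```python
# Sanity check (illustrative sketch, not from the original post):
# a 7-vertex circulant digraph where vertex v points to v+1 and v+2 (mod 7).
import numpy as np

n, k = 7, 2
A = np.zeros((n, n))
for v in range(n):
    for s in range(1, k + 1):
        A[v, (v + s) % n] = 1  # edge v -> v + s (mod n)

# Confirm there are no directed 2-cycles: A[v, w] = A[w, v] = 1 never holds.
assert not np.any((A > 0) & (A.T > 0))

eigs = np.linalg.eigvals(A)
print(np.round(eigs, 3))
print("max |Im(lambda)|:", np.abs(eigs.imag).max())  # strictly positive here
```

For this circulant the eigenvalues are $\omega^j + \omega^{2j}$ with $\omega = e^{2\pi i/7}$, and only $j = 0$ gives a real value, namely $k = 2$.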

I can prove this for $k=1$, where the eigenvalues are easy to compute explicitly. But already for $k=2$, I suspect some other idea is required.
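For reference, here is one way to carry out the $k=1$ computation (a sketch; it assumes the digraph has no loops, so every directed cycle has length at least $3$): with outdegree $1$ at every vertex, $G$ contains a directed cycle $C_n$ with $n \ge 3$, and ordering the vertices so that the cycle forms a diagonal block makes $A$ block triangular, so the spectrum of $G$ contains the $n$-th roots of unity:

$$\det(\lambda I - A_{C_n}) = \lambda^n - 1 \quad\Longrightarrow\quad \lambda_j = e^{2\pi i j/n},\quad j = 0,\dots,n-1,$$

of which $e^{2\pi i/n}$ is nonreal whenever $n \ge 3$.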