A $\{0,1\}$-matrix with positive spectrum must have all eigenvalues equal to $1$


Here's a cute problem that was frequently given by the late Herbert Wilf during his talks.

Problem: Let $A$ be an $n \times n$ matrix with entries from $\{0,1\}$ having all positive eigenvalues. Prove that all of the eigenvalues of $A$ are $1$.
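As a quick numerical sanity check (not a proof), the claim can be verified exhaustively for small $n$. The sketch below uses NumPy; the tolerances are ad hoc choices for deciding when a computed eigenvalue counts as real and positive:

```python
import itertools

import numpy as np

def positive_spectrum_implies_ones(n: int) -> bool:
    """Brute-force check: every n-by-n {0,1}-matrix whose eigenvalues
    are all (numerically) real and positive has all eigenvalues 1."""
    for bits in itertools.product([0, 1], repeat=n * n):
        A = np.array(bits, dtype=float).reshape(n, n)
        ev = np.linalg.eigvals(A)
        # Classify as "positive spectrum" when imaginary parts are
        # negligible and real parts are strictly positive.
        if np.max(np.abs(ev.imag)) < 1e-6 and np.min(ev.real) > 1e-8:
            if not np.allclose(ev.real, 1.0, atol=1e-4):
                return False
    return True

print(positive_spectrum_implies_ones(3))  # checks all 2**9 = 512 matrices
```

For $n = 4$ the loop runs over $2^{16} = 65{,}536$ matrices, which is still feasible; beyond that the cost grows as $2^{n^2}$.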

Proof:

Use the AM-GM inequality to relate the trace and determinant.

Is there any other proof?


There are 2 best solutions below

Answer 1:

Since all of the eigenvalues of $A$ are positive, $\det(A) = \prod_i \lambda_i > 0$. But $A$ has integer entries, so $\det(A)$ is a positive integer, and hence $$\prod_i \lambda_i = \det(A) \geq 1.$$

Next, $\operatorname{tr}(A) = \sum_i \lambda_i$, and the trace is the sum of the diagonal entries of $A$, each of which is $0$ or $1$; therefore $\sum_i \lambda_i \leq n$. The AM-GM inequality states $$\left[ \prod_{i} \lambda_i \right]^{\frac{1}{n}} \leq \frac{1}{n} \sum_{i} \lambda_{i},$$ with equality holding only when $\lambda_i = \lambda_j$ for all $i, j$. Combining these facts, $$1 \leq \left[ \prod_{i} \lambda_i \right]^{\frac{1}{n}} \leq \frac{1}{n} \sum_{i} \lambda_{i} \leq 1,$$ so equality holds throughout: all of the eigenvalues are equal, their product is $1$, and hence every eigenvalue of $A$ is $1$. (In particular $\operatorname{tr}(A) = n$, so every diagonal entry of $A$ equals $1$.)
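The equality case of AM-GM here can be illustrated numerically; a minimal NumPy sketch, using an example unipotent matrix of my own choosing:

```python
import numpy as np

# An upper-triangular {0,1}-matrix with ones on the diagonal:
# its eigenvalues are all 1, so AM-GM holds with equality.
A = np.array([[1, 1, 0],
              [0, 1, 1],
              [0, 0, 1]], dtype=float)
n = A.shape[0]

gm = np.linalg.det(A) ** (1 / n)  # geometric mean of the eigenvalues
am = np.trace(A) / n              # arithmetic mean of the eigenvalues
print(gm, am)                     # both equal 1
print(np.linalg.eigvals(A))       # every eigenvalue is 1
```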

Answer 2:

Suppose that $A$ has a column with only zero entries; then $A$ has zero as an eigenvalue (e.g., expand $\det(A - rI)$ along that column). So, to satisfy the OP's requirements, every column must contain a $1$, and the same holds for the rows by the same argument. Now suppose there is a linear relationship among the rows; then some linear combination of the rows yields a matrix with a zero row. We have already seen that this is not allowed, so the rows must be linearly independent. I am trying to force the form of the permissible matrices to be restricted enough to give the result. Linear independence of the rows gives us a glimpse of the invertibility of such matrices, but alas not their diagonalizability. The characteristic polynomial $(1-r)^n$ results from upper or lower triangular matrices with ones along the diagonal, and I suspect that we may be able to complete this proof by looking at what happens to this polynomial when there are deviations from the triangular shape. The result linked by user1551 may be the key. Trying to gain some intuition about the possibilities leads one to match the binomial theorem with Newton's identities: http://en.wikipedia.org/wiki/Newton%27s_identities#Expressing_elementary_symmetric_polynomials_in_terms_of_power_sums

and the fact that the trace must be $n$ (diagonal ones) and the determinant $1$. I would like to show that any deviation from this characteristic polynomial must lead to a non-positive eigenvalue. The analysis has two aspects: a combinatorial argument to show what types of modifications (from triangular) are permissible while maintaining row/column independence, and a look at the geometric effects of the non-dominant terms in the resulting characteristic polynomials. Maybe some type of induction argument will surface here.
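As a small numerical companion to the Newton's-identities idea: when all eigenvalues equal $1$, every power sum $p_k = \operatorname{tr}(A^k)$ equals $n$, and the characteristic polynomial collapses to $(r-1)^n$ with binomial coefficients. A NumPy sketch on one lower-triangular example of my own choosing:

```python
import numpy as np

# A lower-triangular {0,1}-matrix with ones on the diagonal: all
# eigenvalues are 1, so every power sum tr(A^k) equals n = 3, and
# Newton's identities then force the characteristic polynomial
# (r-1)^3, whose coefficients are binomial coefficients.
A = np.array([[1, 0, 0],
              [1, 1, 0],
              [1, 1, 1]], dtype=float)

power_sums = [np.trace(np.linalg.matrix_power(A, k)) for k in (1, 2, 3)]
print(power_sums)   # each power sum equals 3
print(np.poly(A))   # coefficients of (r-1)^3: [1, -3, 3, -1]
```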