Largest eigenvalue of a subgraph of a graph.


If $H$ is a subgraph of a graph $G$, then $\lambda_1(H) \leq \lambda_1(G)$, where $\lambda_1$ denotes the largest eigenvalue of the graph in question (defined as the largest eigenvalue of its adjacency matrix).
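As a quick numerical sanity check of the claim, here is a small sketch with numpy; the graphs (a $4$-cycle $G$ and a path $H$ inside it) are arbitrary examples, not taken from the question:

```python
import numpy as np

# G: the 4-cycle 0-1-2-3-0.
A_G = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)

# H: the path 0-1-2, viewed as a subgraph of G on the same vertex set
# (vertex 3 is kept as an isolated vertex).
A_H = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 0],
                [0, 0, 0, 0]], dtype=float)

lam_G = np.linalg.eigvalsh(A_G).max()  # largest eigenvalue of G (here 2)
lam_H = np.linalg.eigvalsh(A_H).max()  # largest eigenvalue of H (here sqrt(2))
assert lam_H <= lam_G
```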

This is easy to prove when $G$ is connected and non-bipartite using the Perron-Frobenius theorem, but I cannot prove it in the other cases.

A similar question that I cannot solve is:

Let $G$ be a connected graph. If $H$ is a subgraph of $G$ with $\lambda_1(H) = \lambda_1(G)$, then $H=G$ (in fact, this is an if and only if).

Any idea on how to do it?


BEST ANSWER

Based on the proof for why $\lambda_1(H) \le \lambda_1(G)$, we can figure out what happens if $\lambda_1(H) = \lambda_1(G)$ by thinking through the implications when the inequalities are tight. (Note that the linked answer assumes $\lambda_n$ is the largest eigenvalue, but here I assume that $\lambda_1$ is the largest to be consistent with the question.)

Let $\mathbf w$ be a unit eigenvector of $A_H$ corresponding to $\lambda_1(H)$. We can assume that $w_i \ge 0$ for all $i$, because $\mathbf w$ is supposed to maximize the quadratic form associated to $A_H$, and replacing $w_i$ by $|w_i|$ can only increase that quadratic form. We also assume that $A_H$ and $A_G$ have the same size, by including isolated vertices in $H$ if necessary.

From the inequality chain $$ \lambda_1(H) = \sup_{\mathbf x \in \mathbb R^n : \|\mathbf x\|=1} \mathbf x^{\mathsf T}\!A_H\mathbf x = \mathbf w^{\mathsf T}\!A_H\mathbf w \le \mathbf w^{\mathsf T}\!A_G\mathbf w \le \sup_{\mathbf x \in \mathbb R^n : \|\mathbf x\|=1} \mathbf x^{\mathsf T}\!A_G\mathbf x = \lambda_1(G) $$ we conclude that when $\lambda_1(H) = \lambda_1(G)$, we must have $\mathbf w^{\mathsf T}\!A_H\mathbf w = \mathbf w^{\mathsf T}\!A_G\mathbf w$ and also $\mathbf w^{\mathsf T}\!A_G\mathbf w = \lambda_1(G)$.

Focus on the second equation first; it will show that $\mathbf w$ is a $\lambda_1(G)$-eigenvector of $A_G$. To see this, write $\mathbf w$ in an orthonormal eigenvector basis $\mathbf v^{(1)}, \dots, \mathbf v^{(n)}$ of $A_G$ as $$\mathbf w = c_1 \mathbf v^{(1)} + \dots + c_n \mathbf v^{(n)}.$$ We assume $\|\mathbf w\|=1$, so $c_1^2 + \dots + c_n^2 = 1$. Then $\mathbf w^{\mathsf T} \!A_G \mathbf w = \lambda_1(G) c_1^2 + \lambda_2(G) c_2^2 + \dots + \lambda_n(G) c_n^2$. This is a convex combination of the eigenvalues, and the only way it can be equal to $\lambda_1(G)$ is if $c_i = 0$ for all $i$ with $\lambda_i(G) < \lambda_1(G)$. But this means that $\mathbf w$ is a linear combination of the $\lambda_1(G)$-eigenvectors of $A_G$, so it is also such an eigenvector.
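The convex-combination step can be verified numerically for any symmetric matrix; the sketch below uses an arbitrary random symmetric matrix as the example:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
A = (A + A.T) / 2                      # an arbitrary symmetric matrix
lam, V = np.linalg.eigh(A)             # columns of V: orthonormal eigenvectors

w = rng.standard_normal(5)
w /= np.linalg.norm(w)                 # a unit vector
c = V.T @ w                            # coordinates of w in the eigenbasis
assert np.isclose((c ** 2).sum(), 1.0)              # weights sum to 1
assert np.isclose(w @ A @ w, (lam * c ** 2).sum())  # quadratic form is a
                                                    # convex combination
```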

Now, return to the first equation. We have $$0 = \mathbf w^{\mathsf T}\!A_G\mathbf w - \mathbf w^{\mathsf T}\!A_H\mathbf w = \sum_{ij \in E(G)} 2w_iw_j - \sum_{ij \in E(H)} 2w_iw_j = \sum_{ij \in E(G) \setminus E(H)} 2w_iw_j.$$ By nonnegativity of $\mathbf w$, for every edge $ij$ that appears in $G$ but not in $H$, $w_iw_j=0$. In particular, either $G=H$ and we are done, or else some $w_i$ must be $0$.
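The identity $\mathbf x^{\mathsf T}\!A\mathbf x = \sum_{ij \in E} 2x_ix_j$ used above can be checked directly; the graph (a triangle with a pendant vertex) and the test vector below are arbitrary examples:

```python
import numpy as np

# Edges of a triangle 0-1-2 with a pendant vertex 3 attached to vertex 2.
edges = [(0, 1), (1, 2), (0, 2), (2, 3)]
A = np.zeros((4, 4))
for i, j in edges:
    A[i, j] = A[j, i] = 1

x = np.array([0.5, -1.0, 2.0, 0.25])
quad = x @ A @ x                                  # quadratic form
edge_sum = sum(2 * x[i] * x[j] for i, j in edges) # sum over the edges
assert np.isclose(quad, edge_sum)
```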

Because $\mathbf w$ is an eigenvector, we know $A_G \mathbf w = \lambda_1(G)\mathbf w$. Then from $w_i = 0$ we conclude $$\sum_{j : ij \in E(G)} w_j = (A_G\mathbf w)_i = \lambda_1(G) w_i = 0.$$ Since all components of $\mathbf w$ are nonnegative, this means that $w_j=0$ for all $j$ such that $ij \in E(G)$: the zero components propagate to adjacent vertices.

This is true for any $i$ such that $w_i=0$: whenever $\mathbf w$ is zero at a vertex, it's zero at the vertex's neighbors. But then it's zero at those neighbors' neighbors, too, and so on... $G$ is connected, so by repeating this argument, we conclude that $\mathbf w = \mathbf 0$, which contradicts the assumption that $\mathbf w$ is an eigenvector of anything.
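This propagation argument is exactly why the top eigenvector of a connected graph has no zero entries; a minimal check on an arbitrary connected example (the path on $5$ vertices):

```python
import numpy as np

# Path graph on 5 vertices: connected, so the eigenvector for the largest
# eigenvalue should be strictly positive (no zero entry to propagate).
A = np.zeros((5, 5))
for i in range(4):
    A[i, i + 1] = A[i + 1, i] = 1

lam, V = np.linalg.eigh(A)
w = V[:, -1]                 # eigenvector for the largest eigenvalue
if w.sum() < 0:
    w = -w                   # fix the overall sign
assert np.all(w > 1e-12)     # every entry strictly positive
```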

ANOTHER ANSWER

For a not connected graph, note that the eigenvalues of the whole graph are just the union of the eigenvalues of the individual components so you can prove this componentwise.
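The componentwise reduction rests on the fact that the spectrum of a disjoint union is the multiset union of the component spectra; a sketch with an arbitrary example (a triangle plus a single edge):

```python
import numpy as np

A_K3 = np.ones((3, 3)) - np.eye(3)          # triangle K3
A_K2 = np.array([[0.0, 1.0], [1.0, 0.0]])   # single edge K2
# Block-diagonal adjacency matrix of the disjoint union.
A = np.block([[A_K3, np.zeros((3, 2))],
              [np.zeros((2, 3)), A_K2]])

spec = np.sort(np.linalg.eigvalsh(A))
parts = np.sort(np.concatenate([np.linalg.eigvalsh(A_K3),
                                np.linalg.eigvalsh(A_K2)]))
assert np.allclose(spec, parts)  # spectrum = union of component spectra
```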

I have no idea how this works for bipartite graphs.

ANOTHER ANSWER

Easy proofs:

For the first question:
Let each graph have an $n \times n$ adjacency matrix, denoted $A_G$ and $A_H$ respectively.

Now consider, for $t\in [0,1]$,
$F_G(t) = (1-t)\,\mathbf{11}^{\mathsf T} + t\cdot A_G$
$F_H(t) = (1-t)\,\mathbf{11}^{\mathsf T} + t\cdot A_H$

For $t \in [0,1)$,
$\lambda_{\max}\big(F_G(t)\big)\geq \lambda_{\max}\big(F_H(t)\big)$
because $F_G(t) \geq F_H(t)$ componentwise and all entries of both matrices are strictly positive, so the result follows from Perron theory.

Then for $t = 1$,
$\lambda_{\max}\big(A_G\big) =\lambda_{\max}\big(F_G(1)\big)\geq \lambda_{\max}\big(F_H(1)\big) = \lambda_{\max}\big(A_H\big)$
by (topological) continuity of eigenvalues.
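The homotopy argument can be checked on a grid of $t$ values; the graphs below (a $4$-cycle $G$ and a spanning path $H$) are arbitrary examples:

```python
import numpy as np

# G: the 4-cycle; H: the spanning path 0-1-2-3 inside it.
A_G = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)
A_H = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
ones = np.ones((4, 4))

# F_G(t) dominates F_H(t) entrywise, so its top eigenvalue dominates too.
for t in np.linspace(0.0, 1.0, 11):
    F_G = (1 - t) * ones + t * A_G
    F_H = (1 - t) * ones + t * A_H
    assert (np.linalg.eigvalsh(F_G).max()
            >= np.linalg.eigvalsh(F_H).max() - 1e-12)
```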

For the second question:
We can assume WLOG that $A_H$ is the adjacency matrix of a connected graph on the same $n$ vertices as $A_G$. If not, create a connected $A_H^*$ (e.g. $A_H^* := \frac{1}{2}\big(A_H + A_G\big)$) with $A_H\leq A_H^* \leq A_G$ and the right-hand componentwise inequality strict in at least one entry. Re-running the first argument tells us $\lambda_{\max}(A_H) \leq \lambda_{\max}(A_H^*)$, and the argument below then shows $\lambda_{\max}(A_H^*)\lt \lambda_{\max}(A_G)$. For simplicity, then, we proceed under this assumption.

From part 1 we know $\lambda_{\max}\big(A_G\big) \geq \lambda_{\max}\big(A_H\big)$, so divide both matrices by $\lambda_{\max}\big(A_G\big)$ and consider the rescaled matrices, for which $\lambda_{\max}\big(A_G'\big)=1 \geq \lambda_{\max}\big(A_H'\big)$.

Now apply a similarity transform and consider
$P_G = D^{-1}A_G'D$
$P_H = D^{-1}A_H'D$
where $D=\operatorname{Diag}(\mathbf v)$ and $\mathbf v$ is the Perron vector of $A_G$.

Then $P_G \mathbf 1 = \mathbf 1$ (i.e. $P_G$ is a stochastic matrix), and $P_H \mathbf 1 \leq \mathbf 1$ componentwise, with equality iff $A_G = A_H$. Otherwise $P_H$ is strictly substochastic in at least one row $i$, which means that state $i$ is transient. Since the graph is connected and transience is a communicating-class property, all states of $P_H$ are transient, so the chain is not positive recurrent and there is no positive vector with eigenvalue one $\longrightarrow \lambda_{\max}\big(A_H'\big) \lt 1$.
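The stochastic/substochastic dichotomy can be sketched numerically; the example below (arbitrary, not from the answer) uses the $4$-cycle $G$, whose Perron vector is uniform, and the spanning path $H$:

```python
import numpy as np

# G: 4-cycle (lambda_max = 2, uniform Perron vector); H: spanning path.
A_G = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)
A_H = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)

lam, V = np.linalg.eigh(A_G)
lam_max = lam[-1]
v = V[:, -1]
if v.sum() < 0:
    v = -v                     # Perron vector of G, strictly positive
D = np.diag(v)
P_G = np.linalg.inv(D) @ (A_G / lam_max) @ D
P_H = np.linalg.inv(D) @ (A_H / lam_max) @ D

assert np.allclose(P_G.sum(axis=1), 1.0)   # P_G is stochastic
rows = P_H.sum(axis=1)
assert np.all(rows <= 1 + 1e-9)            # P_H is substochastic ...
assert rows.min() < 1 - 1e-9               # ... strictly so in some row
```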

An alternative finish is to consider
$S_n^{(H)} = \frac{1}{n}\big(P_H + P_H^2 + P_H^3 + \dots + P_H^{n-1} + P_H^n\big).$
This matrix is strictly positive (because the graph is connected). Since $P_H\mathbf 1 \leq \mathbf 1$, with strict inequality in some component, we get $S_n^{(H)}\mathbf 1 \leq \mathbf 1$, again with strict inequality in some component (use associativity and non-negativity).

By Perron theory $\lambda_{\max}\big(S_n^{(H)}\big)\lt 1$, which implies $\lambda_{\max}\big(P_H\big) \lt 1 = \lambda_{\max}\big(P_G\big)$ and completes the proof.
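A sketch of this Cesàro-average finish, reusing the same arbitrary example as above ($G$ the $4$-cycle, so $P_H$ is simply $A_H/2$ for the spanning path $H$):

```python
import numpy as np

# H: spanning path 0-1-2-3 inside the 4-cycle G; lambda_max(G) = 2 and the
# Perron vector of G is uniform, so the similarity transform is trivial.
A_H = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
P_H = A_H / 2.0

n = 4
S = sum(np.linalg.matrix_power(P_H, k) for k in range(1, n + 1)) / n

assert np.all(S > 0)                      # strictly positive: H is connected
assert S.sum(axis=1).max() < 1 - 1e-9     # strictly substochastic rows
assert np.linalg.eigvalsh(S).max() < 1    # hence lambda_max(S) < 1
assert np.linalg.eigvalsh(P_H).max() < 1  # and so lambda_max(P_H) < 1
```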