Largest Laplacian eigenvalue lower bound


I'm currently dealing with a problem that I'm unsure how to start.

Let G be a graph with largest Laplacian eigenvalue $\lambda_1$ and maximum degree $\Delta > 0$. Show that $\lambda_1 \geq \Delta + 1 $.

I have tried using that the sum of the eigenvalues of the Laplacian is the trace of Laplacian, but that got me nowhere. Any tips appreciated!


There are 2 best solutions below


The correct place to start, when it comes to interpreting the largest eigenvalue of any symmetric matrix, is almost always the following lemma:

If $A$ is a symmetric matrix, then its largest eigenvalue is the maximum value of $\frac{\mathbf x^{\mathsf T}\!A \mathbf x}{\mathbf x^{\mathsf T}\mathbf x}$, achieved when $\mathbf x$ is the corresponding eigenvector.

For the Laplacian matrix in particular, it's convenient to keep in mind that $\mathbf x^{\mathsf T}L \mathbf x = \sum_{ij \in E}(x_i -x_j)^2$, where the sum is over all edges of the graph.
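This identity is easy to sanity-check numerically. The path graph and the vector below are arbitrary examples chosen for illustration:

```python
import numpy as np

# A small arbitrary example graph (a path on 4 vertices); the edge list
# is an assumption for illustration, not part of the original post.
edges = [(0, 1), (1, 2), (2, 3)]
n = 4

# Build the Laplacian L = D - A entry by entry.
L = np.zeros((n, n))
for i, j in edges:
    L[i, i] += 1
    L[j, j] += 1
    L[i, j] -= 1
    L[j, i] -= 1

x = np.array([2.0, -1.0, 0.5, 3.0])

quad_form = x @ L @ x                                  # x^T L x
edge_sum = sum((x[i] - x[j]) ** 2 for i, j in edges)  # sum over edges
print(quad_form, edge_sum)  # both equal 17.5
```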

Usually, the actual eigenvector is some complicated mess that's hard to interpret from the point of view of classical graph theory. However, we can try to find nice and simple vectors that seem like they'd be pretty good at maximizing the quantity above, and get a lower bound for the largest eigenvalue.

In particular, if your goal is to prove that $\lambda_1 \ge \Delta+1$, it's reasonable to try to use the existence of a degree-$\Delta$ vertex to find some $\mathbf x$ such that $\frac{\mathbf x^{\mathsf T}L \mathbf x}{\mathbf x^{\mathsf T}\mathbf x} \ge \Delta+1$.

My first try was to set $x_i = 1$ for a single vertex $i$ of degree $\Delta$, and $x_j = 0$ for all $j \ne i$. You can check that this only tells us $\lambda_1 \ge \Delta$. Instead, setting $x_i = \Delta$, $x_j = -1$ for all $\Delta$ vertices $j$ adjacent to $i$, and $x_k=0$ for the rest works: each of the $\Delta$ edges at $i$ contributes $(\Delta+1)^2$ to $\mathbf x^{\mathsf T}L\mathbf x$, so the Rayleigh quotient is at least $\frac{\Delta(\Delta+1)^2}{\Delta^2 + \Delta} = \Delta+1$.
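To see the difference between the two test vectors concretely, here is a small numerical experiment; the star graph and the value $\Delta = 4$ are assumptions chosen for illustration:

```python
import numpy as np

Delta = 4  # example maximum degree (an arbitrary choice for illustration)
n = Delta + 1
# Star K_{1,Delta}: center 0 joined to vertices 1..Delta.
edges = [(0, j) for j in range(1, n)]

# Laplacian L = D - A.
L = np.zeros((n, n))
for i, j in edges:
    L[i, i] += 1; L[j, j] += 1
    L[i, j] -= 1; L[j, i] -= 1

def rayleigh(x):
    return (x @ L @ x) / (x @ x)

# First try: indicator of the degree-Delta vertex -> quotient Delta.
e = np.zeros(n); e[0] = 1.0
print(rayleigh(e))  # 4.0, i.e. Delta

# Second try: Delta at the center, -1 on its neighbours -> quotient Delta + 1.
x = np.full(n, -1.0); x[0] = Delta
print(rayleigh(x))  # 5.0, i.e. Delta + 1
```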

(Why should this second vector work better? Well, we've made it more eigenvector-like. Eigenvectors of $L$ for distinct eigenvalues are orthogonal, and one eigenvector is the all-ones vector $\mathbf 1$, with eigenvalue $0$. So we improve $\mathbf x$ by making sure that all its entries add up to $0$: that $\mathbf x^{\mathsf T} \mathbf 1 = 0$.)


There's a second way to think about this problem, which is derived from the first. If $H$ is a subgraph of $G$, then $\lambda_1(H) \le \lambda_1(G)$.

(The argument here is that whatever the eigenvector $\mathbf x$ corresponding to $\lambda_1(H)$ is, it does at least as well at maximizing $\frac{\mathbf x^{\mathsf T}L_G \mathbf x}{\mathbf x^{\mathsf T}\mathbf x}$ as it did at maximizing $\frac{\mathbf x^{\mathsf T}L_H \mathbf x}{\mathbf x^{\mathsf T}\mathbf x}$, since the extra edges of $G$ can only add nonnegative terms $(x_i - x_j)^2$ to the numerator. Pad $\mathbf x$ with zeroes first if $H$ is missing some vertices of $G$.)
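The monotonicity claim is easy to spot-check numerically; the edge lists below are arbitrary examples chosen for illustration:

```python
import numpy as np

def laplacian(edges, n):
    """Laplacian L = D - A of a graph given by an edge list."""
    L = np.zeros((n, n))
    for i, j in edges:
        L[i, i] += 1; L[j, j] += 1
        L[i, j] -= 1; L[j, i] -= 1
    return L

n = 6
# A small example graph G and a subgraph H obtained by deleting edges
# (both edge lists are assumptions for the example).
G_edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (4, 5), (1, 4)]
H_edges = [(0, 1), (2, 3), (3, 4)]

# eigvalsh returns eigenvalues in ascending order; take the largest.
lam1_G = np.linalg.eigvalsh(laplacian(G_edges, n))[-1]
lam1_H = np.linalg.eigvalsh(laplacian(H_edges, n))[-1]
print(lam1_H <= lam1_G)  # True
```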

Saying "$G$ has maximum degree at least $\Delta$" is equivalent to saying "$G$ has a subgraph isomorphic to the star $K_{1,\Delta}$". So if we can show that $\lambda_1(K_{1,\Delta}) \ge \Delta+1$, we immediately get the same bound for every graph with maximum degree at least $\Delta$.
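One can confirm numerically that the star achieves the bound exactly; the particular values of $\Delta$ below are arbitrary:

```python
import numpy as np

# Largest Laplacian eigenvalue of the star K_{1,Delta} for a few
# arbitrarily chosen values of Delta.
results = {}
for Delta in (1, 2, 5, 10):
    n = Delta + 1
    A = np.zeros((n, n))
    A[0, 1:] = A[1:, 0] = 1.0        # center 0 joined to every leaf
    L = np.diag(A.sum(axis=1)) - A   # Laplacian L = D - A
    results[Delta] = np.linalg.eigvalsh(L)[-1]  # largest eigenvalue

print(results)  # each value is Delta + 1
```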


Let $L$ be the Laplacian matrix of $G = (V, E)$.

The key idea is to consider the Rayleigh quotient (https://en.wikipedia.org/wiki/Rayleigh_quotient), which is frequently used to estimate the largest eigenvalue. For convenience, I copy the characterization we need here:

$R(L) := \max\limits_{x \neq 0}\frac{x^TLx}{x^Tx} = \max\limits_{||x||=1}x^TLx = \lambda_1$.

Below is a critically important lemma for the Laplacian matrix:

$\textbf{Lemma 1}:$

$x^TLx = \sum\limits_{(i, j) \in E}(x_i - x_j)^2$.

A proof of Lemma 1 can be found in almost every textbook on algebraic graph theory; see, for example, Lemma 3.6 in https://math.uchicago.edu/~may/REU2013/REUPapers/Marsden.pdf.

Here, we attempt to find a unit vector $x$ such that $\sum\limits_{(i, j) \in E}(x_i - x_j)^2 \geq \Delta + 1$. Instead of using the Lagrange multiplier method, we construct the vector $x$ directly. Suppose vertex $v \in V$ has the largest degree $\Delta$, and let $i$ be its index. Let $v_1,\,v_2,\,...,\,v_\Delta$ be the $\Delta$ vertices adjacent to $v$, with indices $j_1, j_2, ..., j_\Delta$.

We set:

$\lambda \in \mathbb{R},\,0 < \lambda < \sqrt\frac{1}{\Delta}$.

$x_{j_1} = x_{j_2} = ... = x_{j_\Delta} = -\lambda$.

$x_i = \sqrt{1 - \Delta \lambda^2}$.

$x_k = 0$ for all $k \notin \{i, j_1, ..., j_\Delta\}$.

We want to find a $\lambda$ such that:

$\sum\limits_{k=1}^\Delta (x_i - x_{j_k})^2 = \Delta(\sqrt{1 - \Delta \lambda^2} + \lambda)^2\geq \Delta + 1$ (1).

Rearranging Eq. 1 gives a quadratic inequality:

$(\Delta+1)\lambda^2 - 2\sqrt{1+\frac{1}{\Delta}}\lambda + \frac{1}{\Delta} \leq 0$ (2).

The discriminant of the left-hand side of Eq. 2 is $0$, so the quadratic has a single root, at which it equals $0$. Therefore,

$\lambda = \frac{\sqrt{1+\frac{1}{\Delta}}}{\Delta+1}$ (3).

In this setting, $x$ is a unit vector and $\sum\limits_{k=1}^\Delta (x_i - x_{j_k})^2 = \Delta + 1$; any edges of $G$ not incident to $v$ contribute additional nonnegative terms, so $\sum\limits_{(i, j) \in E}(x_i - x_j)^2 \geq \Delta + 1$. Hence, $\lambda_1 = \max\limits_{||x||=1}x^TLx = \max\limits_{||x||=1}\sum\limits_{(i, j) \in E}(x_i - x_j)^2 \geq \Delta + 1$.
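The construction above can be checked numerically on a star graph; the choice $\Delta = 5$ is an arbitrary example:

```python
import numpy as np

Delta = 5  # example maximum degree (an arbitrary choice for illustration)
n = Delta + 1
# Star graph: vertex 0 has degree Delta (the simplest test case).
A = np.zeros((n, n))
A[0, 1:] = A[1:, 0] = 1.0
L = np.diag(A.sum(axis=1)) - A  # Laplacian L = D - A

lam = np.sqrt(1 + 1 / Delta) / (Delta + 1)  # the value from Eq. 3
x = np.full(n, -lam)                        # -lambda on the neighbours
x[0] = np.sqrt(1 - Delta * lam ** 2)        # chosen so that ||x|| = 1

print(x @ x)      # 1.0 (unit vector)
print(x @ L @ x)  # 6.0, i.e. Delta + 1
```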

We can easily prove that the bound is tight: consider $K_2$, the connected graph on two vertices, where $\Delta = 1$ and $\lambda_1 = 2 = \Delta + 1$.
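A quick numerical check of tightness on $K_2$:

```python
import numpy as np

# Laplacian of K_2, the connected graph on two vertices.
L = np.array([[ 1.0, -1.0],
              [-1.0,  1.0]])
lam1 = np.linalg.eigvalsh(L)[-1]  # eigenvalues are 0 and 2
print(lam1)  # 2.0, which is Delta + 1 since Delta = 1 here
```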