how to prove $ \min_{x \ne 0, ~ 1^Tx=0}\frac{x^TLx}{||x||^2}=\lambda_2(L)$


For a connected, undirected graph $G$, how can one prove the following formula? \begin{equation} \min_{x \ne 0, ~ 1^Tx=0}\frac{x^TLx}{||x||^2}=\lambda_2(L) \end{equation} where $L$ is the Laplacian matrix of $G$ and $\lambda_2$ is the Fiedler eigenvalue of $L$, i.e., its second-smallest eigenvalue (which is the smallest nonzero eigenvalue, since $G$ is connected).



Best answer:

According to *Matrix Analysis* by Roger A. Horn and Charles R. Johnson, a theorem on the Rayleigh quotient reads as follows (my compact version):

Theorem 4.2.2 (Rayleigh) Let $A \in M_n(\mathbb{C})$ be Hermitian, let the eigenvalues of $A$ be ordered as $\lambda_{min}(A)=\lambda_1(A) \le \lambda_2(A) \le \dots \le \lambda_n(A)=\lambda_{max}(A)$, let $i_1,\dots,i_k$ be given integers with $1 \le i_1 < \dots < i_k \le n$, let $x_{i_1},\dots,x_{i_k}$ be orthonormal and such that $A x_{i_p} = \lambda_{i_p} x_{i_p}$ for each $p=1,\dots,k$, and let $S=\operatorname{span}\{x_{i_1},\dots,x_{i_k}\}$. Then \begin{equation}\label{eq34} \lambda_{i_1}=\min_{\{x \,:\, 0 \ne x \in S\}} \frac{x^{*}Ax}{x^{*}x} \end{equation} where $x^{*}$ denotes the conjugate transpose of $x$. The minimum is achieved at $x \in S$ if and only if $Ax=\lambda_{i_1}x$.

Then we can have the following corollary:

Corollary Let $A \in M_n$ be real symmetric, let the eigenvalues of $A$ be ordered as $\lambda_{min}(A)=\lambda_1(A) \le \lambda_2(A) \le \dots \le \lambda_n(A)=\lambda_{max}(A)$, let $x_{1},\dots,x_{n} \in \mathbb{R}^n$ be orthonormal and such that $A x_{i} = \lambda_{i} x_{i}$ for each $i=1,\dots,n$, and let $S=\operatorname{span}\{x_{2},\dots,x_{n}\}$. Then \begin{equation}\label{eq35} \lambda_{2} = \min_{\{x \in \mathbb{R}^n \,:\, x \ne 0,\ x_1^Tx=0\}} \frac{x^{T}Ax}{x^{T}x} \end{equation} The minimum is achieved if and only if $Ax=\lambda_{2}x$.

Proof: We apply Theorem 4.2.2 over the field $\mathbb{R}$. Since $A$ is real symmetric, there is an orthonormal basis of $\mathbb{R}^n$ consisting of eigenvectors of $A$, i.e., $\operatorname{span}\{x_{1},\dots,x_{n}\}=\mathbb{R}^n$. Also note that $S=\operatorname{span}\{x_{2},\dots,x_{n}\} = \{x\in \mathbb{R}^n : x_1^Tx=0\}$ is an $(n-1)$-dimensional subspace of $\mathbb{R}^n$. Taking $k=n-1$ and $i_1=2,\ i_2=3,\ \dots,\ i_k=n$, the rest of the proof follows from Theorem 4.2.2: $ \lambda_{2} = \min_{\{x \,:\, 0 \ne x \in S\}} \frac{x^{T}Ax}{x^{T}x} = \min_{\{x \in \mathbb{R}^n \,:\, x \ne 0,\ x_1^Tx=0\}} \frac{x^{T}Ax}{x^{T}x}. $
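As a sanity check, the Corollary can be verified numerically. The sketch below (assuming NumPy; the random symmetric matrix is only an illustration) restricts the Rayleigh quotient to the orthogonal complement of $x_1$ and compares its minimum with $\lambda_2$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
B = rng.standard_normal((n, n))
A = (B + B.T) / 2                # random real symmetric matrix

w, V = np.linalg.eigh(A)         # eigenvalues ascending, orthonormal eigenvectors
Q = V[:, 1:]                     # x_2, ..., x_n: basis of {x : x_1^T x = 0}

# The minimum of the Rayleigh quotient over a subspace equals the smallest
# eigenvalue of A restricted to that subspace, i.e. of Q^T A Q.
restricted_min = np.linalg.eigvalsh(Q.T @ A @ Q).min()

print(np.isclose(restricted_min, w[1]))   # True: the restricted minimum is lambda_2
```

Here the restriction trick (minimizing over a subspace by projecting onto an orthonormal basis of it) is a standard equivalent of the constrained minimization in the Corollary.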

Proof of my question: Since $G$ is connected, $\lambda_1=0$ is a simple eigenvalue of $L$ with eigenvector $\xi_1 = \frac{1}{\sqrt{n}} \boldsymbol{1}$, so the constraint $\boldsymbol{1}^Tx=0$ is equivalent to $\xi_1^Tx=0$, and $\xi_1$ is orthogonal to the span of the remaining eigenvectors of the (symmetric) Laplacian matrix $L$. Applying the Corollary with $A=L$ and $x_1=\xi_1$ yields \begin{equation} \min_{x \ne 0, ~ 1^Tx=0}\frac{x^TLx}{||x||^2}=\lambda_2(L) \end{equation}
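The identity can also be checked on a concrete graph. The sketch below (a path graph on 4 vertices, chosen purely as an example, assuming NumPy) confirms that the Fiedler vector is feasible and attains $\lambda_2$, while randomly sampled feasible vectors never beat it:

```python
import numpy as np

# Laplacian L = D - A of the path graph on 4 vertices (illustrative choice).
Adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
L = np.diag(Adj.sum(axis=1)) - Adj

w, V = np.linalg.eigh(L)
lambda2, fiedler = w[1], V[:, 1]      # Fiedler eigenvalue and eigenvector

def rayleigh(x):
    return x @ L @ x / (x @ x)

# The Fiedler vector is feasible (1^T x = 0) and attains lambda_2 ...
print(np.isclose(fiedler.sum(), 0.0), np.isclose(rayleigh(fiedler), lambda2))

# ... and every other feasible vector gives a quotient at least lambda_2.
rng = np.random.default_rng(1)
for _ in range(1000):
    x = rng.standard_normal(4)
    x -= x.mean()                     # project onto {x : 1^T x = 0}
    assert rayleigh(x) >= lambda2 - 1e-9
```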

Another answer:

We assume $\lambda_1 \leq \lambda_2 \leq \cdots \leq \lambda_n.$

Let $R_L(x)=\dfrac{x^TLx}{x^Tx}$ be the Rayleigh quotient associated with the symmetric matrix $L$.

Initial remark: if $P$ is invertible, $P^{-1}LP$ and $L$ have the same spectrum; in particular they have the same second eigenvalue $\lambda_2$.

The min-max theorem relates $\lambda_k$ to the $(n-k+1)$-dimensional subspaces in the following way: \begin{equation}\tag{1} \lambda_k=\max_{\dim(U)=n-k+1} \ \min_{x \in U,\ x \neq 0} R_L(x) \end{equation}

and applying it to the case $k=2$,

\begin{equation}\tag{2} \lambda_2=\max_{\dim(U)=n-1} \ \min_{x \in U,\ x \neq 0} R_L(x) \end{equation}
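Formula (2) can be explored numerically: every hyperplane yields an inner minimum at most $\lambda_2$, and the hyperplane normal to $\mathbb{1}$ attains it. A sketch, assuming NumPy and using a path-graph Laplacian as the example matrix:

```python
import numpy as np

# Path-graph Laplacian on 5 vertices (an assumed example).
n = 5
Adj = np.zeros((n, n))
idx = np.arange(n - 1)
Adj[idx, idx + 1] = Adj[idx + 1, idx] = 1.0
L = np.diag(Adj.sum(axis=1)) - Adj
lambda2 = np.linalg.eigvalsh(L)[1]

def inner_min(v):
    """min of R_L over the hyperplane {x : v^T x = 0}: the smallest
    eigenvalue of L restricted to an orthonormal basis of that hyperplane."""
    Q, _ = np.linalg.qr(np.column_stack([v, np.eye(n)]))
    B = Q[:, 1:]                  # orthonormal basis of the hyperplane
    return np.linalg.eigvalsh(B.T @ L @ B).min()

rng = np.random.default_rng(2)
vals = [inner_min(rng.standard_normal(n)) for _ in range(200)]
print(all(val <= lambda2 + 1e-9 for val in vals))   # no hyperplane beats lambda_2
print(np.isclose(inner_min(np.ones(n)), lambda2))   # v = 1 attains the max
```

The QR factorization is just a convenient way to obtain an orthonormal basis of the hyperplane $\{x : v^Tx=0\}$: its first column spans $v$, and the remaining columns span the orthogonal complement.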

But we know that each hyperplane (i.e., each $(n-1)$-dimensional subspace $U$) has an equation $x^Tv=0$ for some nonzero normal vector $v$, which we may assume has norm $\sqrt{n}$ (this will be useful later).

Thus (2) can be written under the form:

\begin{equation} \tag{3} \lambda_2=\max_{v \in S} \ \underbrace{\min_{x^Tv=0,\ x \neq 0} R_L(x)}_{\text{int.}} \end{equation}

where $S$ is the sphere of radius $\sqrt{n}$ in $\mathbb{R}^n$.

Here comes an essential step that I had some difficulty finding.

Let us introduce the Householder reflection $H$ (which depends on $v$ and should properly be denoted $H_v$) that sends $v$ to $\mathbb{1}$, the vector with all coordinates equal to $1$; this is possible because $v$ and $\mathbb{1}$ have the same norm $\sqrt{n}$.

\begin{equation}\tag{4} Hv=\mathbb{1} \ \ \ \iff \ \ \ v=H^{-1}\mathbb{1} \end{equation}

As $H$ is an orthogonal transformation ($H^{-1}=H^T$), it is an isometry and, in particular, a bijection; thus, as $y$ ranges over $\mathbb{R}^n$, $x=Hy$ ranges over all of $\mathbb{R}^n$ as well. Moreover, a Householder reflection is a symmetric involution: $H=H^T=H^{-1}$.
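The reflection $H$ can be built explicitly as $H = I - 2uu^T/(u^Tu)$ with $u = v - \mathbb{1}$, the standard Householder construction. A sketch assuming NumPy (the helper name is mine):

```python
import numpy as np

def householder_to_ones(v):
    """Householder reflection H with H @ v = 1 (the all-ones vector),
    valid when ||v|| = sqrt(n) = ||1||.  (Hypothetical helper name.)"""
    n = v.size
    u = v - np.ones(n)
    if np.allclose(u, 0):
        return np.eye(n)          # v is already the all-ones vector
    return np.eye(n) - 2.0 * np.outer(u, u) / (u @ u)

rng = np.random.default_rng(3)
n = 5
v = rng.standard_normal(n)
v *= np.sqrt(n) / np.linalg.norm(v)   # put v on the sphere of radius sqrt(n)

H = householder_to_ones(v)
print(np.allclose(H @ v, np.ones(n)))      # H v = 1
print(np.allclose(H @ H.T, np.eye(n)))     # H is orthogonal
print(np.allclose(H, H.T))                 # and a symmetric involution
```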

This allows us to rewrite the quantity "int." in (3), via the substitution $x=Hy$, as

\begin{equation}\tag{5} \min_{(Hy)^Tv=0,\ y \neq 0} \dfrac{(Hy)^TL(Hy)}{(Hy)^T(Hy)} \end{equation}

which is equal to \begin{equation}\tag{6} \min_{y^T(H^Tv)=0,\ y \neq 0} \dfrac{y^T(H^TLH)y}{y^T(H^TH)y} \end{equation}

i.e., finally, since $H$ is symmetric ($H^Tv = Hv = \mathbb{1}$) and orthogonal ($H^TH=I$),

\begin{equation}\tag{7} \min_{y^T\mathbb{1}=0,\ y \neq 0} \dfrac{y^T L' y}{y^Ty} \end{equation}

where $L':=H^TLH=H^{-1}LH$ has the same spectrum as $L$, and in particular the same second eigenvalue (see the initial remark above).

Taking the max over $v$ now reduces to ... nothing, because $v$ no longer varies: in (7) the constraint set $\{y : \mathbb{1}^Ty=0,\ y \neq 0\}$ is the same for every $v$, and $L'$ always has the same second eigenvalue as $L$. Hence $\lambda_2 = \min_{y \neq 0,\ \mathbb{1}^Ty=0} \dfrac{y^T L y}{y^Ty}$, as claimed.
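The key fact used above, that $L' = H^TLH$ shares the spectrum of $L$, is easy to confirm numerically. A sketch assuming NumPy, with a path-graph Laplacian and a random $v$ of norm $\sqrt{n}$ (both choices are illustrative assumptions):

```python
import numpy as np

# Path-graph Laplacian on 5 vertices and a random v with ||v|| = sqrt(n).
n = 5
Adj = np.zeros((n, n))
idx = np.arange(n - 1)
Adj[idx, idx + 1] = Adj[idx + 1, idx] = 1.0
L = np.diag(Adj.sum(axis=1)) - Adj

rng = np.random.default_rng(4)
v = rng.standard_normal(n)
v *= np.sqrt(n) / np.linalg.norm(v)

# Householder reflection with H v = 1, as in the text.
u = v - np.ones(n)
H = np.eye(n) - 2.0 * np.outer(u, u) / (u @ u)

Lp = H.T @ L @ H                          # L' = H^T L H = H^{-1} L H

# Orthogonally similar matrices share their (sorted) spectrum,
# in particular the second eigenvalue lambda_2.
print(np.allclose(np.linalg.eigvalsh(Lp), np.linalg.eigvalsh(L)))
print(np.isclose(np.linalg.eigvalsh(Lp)[1], np.linalg.eigvalsh(L)[1]))
```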