Non-negative diagonal perturbation of Laplacian matrix


In this previous question it is stated that, given the weighted Laplacian $L$ of a connected undirected graph, *it's well known that if you add a small positive (resp. negative) amount to any diagonal element of $L$, the zero eigenvalue is pushed into the right (resp. left) half plane*.

I tried to find this result in the literature, and even though I found quite a lot of papers on perturbed Laplacian matrices, I still did not find a way to prove the statement in italics above. The definition of a perturbed Laplacian matrix is not related to 'typical perturbation theory', in case this creates any confusion. You can refer here for a definition of the perturbed Laplacian.

Is the property stated in the question general to any matrix with zero-sum rows? Is it valid for any singular matrix? How would I approach such a problem in general? Also, does it hold for any positive value, or does the perturbation have to be sufficiently small?

Thank you!

---------------------- UPDATE -------------------

What happens if, instead of $L$, we consider $L+\gamma I$ where $I$ is the identity matrix? How can I quantify the effect of perturbing just one diagonal entry? In the case $\gamma=0$ (the question above) it is enough to know that we can 'move' the $0$ eigenvalue to the left or to the right with a minimal diagonal perturbation. But how can we quantify this effect?

There are 3 answers below.

BEST ANSWER

Steph's approach is the way I would think about quantifying how matrix perturbations affect the matrix eigenvalues. I want to point out, though, that your specific highlighted statement is a consequence of just elementary properties of semidefinite matrices; namely, if $A$ and $B$ are positive-semidefinite then $$\ker (A+B) = \ker A \cap \ker B$$ which you can see by probing $v^T(A+B)v$.

Assuming you know that the kernel of $L$ is one-dimensional (spanned by the constant vector $\mathbf{1}$) for a connected graph, it follows immediately that $L+D$ is (strictly) positive-definite for any nonzero, non-negative diagonal matrix $D$, and so the zero eigenvalue becomes positive. Also, since $$\mathbf{1}^T(L-D)\mathbf{1} = -\mathbf{1}^T D\mathbf{1} < 0$$ you know that $L-D$ has at least one negative eigenvalue. Note, though, that this simple analysis isn't enough to tell you that there is only one negative eigenvalue when $D$ is a sufficiently small perturbation.
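A quick numerical sanity check of this argument with `numpy`, using an assumed 4-node path graph (any connected graph would do):

```python
import numpy as np

# Assumed example: path graph on 4 vertices (any connected graph works).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A        # graph Laplacian: PSD, ker L = span{1}

# Nonzero, non-negative diagonal perturbation.
D = np.diag([0.1, 0.0, 0.0, 0.0])

eig_plus = np.linalg.eigvalsh(L + D)   # eigenvalues, sorted ascending
eig_minus = np.linalg.eigvalsh(L - D)

print(eig_plus.min())    # positive: L + D is strictly positive-definite
print(eig_minus.min())   # negative: L - D has at least one negative eigenvalue
```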

ANSWER

Let us write $\mathbf{L}=\sum_k \lambda_k \mathbf{u}_k \mathbf{u}_k^T$ with orthonormal eigenvectors $\mathbf{u}_k$. It follows that $\lambda_i = \mathbf{u}_i^T \mathbf{L} \mathbf{u}_i$ and $d\lambda_i = \mathbf{u}_i\mathbf{u}_i^T:d\mathbf{L}$. The colon operator denotes the Frobenius inner product here.

The eigenvector associated to the zero eigenvalue is proportional to the vector of ones $\mathbf{e}$, i.e. $\mathbf{Le}=0\cdot \mathbf{e}$; the normalized eigenvector is $\mathbf{u}_0=\mathbf{e}/\sqrt{n}$.

For this particular eigenvalue, $d\lambda_0 = \frac{1}{n}\,\mathbf{J}:d\mathbf{L}$, where $\mathbf{J}=\mathbf{e}\mathbf{e}^T$ is populated with ones.

If you perturb one element of the diagonal of $\mathbf{L}$ by $h$, we can conclude $d\lambda_0 = h/n$. Thus if $h>0$, the (zero) eigenvalue will increase (to first order in $h$).
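This first-order prediction is easy to check numerically; here is a sketch with `numpy` on an assumed 4-node path graph, perturbing the first diagonal entry by a small $h$:

```python
import numpy as np

# Assumed example: path graph on 4 vertices; L has a simple zero eigenvalue.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
n = L.shape[0]

h = 1e-6                           # small perturbation of one diagonal entry
Lp = L.copy()
Lp[0, 0] += h

lam0 = np.linalg.eigvalsh(Lp)[0]   # the eigenvalue that was 0 before
# First-order theory predicts lam0 ~ h/n (here h/4); print the discrepancy.
print(abs(lam0 - h / n))
```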

ANSWER

Idea: take $L$ and use very simple (i.e. diagonal-matrix) congruence and similarity transforms to turn this into a result about stochastic matrices, which are especially easy to work with; e.g. multiplication by a positive diagonal matrix does not change irreducibility. The same technique can be used to prove, e.g., that $\dim \ker L$ gives the number of connected components (because the algebraic multiplicity of the Perron root of a stochastic matrix counts exactly that).


$L=D-A$,
where $A$ is the adjacency matrix of your connected graph and $D$ the diagonal degree matrix (with positive diagonal, since the graph is connected). Now effect a congruence transform with $D^{-1/2}$:
$D^{-1/2}\big(D-A\big)D^{-1/2}=I - D^{-1/2}AD^{-1/2} = I-B$,
and congruence of course preserves rank.

$B$ is an irreducible, real, non-negative matrix with a single Perron root $\lambda = 1$. Writing $\Gamma := \text{diag}(\mathbf v)$, where $\mathbf v$ is the Perron vector of $B$, we see that $S := \Gamma^{-1} B \Gamma$ is a stochastic matrix.
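The construction up to this point can be sketched numerically with `numpy` (assumed 4-node path graph):

```python
import numpy as np

# Assumed example: path graph on 4 vertices.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
d = A.sum(axis=1)                             # degrees (diagonal of D)
B = np.diag(d**-0.5) @ A @ np.diag(d**-0.5)   # B = D^{-1/2} A D^{-1/2}

w, V = np.linalg.eigh(B)               # eigenvalues in ascending order
v = np.abs(V[:, -1])                   # Perron vector (for the largest eigenvalue)
Gamma = np.diag(v)
S = np.linalg.inv(Gamma) @ B @ Gamma   # similarity by diag(Perron vector)

print(w[-1])          # Perron root of B: equals 1
print(S.sum(axis=1))  # every row sums to 1, so S is stochastic
```

(In fact this $S$ comes out as the familiar random-walk matrix $D^{-1}A$.)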

Now if you change some diagonal component of $D$ (WLOG the first component) -- by any amount, so long as the diagonal stays positive -- this is equivalent to replacing $D$ by $\Sigma D$, where

$\Sigma =\begin{bmatrix} \alpha & \mathbf 0\\ \mathbf 0 & I_{n-1} \end{bmatrix}$ with $\alpha \in (0,1)$ to decrement and $\alpha \gt 1$ to increment.


side note: if for some reason we wanted to consider the case where $D$ had a non-positive diagonal element it would immediately follow that the 'Laplacian' was no longer PSD, giving the result.


essentially running through the same steps as before:

$L\to L'=\Sigma D - A$, and the congruence transform now gives us
$I -\Sigma^{-1/2} D^{-1/2}AD^{-1/2}\Sigma^{-1/2} = I-\Sigma^{-1/2} B\Sigma^{-1/2}=I-B'$
$S':= \Gamma^{-1}B' \Gamma= \Gamma^{-1}\Big(\Sigma^{-1/2} B\Sigma^{-1/2}\Big) \Gamma=\Sigma^{-1/2}\Big( \Gamma^{-1} B \Gamma\Big)\Sigma^{-1/2} = \Sigma^{-1/2} S \Sigma^{-1/2}$
(since diagonal matrices commute). By design $S'$ is no longer stochastic, and we can exploit this. In particular, if $\alpha \in (0,1)$ then the row sum of the first row increases and the row sum does not decrease in any other row. And if $\alpha \gt 1$ then the first row sum decreases and the row sum does not increase in any other row.

Finally Perron-Frobenius theory tells us that for an irreducible non-negative matrix
$\text{min row sum }S' \leq \text{Perron root }S' \leq \text{max row sum }S'$
and both inequalities are strict unless $\text{min row sum } = \text{max row sum }$ (which was the case with stochastic $S$ and is not the case with $S'$). Thus we recover Perron root $\lambda \gt 1$ for $\alpha \in(0,1)$ and Perron root $\lambda \in (0,1)$ for $\alpha \gt 1$. This is preserved under the similarity transform back to $B'$, and of course the mapping to $I-B'$ is $\lambda \to 1-\lambda$; congruent matrices have the same signature, which gives the result for $L'$.
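The end-to-end claim can be sketched numerically with `numpy` (assumed 4-node path graph): scale the first diagonal entry of $D$ by $\alpha$ and watch the sign of the smallest eigenvalue of $L' = \Sigma D - A$.

```python
import numpy as np

# Assumed example: path graph on 4 vertices.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
D = np.diag(A.sum(axis=1))

def smallest_eig(alpha):
    """Smallest eigenvalue of L' = Sigma @ D - A, Sigma = diag(alpha, 1, ..., 1)."""
    Sigma = np.diag([alpha] + [1.0] * (len(D) - 1))
    return np.linalg.eigvalsh(Sigma @ D - A)[0]

print(smallest_eig(1.0))   # ~0: the unperturbed Laplacian is singular
print(smallest_eig(1.5))   # positive: incrementing a diagonal entry makes L' PD
print(smallest_eig(0.5))   # negative: decrementing pushes an eigenvalue left
```

Note $\Sigma D - A$ stays symmetric because $\Sigma D$ is diagonal, so `eigvalsh` applies.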