Proof that -1 can't be an eigenvalue


The problem arises from the average consensus algorithm implemented in a network with multiple agents; the relevant background is matrix theory and graph theory. I need to know whether the eigenvalue of the following matrix can be $-1$ or not.

Take any $r \times r$ square matrix $D$ with entries defined as follows.

$$D =\begin{pmatrix} 1 - \sum\limits_{j \in N_1} d_{1j} & \cdots & d_{1i} & \cdots & d_{1r}\\ \vdots & & \vdots & & \vdots\\ d_{i1} & \cdots & 1 - \sum\limits_{j \in N_i} d_{ij} & \cdots & d_{ir}\\ \vdots & & \vdots & & \vdots\\ d_{r1} & \cdots & d_{ri} & \cdots & 1 - \sum\limits_{j \in N_r} d_{rj} \end{pmatrix}$$ The entries of $D$ are determined as $$ d_{ij} = \begin{cases} 2/(g_i + g_j + 1), & j \in N_i,\\[2pt] 1 - \sum\limits_{j \in N_i} 2/(g_i + g_j + 1), & i = j,\\[2pt] 0, & \text{otherwise,} \end{cases} $$ where $g_i$ and $g_j$ are the node degrees, i.e. the numbers of agents connected to agents $i$ and $j$, respectively, and $N_i$ is the set of neighbor agents connected to agent $i$. Each non-zero off-diagonal element of $D$ is the weight of the edge between its two incident vertices, and each diagonal element is the self-weight of a vertex.

The sum of the entries in each row and in each column is $1$; with $e = [1,1,\ldots,1]^T$,
$$De = e, \qquad e^T D = e^T.$$
Note that the diagonal elements of $D$ can be negative, which means $D$ is not a doubly stochastic matrix.
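As a quick numerical sanity check, one can build $D$ for a small example graph and verify these identities. The star graph below is an assumed example, chosen because its center node has high degree and therefore gets a negative diagonal entry:

```python
import numpy as np

# Star graph on 5 nodes: center 0 joined to leaves 1..4 (an assumed example).
edges = [(0, 1), (0, 2), (0, 3), (0, 4)]
r = 5

g = np.zeros(r, dtype=int)              # node degrees g_i
for i, j in edges:
    g[i] += 1
    g[j] += 1

D = np.zeros((r, r))
for i, j in edges:                      # off-diagonal weights d_ij = 2/(g_i + g_j + 1)
    D[i, j] = D[j, i] = 2.0 / (g[i] + g[j] + 1)
for i in range(r):                      # diagonal: 1 minus the row's off-diagonal sum
    D[i, i] = 1.0 - (D[i].sum() - D[i, i])

e = np.ones(r)
print(np.allclose(D @ e, e), np.allclose(e @ D, e))  # row and column sums are 1
print(D[0, 0])                          # center diagonal: 1 - 4*(1/3) = -1/3 < 0
print(np.sort(np.linalg.eigvalsh(D)))   # eigenvalues -2/3, 2/3 (x3), 1 -- none is -1
```

Even though this $D$ has a negative diagonal entry, its smallest eigenvalue is $-2/3$, strictly above $-1$.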

Can $-1$ be an eigenvalue of $D$ or not?


There are 3 answers below.

BEST ANSWER

Suppose $Ax=-x$ for a matrix $A$ with positive entries and unit row sums, and let $x_k = \min\{x_1, x_2,\dots, x_n\}$, $x_l = \max\{x_1, x_2,\dots, x_n\}$. Then $$ -x_k = \sum_j A_{k,j}x_j \le x_l\sum_j A_{k,j} = x_l \;\;\Rightarrow\;\; x_l+x_k\ge 0, \\ -x_l = \sum_j A_{l,j}x_j \ge x_k\sum_j A_{l,j} = x_k \;\;\Rightarrow\;\; x_l+x_k\le 0, $$ so $x_k+x_l = 0$, and thus $x_l \ge 0$ and $x_k = -x_l$. But then equality holds in the second line: $$ \sum_j A_{l,j}x_j = x_k\sum_j A_{l,j}. $$ Since every $A_{l,j} > 0$, this forces $x_j = x_k$ for all $j$; in particular $x_l = x_k$, which together with $x_k + x_l = 0$ gives $x = 0$. Hence $-1$ has no eigenvector, so it is not an eigenvalue.
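The argument above assumes every entry of $A$ is positive with unit row sums. A small randomized check under that assumption (hypothetical test matrices, not the specific $D$ from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
for _ in range(100):
    # Random strictly positive matrix, rows rescaled to sum to 1.
    A = rng.uniform(0.1, 1.0, size=(6, 6))
    A /= A.sum(axis=1, keepdims=True)
    ev = np.linalg.eigvals(A)
    # No eigenvalue is -1; the spectral radius is 1 (attained by the eigenvalue 1).
    assert np.min(np.abs(ev + 1)) > 1e-8
    assert np.isclose(np.max(np.abs(ev)), 1.0)
print("no eigenvalue -1 in 100 random positive row-stochastic matrices")
```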


Here you can use the Gershgorin disc theorem.

Suppose your matrix is $n \times n$, and the elements of the $i$th row are $(a_{i1}, a_{i2}, \ldots, a_{in})$. The conditions given are that all $a_{ij} > 0$ and $a_{i1}+a_{i2}+ \cdots + a_{in}=1$ for all $ 1 \leq i \leq n$.

Now applying the Gershgorin disc theorem, you get the range of the eigenvalues as follows. For every eigenvalue $\lambda$, there exists an $i$ with $1 \leq i \leq n$ such that \begin{align} |\lambda - a_{ii}| \leq \sum_{j \neq i} a_{ij}. \end{align} Then $\lambda \leq a_{ii}+ \sum_{j \neq i} a_{ij}$ and $\lambda \geq a_{ii}- \sum_{j \neq i} a_{ij}$. Thus $\lambda \leq 1$ and $\lambda \geq K_i$, where $$K_i =a_{ii}- \sum_{j \neq i} a_{ij}=2a_{ii}- \sum_{j} a_{ij} =2a_{ii}-1 > -1 \text{ since } a_{ii}>0.$$ Therefore, clearly $K_i$ cannot be $-1$.

So every eigenvalue of this matrix satisfies $K_i \leq \lambda \leq 1$ for some $i$, where every $K_i$ is greater than $-1$. In this way, you can show that no eigenvalue of this matrix can be $-1$.
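A numerical illustration of the bound $K_i = 2a_{ii} - 1$, using a hypothetical positive row-stochastic matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.uniform(0.05, 1.0, size=(5, 5))
A /= A.sum(axis=1, keepdims=True)       # a_ij > 0, rows sum to 1

K = 2 * np.diag(A) - 1                  # K_i = a_ii - sum_{j != i} a_ij = 2 a_ii - 1
ev = np.linalg.eigvals(A)
re = ev.real

print(K.min() > -1)                     # every K_i exceeds -1 since a_ii > 0
print(re.min() >= K.min() - 1e-12)      # real parts respect the union of Gershgorin discs
```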


Perron-Frobenius theorem

The positive version of the Perron-Frobenius theorem says that a positive matrix has a unique real eigenvalue equal to its spectral radius, and that this is the only eigenvalue with modulus equal to the spectral radius.

So all you have to show is that the spectral radius of a positive matrix whose rows sum to one is $1$.

Note that the column vector of ones, $v$, is a right eigenvector for the eigenvalue $1$, since $Av=v$ because the rows sum to one. Again you can cite Perron-Frobenius, which says the Perron eigenvector is the only eigenvector (up to scaling) with all positive entries. So $v$ is the Perron eigenvector, and the corresponding eigenvalue $1$ is the unique eigenvalue whose modulus equals the spectral radius; in particular, $-1$ cannot be an eigenvalue.
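A sketch of this conclusion for a hypothetical positive matrix with unit row sums:

```python
import numpy as np

# A small positive matrix whose rows sum to one (an assumed example).
A = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

v = np.ones(3)
print(np.allclose(A @ v, v))            # v is a right eigenvector for eigenvalue 1

ev = np.linalg.eigvals(A)
rho = np.max(np.abs(ev))                # spectral radius
print(np.isclose(rho, 1.0))
# Exactly one eigenvalue attains modulus rho, and it is 1 itself, so -1 is excluded.
print(np.sum(np.isclose(np.abs(ev), rho)))
```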