Max and min eigenvalues of the "normalized" adjacency matrix of a path


Setup: Let $A$ be the $n \times n$ adjacency matrix of a path with $n$ nodes, so the $ij^\text{th}$ element of $A$ is $a_{ij} = 1(|i-j|=1)$. I.e. the elements just off the main diagonal are $1$, and everything else is $0$.

Now define $B$ as equal to $A$, but with row sums normalized to equal $1$. I.e., $b_{ij} = a_{ij} / a_{i+}$, where $a_{i+}$ is the $i^\text{th}$ row sum of $A$. Here are the matrices:

$$ A = \begin{bmatrix} 0&1 \\ 1&0&1&& \mathbf 0 \\ &1&0& \ddots \\ &&\ddots&\ddots &1& \\ & \mathbf 0&&1&0&1 \\ &&&&1&0 \end{bmatrix},\,\,\,\, B = \begin{bmatrix} 0&1 \\ 1/2&0&1/2&& \mathbf 0 \\ &1/2&0& \ddots \\ &&\ddots&\ddots &1/2& \\ & \mathbf 0&&1/2&0&1/2 \\ &&&&1&0 \end{bmatrix}. $$

Note that the first and last rows have one non-zero element (since the nodes at the ends of the path have only one neighbor), while the middle rows have two non-zero elements (since the middle nodes have a neighbor on each side).

My question: Can you prove that the max and min eigenvalues of $B$ are $1$ and $-1$, respectively, for any $n \ge 2$? I've checked that this is true (with a computer program) for all $n$ from 2 to 500. So I'm pretty confident that it's true in general, but I'm wondering if there's a proof.
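For reference, here is a minimal NumPy version of the check described above (the helper name `normalized_path_adjacency` is mine):

```python
import numpy as np

def normalized_path_adjacency(n):
    """Row-normalized adjacency matrix B of a path with n nodes."""
    A = np.zeros((n, n))
    i = np.arange(n - 1)
    A[i, i + 1] = 1.0   # superdiagonal
    A[i + 1, i] = 1.0   # subdiagonal
    return A / A.sum(axis=1, keepdims=True)

# Verify that the extreme eigenvalues are +1 and -1 for a range of n.
for n in range(2, 51):
    eig = np.real(np.linalg.eigvals(normalized_path_adjacency(n)))
    assert np.isclose(eig.max(), 1.0)
    assert np.isclose(eig.min(), -1.0)
print("max/min eigenvalues are +1/-1 for n = 2..50")
```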

Context: The reason I care about this is that $B$ appears in a statistical model I'm using (a CAR spatial model), and the max and min eigenvalues of $B$ determine the parameter space for a parameter in the model.

This is clearly closely related to the following asked-and-answered question: Computing eigenvalue of the adjacency matrix of a path. There they wanted all eigenvalues of $A$, whereas I want the max and min eigenvalues of $B$.

Also, although this might not be helpful, I made a plot of all $n$ eigenvalues of $B$ for $n = 2,\ldots,16$ (plot not included here).


There are 2 answers below.

BEST ANSWER

The row sums of your matrix are $1$, so $B\mathbf{1} = \mathbf{1}$ and $1$ is an eigenvalue; by Geršgorin's theorem every eigenvalue has modulus at most $1$, so $1$ is the largest. There is a diagonal matrix $D$ such that $DBD^{-1}$ is symmetric (exercise), so all eigenvalues of $B$ are real. The matrix $B$ is a weighted adjacency matrix of a bipartite graph, so its spectrum is symmetric about zero, whence $-1$ is also an eigenvalue.
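The similarity in the exercise can be checked numerically: taking $D = \operatorname{diag}(\sqrt{d_i})$, where $d_i$ is the degree of node $i$, makes $DBD^{-1}$ symmetric (a sketch, with my own variable names):

```python
import numpy as np

n = 7
A = np.zeros((n, n))
i = np.arange(n - 1)
A[i, i + 1] = A[i + 1, i] = 1.0
deg = A.sum(axis=1)          # degrees: 1 at the endpoints, 2 in the middle
B = A / deg[:, None]         # row-normalized adjacency matrix

# With D = diag(sqrt(deg)), D B D^{-1} = deg^{-1/2} A deg^{-1/2} is symmetric.
D = np.diag(np.sqrt(deg))
S = D @ B @ np.linalg.inv(D)
assert np.allclose(S, S.T)   # symmetric, hence B has real eigenvalues

# Bipartiteness of the path: the spectrum is symmetric about zero.
eig = np.sort(np.linalg.eigvalsh(S))
assert np.allclose(eig, -eig[::-1])
```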

SECOND ANSWER

Ok, after a lot of brainstorming, I came up with a proof. The advantages are that it's very constructive, gives a bit of an algorithmic picture of what goes on with the eigenvalues, and uses nothing abstract (no graph theory!). The disadvantages? Well... it's pretty dang technical and inefficient--you may even go so far as to say that this is "brute force". The first answer gives the more "eloquent use of pretty theorems" approach.

Let $B_n$ denote the normalized $n \times n$ adjacency matrix. We will begin by proving the following claims.

Claim 1. Given an eigenvalue $\lambda$ of $B_n$ and an eigenvector $\left[\begin{matrix} a_1 \\ \vdots \\ a_n \end{matrix}\right]$, we have $a_2=\lambda a_1$, $a_{n-1}=\lambda a_n$ and $a_k=2\lambda a_{k-1}-a_{k-2}$ for $2<k \leq n$.

Proof. Observe the recursive pattern \begin{align*} \left[\begin{matrix} a_2 \\ \frac{a_1}{2}+\frac{a_3}{2} \\ \vdots \\ \frac{a_{n-2}}{2}+\frac{a_n}{2} \\ a_{n-1} \end{matrix}\right]=B_n\left[\begin{matrix} a_1 \\ a_2 \\ \vdots \\ a_{n-1} \\ a_n \end{matrix}\right]=\lambda \left[\begin{matrix} a_1 \\ a_2 \\ \vdots \\ a_{n-1} \\ a_n \end{matrix}\right]. \quad \square \end{align*}

Claim 2. Given an eigenvalue $\lambda$ of $B_n$, define \begin{align*} p_k(\lambda)=\begin{cases} 1, & \text{if $k=1$,} \\ \lambda, & \text{if $k=2$,} \\ 2\lambda p_{k-1}(\lambda)-p_{k-2}(\lambda), & \text{if $k>2$.} \end{cases} \end{align*} We find $\left[\begin{matrix} p_1(\lambda) \\ \vdots \\ p_n(\lambda) \end{matrix}\right]$ is an eigenvector.

Proof. Choose an eigenvector $\left[\begin{matrix} a_1 \\ \vdots \\ a_n \end{matrix}\right]$ of $\lambda$. Using Claim 1, one can show by induction that $\left[\begin{matrix} a_1 \\ \vdots \\ a_n \end{matrix}\right]=a_1 \left[\begin{matrix} p_1(\lambda) \\ \vdots \\ p_n(\lambda) \end{matrix}\right]$. Since an eigenvector is non-zero, $a_1$ must be non-zero, and our conclusion follows. $\quad \square$
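A quick numerical sanity check of Claims 1-2: transcribe the recursion directly and confirm that $[p_1(\lambda),\ldots,p_n(\lambda)]$ is an eigenvector for each eigenvalue of $B_n$ (the helper name `p_vec` is mine):

```python
import numpy as np

def p_vec(lam, n):
    """[p_1(lam), ..., p_n(lam)] via the recursion in Claim 2 (n >= 2)."""
    p = [1.0, lam]                       # p_1 = 1, p_2 = lam
    for _ in range(n - 2):
        p.append(2 * lam * p[-1] - p[-2])  # p_k = 2*lam*p_{k-1} - p_{k-2}
    return np.array(p)

n = 6
A = np.zeros((n, n))
i = np.arange(n - 1)
A[i, i + 1] = A[i + 1, i] = 1.0
B = A / A.sum(axis=1, keepdims=True)

for lam in np.real(np.linalg.eigvals(B)):
    v = p_vec(lam, n)
    assert np.allclose(B @ v, lam * v)   # v is indeed an eigenvector for lam
```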

Claim 3. $\lambda$ is an eigenvalue of $B_n$ if and only if $\lambda$ is a solution to $0=\lambda p_n(\lambda)-p_{n-1}(\lambda)$. (More generally, $\lambda p_n(\lambda)-p_{n-1}(\lambda)$ is, up to a scalar multiple, the characteristic polynomial of $B_n$.)

Proof. $\implies$ Follows from applying Claim 1 to the eigenvector $\left[\begin{matrix} p_1(\lambda) \\ \vdots \\ p_n(\lambda) \end{matrix}\right]$ obtained from Claim 2 and getting $p_{n-1}(\lambda)=\lambda p_n(\lambda)$.

$\impliedby$ Follows from the fact that \begin{align*} \left[\begin{matrix} p_2(\lambda) \\ \frac{p_1(\lambda)+p_3(\lambda)}{2}\\ \vdots \\ \frac{p_{n-2}(\lambda)+p_n(\lambda)}{2} \\ p_{n-1}(\lambda) \end{matrix}\right]=B_n \left[\begin{matrix} p_1(\lambda) \\ p_2(\lambda) \\ \vdots \\ p_{n-1}(\lambda) \\ p_n(\lambda) \end{matrix}\right]=\lambda \left[\begin{matrix} p_1(\lambda) \\ p_2(\lambda) \\ \vdots \\ p_{n-1}(\lambda) \\ p_n(\lambda) \end{matrix}\right], \end{align*} by our converse hypothesis (the hypothesis $p_{n-1}(\lambda)=\lambda p_n(\lambda)$ is applied to the $n$th entry in particular). $\quad \square$
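Claim 3 can likewise be spot-checked numerically: evaluate $f(\lambda)=\lambda p_n(\lambda)-p_{n-1}(\lambda)$ via the recursion and confirm that every eigenvalue of $B_n$, including $\pm 1$, is a root (the function name `f` is mine):

```python
import numpy as np

def f(lam, n):
    """f(lam) = lam * p_n(lam) - p_{n-1}(lam), via the Claim 2 recursion."""
    p_prev, p_cur = 1.0, lam                 # p_1, p_2
    for _ in range(n - 2):
        p_prev, p_cur = p_cur, 2 * lam * p_cur - p_prev
    return lam * p_cur - p_prev              # lam * p_n - p_{n-1}

n = 8
A = np.zeros((n, n))
i = np.arange(n - 1)
A[i, i + 1] = A[i + 1, i] = 1.0
B = A / A.sum(axis=1, keepdims=True)

# Every eigenvalue of B_n is a root of f, and +1, -1 are exact roots.
for lam in np.real(np.linalg.eigvals(B)):
    assert abs(f(lam, n)) < 1e-8
assert abs(f(1.0, n)) < 1e-12
assert abs(f(-1.0, n)) < 1e-12
```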

Now to prove your question:

Final Proof. By Claim 3 it suffices to show that all real solutions to $0=\lambda p_n(\lambda)-p_{n-1}(\lambda)$ lie in $[-1, 1]$. It is easy to verify from the recursive construction that $\lambda = 1$ and $\lambda = -1$ are solutions for all $n \geq 2$. Next, we check that all solutions lie in $[-1, 1]$. This is done by using calculus and the recursion to show that $\lambda p_n(\lambda)-p_{n-1}(\lambda)$ never vanishes outside of $[-1, 1]$. $\quad \square$

The final step is rather inconvenient, but I believe it's doable, since $\lambda p_n(\lambda)-p_{n-1}(\lambda)$ turns out to be monotonic outside of $[-1, 1]$. I'll leave it to you, unless you find it too hard or tedious and would rather have me cook it up.

UPDATE: Corrected my mistake on Claim 3. Should have written "solution", not "nonzero solution".