Will the reduced matrix always have an eigenvector orthogonal to the vector of all ones?


Suppose $G$ is a simple graph on $n$ vertices with a vertex $w$ of degree $n-1$, and let $A$ be the corresponding adjacency matrix. If we remove the row and column corresponding to $w$, and replace each off-diagonal $0$ with a $2$, is it true that the reduced matrix will always have an eigenvector orthogonal to the vector of all ones?

EDIT: The matrix one gets is the distance matrix of the graph, with one row and one column deleted.

I feel it is true. I tried a few examples and it holds in all of them, but I have no idea how to prove it. Please help.
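For readers who want to reproduce this kind of experiment, here is a minimal NumPy sketch. The example graph (the path $a$-$b$-$c$ together with a dominating vertex $w$) is my own choice for illustration, not one from the post; it builds the reduced matrix described above and checks for an eigenvector orthogonal to the all-ones vector:

```python
import numpy as np

# Example graph G: path a-b-c plus a vertex w adjacent to a, b, c,
# so w has degree n-1 = 3. Vertex order: (a, b, c, w).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [1, 1, 1, 0]])

# Delete the row and column of w, then replace off-diagonal 0s with 2.
B = A[:3, :3].astype(float)
M = B.copy()
M[B == 0] = 2.0
np.fill_diagonal(M, 0.0)

# Look for an eigenvector of M orthogonal to the all-ones vector j.
vals, vecs = np.linalg.eigh(M)  # columns of vecs are orthonormal eigenvectors
j = np.ones(3)
dots = vecs.T @ j
print(dots)  # at least one entry should be numerically zero if the claim holds here
```

In this example $(1,0,-1)$ is an eigenvector of $M$ with eigenvalue $-2$, and it is orthogonal to $j$, so the claim holds for this graph.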

BEST ANSWER

What is the point of the vertex $w$ of degree $n-1$? [EDIT: I see, this gives the interpretation of $M$ as the distance matrix of the original graph.] To any graph $G^{\prime}$ on $n-1$ vertices you can add a vertex $w$ connected to everything to get a graph $G$ on $n$ vertices with this property, so it is better to just start with $G^{\prime}$. Your claim is then that the adjacency matrix $A$ (I'm already considering the row/column removed), with every off-diagonal $0$ replaced by a $2$, has an eigenvector orthogonal to $j$, the all-ones vector.

The matrix you are talking about can be written $M=2J-A-2I$ (where $J$ is the all-ones matrix). Your claim is that $j^{\perp}$ contains an eigenvector of $M$. Note that $\dim(j^{\perp}) = n-2$, while $M$ is a symmetric $(n-1)\times(n-1)$ matrix and hence diagonalizable; that means there is a basis of $\mathbb{R}^{n-1}$ consisting of eigenvectors of $M$. If any eigenspace of $M$ has dimension $>1$, then the claim is definitely true, since that eigenspace must intersect $j^{\perp}$ nontrivially by a dimension count. So if you want a counterexample, you will need to look at graphs for which $M$ has $n-1$ distinct eigenvalues and which are not regular: if the graph is regular, then $M$ has $j$ as an eigenvector, and any eigenvector for another eigenvalue lies in $j^{\perp}$.
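The identity $M=2J-A-2I$ and the remark about regular graphs are easy to sanity-check numerically. In the sketch below, the random graph and the $5$-cycle are illustrative choices of mine, not from the answer; for a $k$-regular graph on $n-1$ vertices, $Mj = 2Jj - Aj - 2j = (2(n-1) - k - 2)\,j$, which is what the second check confirms:

```python
import numpy as np

# Random simple graph G' on n-1 = 5 vertices: symmetric 0/1 matrix, zero diagonal.
rng = np.random.default_rng(0)
n1 = 5
A = rng.integers(0, 2, size=(n1, n1))
A = np.triu(A, 1)
A = A + A.T

# M via the formula 2J - A - 2I ...
J = np.ones((n1, n1))
M = 2 * J - A - 2 * np.eye(n1)

# ... and M built directly: replace off-diagonal zeros of A with 2.
M2 = A.astype(float).copy()
M2[A == 0] = 2.0
np.fill_diagonal(M2, 0.0)
print(np.allclose(M, M2))  # True: the two constructions agree

# Regular case: the 5-cycle is 2-regular, so M j = (2*5 - 2 - 2) j = 6 j.
C = np.roll(np.eye(5), 1, axis=1) + np.roll(np.eye(5), -1, axis=1)
Mc = 2 * np.ones((5, 5)) - C - 2 * np.eye(5)
j = np.ones(5)
print(np.allclose(Mc @ j, 6 * j))  # True: j is an eigenvector of M
```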