There is a lemma that states: "Let $B$ be the incidence matrix of the graph $X$, and let $L$ be the line graph of $X$. Then $B^T B = 2I + A(L)$." (Note: $A(L)$ is the adjacency matrix of $L$.)
Does anyone know how to verify this? Please do so with love.
Consider $B^TB$: its $ij$ entry is the inner product of the $i$th and $j$th columns of $B$.
Each column represents an edge (incident to $2$ vertices), so it has exactly two nonzero entries. Hence the inner product of a column with itself is $2$.
The inner product of two different columns is $1$ if they have a nonzero entry in a common position, i.e. if the two edges share an end vertex, and $0$ otherwise.
Now think about the line graph of the given graph, and the result follows.
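To see the column inner products concretely, here is a small NumPy sketch; the triangle $K_3$ is just an illustrative choice of graph, not anything from the lemma itself:

```python
import numpy as np

# Triangle K3: vertices 0, 1, 2; edges chosen for illustration
edges = [(0, 1), (1, 2), (0, 2)]
n, m = 3, len(edges)

# Incidence matrix B: rows indexed by vertices, columns by edges
B = np.zeros((n, m), dtype=int)
for j, (u, v) in enumerate(edges):
    B[u, j] = 1
    B[v, j] = 1

G = B.T @ B
print(G)
# Diagonal entries are 2 (each column dotted with itself);
# each off-diagonal entry counts endpoints shared by the two edges.
```

For $K_3$ every pair of edges shares a vertex, so all off-diagonal entries come out as $1$, matching $2I + A(L)$ with $L = K_3$.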
Let $n$ be the number of vertices and $m$ the number of edges of the graph $X$. The incidence matrix $B$ of $X$ is the $n \times m$ matrix defined by $$B_{ij} = 1$$ if edge $j$ is incident to vertex $i$, and $B_{ij} = 0$ otherwise.
There are two cases to consider. For the diagonal entries: $$(B^TB)_{kk} = \sum_{i = 1}^n B^T_{ki}B_{ik} = \sum_{i = 1}^n B_{ik}^2$$ Since every edge $k$ is incident to exactly two vertices $i_1, i_2$, the sum on the RHS is $2$.
For the off-diagonal entries ($i \neq j$): $$(B^TB)_{ij} = \sum_{k=1}^n B^T_{ik}B_{kj} = \sum_{k = 1}^n B_{ki}B_{kj}$$ The summand is $1$ if and only if vertex $k$ is incident to both edges $i$ and $j$. Since two distinct edges of a simple graph share at most one vertex, the sum is $1$ exactly when edges $i$ and $j$ have a common end vertex, which is precisely the definition of the entries of the adjacency matrix of the line graph $L$.
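The whole identity can be checked numerically. The sketch below builds $A(L)$ directly from the shared-endpoint definition and asserts $B^TB = 2I + A(L)$; the particular graph (a $4$-cycle with one chord) is an arbitrary test case, not part of the lemma:

```python
import numpy as np

# Arbitrary test graph: a 4-cycle with the chord (0, 2)
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n, m = 4, len(edges)

# Incidence matrix B (vertices x edges)
B = np.zeros((n, m), dtype=int)
for j, (u, v) in enumerate(edges):
    B[u, j] = B[v, j] = 1

# Adjacency matrix of the line graph L: edges i and j are
# adjacent in L iff they share an end vertex in the original graph
AL = np.zeros((m, m), dtype=int)
for i in range(m):
    for j in range(m):
        if i != j and set(edges[i]) & set(edges[j]):
            AL[i, j] = 1

# The lemma: B^T B = 2I + A(L)
assert np.array_equal(B.T @ B, 2 * np.eye(m, dtype=int) + AL)
print("lemma verified")
```

Swapping in any other simple graph's edge list leaves the assertion true, in line with the two cases above.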