Algorithmic Graph Theory - perfect matching in bipartite graph when det(A) is not 0


While learning about graphs, I came across a theorem that I don't quite understand, and I can't find a proof.

If $G$ is bipartite and $\det(A) \neq 0,$ then $G$ has a perfect matching (where $A$ is the adjacency matrix of $G$).

P.S.: I found bits of information suggesting the proof may be connected with Edmonds' theorem, which I cannot find, since the Edmonds–Karp algorithm is way more popular on Google ;)


Let the vertices be $[n]=\{1,\dots,n\}.$ The theorem follows from the definition of the determinant: $$\det A=\sum_{\sigma\in S_n}(-1)^\sigma \prod_{i=1}^n a_{i\sigma(i)}$$ where $S_n$ is the set of all permutations of $[n].$ Since each $a_{ij}$ is $0$ or $1$, a nonzero determinant forces at least one term of the sum to be nonzero; that is, there must be a permutation $\sigma$ such that $a_{i\sigma(i)}=1$, i.e. $i$ and $\sigma(i)$ are adjacent, for every $i\in [n].$

Since $G$ is bipartite, every edge $\{i,\sigma(i)\}$ joins the two bipartition sets, so $\sigma$ maps each set into the other. Because $\sigma$ is a bijection, the two sets have the same cardinality, and the restriction of $\sigma$ to one of them gives a perfect matching.
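The argument can be illustrated numerically. Below is a small sketch (not from the answer above; the graph and the brute-force search are my own hypothetical example, and NumPy is assumed): we take the biadjacency matrix of a bipartite graph, check that its determinant is nonzero, and then find a permutation $\sigma$ with $a_{i\sigma(i)}=1$ for all $i$, which is exactly a perfect matching. Note the converse does not hold: terms of the determinant can cancel, so $\det A = 0$ does not rule out a perfect matching.

```python
# Illustrative sketch: det(A) != 0 for a biadjacency matrix A forces a
# permutation sigma with A[i][sigma(i)] == 1 for all i, i.e. a perfect
# matching. The matrix below is a made-up example, not from the answer.
from itertools import permutations
import numpy as np

# Left vertices {0,1,2}, right vertices {0,1,2};
# A[i][j] = 1 iff left vertex i is adjacent to right vertex j.
A = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 0, 1]])

n = len(A)
assert round(np.linalg.det(A)) != 0  # here det(A) = 2, so a matching exists

# Brute-force search mirroring the determinant expansion: since entries are
# 0 or 1, a nonzero determinant guarantees some sigma whose product of
# entries A[i][sigma(i)] is 1, i.e. every pair (i, sigma(i)) is an edge.
matching = next(
    sigma for sigma in permutations(range(n))
    if all(A[i][sigma[i]] == 1 for i in range(n))
)
print(matching)  # -> (0, 1, 2): left vertex i is matched to right vertex matching[i]
```

The brute-force search is exponential and only meant to mirror the sum over $S_n$ in the proof; in practice one would use an augmenting-path or flow algorithm (e.g. Hopcroft–Karp) to find the matching.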