Proof that a bipartite graph has a perfect matching if and only if it contains no zero sub-matrix


How to prove the following:

An $n\times n$ binary matrix contains $n$ ones, no two of which lie in the same row or column, if and only if the matrix (viewed as the adjacency matrix of a bipartite graph) does not contain a zero sub-matrix of dimension $k\times (n-k+1)$ for any $k=1,\dots,n$. The rows and columns of the sub-matrix do not have to be consecutive rows/columns of the original matrix.
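
For example, for $n=3$ the matrix
$$\begin{pmatrix}0&0&1\\0&0&1\\1&1&1\end{pmatrix}$$
contains the $2\times 2$ zero sub-matrix on rows $1,2$ and columns $1,2$ (here $k=2$ and $n-k+1=2$), and indeed no such set of three ones exists: rows $1$ and $2$ can only place their ones in column $3$.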

Best answer:

Hall's marriage theorem says that a bipartite graph with parts $A$ and $B$ of equal size has a perfect matching iff the Hall condition holds:

Hall condition: For every subset $S$ of $A$ with $k$ vertices, the number of vertices in $B$ adjacent to some vertex in $S$ is at least $k$.
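
To make the condition concrete, here is a minimal brute-force sketch in Python, assuming the graph is given as an $n\times n$ binary matrix `M` (a list of lists, rows indexing $A$ and columns indexing $B$); the function name `hall_condition_holds` is my own:

```python
from itertools import combinations

def hall_condition_holds(M):
    # M is an n x n binary matrix; rows index A, columns index B.
    n = len(M)
    for k in range(1, n + 1):
        for rows in combinations(range(n), k):
            # N(S): columns adjacent to at least one chosen row
            neighbors = {j for i in rows for j in range(n) if M[i][j] == 1}
            if len(neighbors) < k:
                return False  # these k rows see fewer than k columns
    return True
```

This checks all $2^n - 1$ nonempty subsets, so it is only meant to illustrate the definition on small instances, not to be efficient.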

Let's say the rows of the matrix correspond to the vertices of $A$, the columns to the vertices of $B$, and entry $(i,j)$ equals $1$ exactly when row $i$ is adjacent to column $j$. A set of $n$ ones with no two in the same row or column is then precisely a perfect matching.

Suppose that the Hall condition fails. This means there exist $k$ rows whose neighbors are contained in at most $k-1$ columns, so at least $n-k+1$ columns are adjacent to none of those rows. The $k\times (n+1-k)$ sub-matrix defined by those $k$ rows and $n+1-k$ of the non-adjacent columns is then filled with zeroes.

Conversely, if you find a $k\times (n+1-k)$ sub-matrix of zeroes, then those $k$ rows are adjacent to at most the remaining $n-(n+1-k)=k-1$ columns, so the Hall condition fails.
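
Since both directions are finite to state, one can sanity-check the equivalence by brute force. Here is a hedged Python sketch (the helper names `has_perfect_matching` and `has_zero_submatrix` are my own) that exhaustively verifies it for all $3\times 3$ binary matrices:

```python
from itertools import combinations, permutations, product

def has_perfect_matching(M):
    # n ones, no two sharing a row or column, correspond to a
    # permutation p with M[i][p[i]] == 1 for every row i.
    n = len(M)
    return any(all(M[i][p[i]] == 1 for i in range(n))
               for p in permutations(range(n)))

def has_zero_submatrix(M):
    # Search for k rows and at least n-k+1 columns whose
    # intersection is entirely zero, for some k = 1, ..., n.
    n = len(M)
    for k in range(1, n + 1):
        for rows in combinations(range(n), k):
            zero_cols = [j for j in range(n)
                         if all(M[i][j] == 0 for i in rows)]
            if len(zero_cols) >= n - k + 1:
                return True
    return False

# Exhaustively check all 2^9 binary 3x3 matrices: a perfect
# matching exists exactly when no such zero sub-matrix exists.
n = 3
for bits in product([0, 1], repeat=n * n):
    M = [list(bits[r * n:(r + 1) * n]) for r in range(n)]
    assert has_perfect_matching(M) != has_zero_submatrix(M)
print("equivalence verified for all 3x3 binary matrices")
```

The `assert` demands that exactly one of the two conditions holds for every matrix, which is exactly the statement being proved.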