Deficient version of Hall's Theorem to show a bipartite graph has a matching


   QUESTION

I am not quite sure how to tackle this question. I understand that, using the deficient version, it suffices to reduce the graph $G$ to a graph $G'$, by adding suitable vertices to one of the vertex classes so that $G'$ satisfies Hall's theorem, and then remove from the resulting matching the edges incident to the added vertices to obtain a matching in $G$. I have tried adding $k$ and $k+1$ vertices to each vertex class, but I do not arrive at the correct answer.


ANSWER


The deficient form of Hall's theorem says that the maximum size of a matching in a bipartite graph with color classes $S$ and $T$ is $$|S| - \max_{X \subseteq S}(|X| - |N_G(X)|), $$

where $N_G(X)$ is the set of neighbours of $X$ in $G$. I've learnt this under the name Ore's theorem (not to be confused with the theorem of the same name on Hamiltonian graphs); it can be proved by adding vertices to $G$ in a certain way and then using Hall's theorem.
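As a sanity check of this defect formula, here is a small Python sketch (the function names and the example graph are my own, not from the question): it computes a maximum matching with a simple augmenting-path matcher and compares it against a brute-force computation of $\max_{X \subseteq S}(|X| - |N_G(X)|)$.

```python
from itertools import combinations

def max_matching(adj, S):
    """Maximum bipartite matching via augmenting paths (Kuhn's algorithm).
    adj maps each vertex of S to a list of its neighbours in T."""
    match = {}  # matched T-vertex -> its S-partner
    def try_augment(s, seen):
        for t in adj[s]:
            if t not in seen:
                seen.add(t)
                # t is free, or its current partner can be re-matched elsewhere
                if t not in match or try_augment(match[t], seen):
                    match[t] = s
                    return True
        return False
    return sum(try_augment(s, set()) for s in S)

def defect(adj, S):
    """max over X subseteq S of |X| - |N_G(X)| (brute force; fine for tiny graphs)."""
    best = 0
    for r in range(len(S) + 1):
        for X in combinations(S, r):
            N = set().union(*(adj[s] for s in X)) if X else set()
            best = max(best, len(X) - len(N))
    return best

# Toy example: S = {0, 1, 2}, T = {'a', 'b'}; vertices 1 and 2 share their only neighbour,
# so the set X = {1, 2} has |X| - |N(X)| = 1 and the matching misses one vertex of S.
adj = {0: ['a'], 1: ['b'], 2: ['b']}
S = [0, 1, 2]
assert max_matching(adj, S) == len(S) - defect(adj, S)  # 2 == 3 - 1
```

The brute-force `defect` is exponential in $|S|$, so it is only meant to illustrate the statement on small instances.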

Using this result and some algebraic manipulation, your task reduces to showing that for every subset $X \subseteq S$ of vertices from one color class, we have $$ |X| \leq \frac{n}{k+1} + |N_G(X)|.$$ One way to derive this is to look at the set of edges going between $X$ and $N_G(X)$. You can use the conditions on the minimum and maximum degrees to bound the size of this set from below by $k|X|$ and from above by $(k+1)|N_G(X)|$; from these two bounds it is not hard to obtain the desired inequality.
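Spelled out, assuming (as in the question's hypotheses) minimum degree at least $k$ on the $S$ side and maximum degree at most $k+1$, the edge-counting chain is:

$$ k|X| \;\leq\; e\bigl(X, N_G(X)\bigr) \;\leq\; (k+1)\,|N_G(X)|, $$

so $|N_G(X)| \geq \frac{k}{k+1}|X|$, and therefore

$$ |X| - |N_G(X)| \;\leq\; |X| - \frac{k}{k+1}|X| \;=\; \frac{|X|}{k+1} \;\leq\; \frac{n}{k+1}, $$

since $|X| \leq |S| \leq n$. Rearranging gives the required $|X| \leq \frac{n}{k+1} + |N_G(X)|$.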