I'm trying to disprove the correctness of the greedy algorithm below, which attempts to compute a maximum matching in a bipartite graph, but I'm unable to come up with a counter-example.
1. Find an edge $(u,v)$ such that $u$ is an unmatched vertex of minimum degree and $v$ is an unmatched neighbor of $u$ of minimum degree.
2. Add $(u,v)$ to the matching $M$ and remove it from $G$. Mark $u$ and $v$ as matched.
3. Repeat steps 1 and 2 until no edge has both endpoints unmatched.
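For concreteness, here is a short sketch of how I understand the algorithm, in Python (my own illustrative implementation; ties in degree are broken arbitrarily, since the description doesn't specify a rule):

```python
from collections import defaultdict

def greedy_matching(edges):
    """Sketch of the greedy above: repeatedly match an unmatched vertex u of
    minimum degree to an unmatched neighbor v of minimum degree, remove the
    chosen edge from G, and stop when no edge joins two unmatched vertices."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    matched, matching = set(), []
    while True:
        # unmatched vertices that still have at least one unmatched neighbor
        candidates = [x for x in adj if x not in matched
                      and any(y not in matched for y in adj[x])]
        if not candidates:
            break
        u = min(candidates, key=lambda x: len(adj[x]))   # min-degree unmatched vertex
        v = min((y for y in adj[u] if y not in matched),
                key=lambda y: len(adj[y]))               # min-degree unmatched neighbor
        matching.append((u, v))
        matched.update((u, v))
        adj[u].discard(v)                                # remove (u, v) from G
        adj[v].discard(u)
    return matching
```

On a path $a$–$b$–$c$–$d$, for instance, it picks $(a,b)$ and then $(d,c)$, which happens to be a perfect matching.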
Any help would be appreciated. The other posts related to bipartite matching use a different greedy algorithm from the one above.
Start with an example where you have a perfect matching, but which also has some wrong edge that will ruin the perfect matching if you pick it. For example, a graph that looks like the picture below.
We want the following things to happen:

- the endpoints of that wrong edge have strictly smaller degree than every other vertex, so the greedy algorithm is forced to pick it first;
- once that edge is picked, the remaining graph no longer has a perfect matching.
So just add enough edges (carefully) to this graph that the diagonal edge in the middle is the first edge picked by the greedy algorithm, and you'll have the counter-example you want.
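If it helps to sanity-check a candidate, here is one possible concrete graph built along these lines (the vertex names and edge set are my own illustration, not the only option), compared against a maximum matching found by Kuhn's augmenting-path algorithm. The "wrong" edge is $(a,b)$: both of its endpoints have degree 2 while every other vertex has degree at least 3, so the greedy matches $a$ to $b$, which strands $d$ and leaves four left vertices competing for three right vertices.

```python
from collections import defaultdict

def greedy_matching(edges):
    # The greedy from the question: pick a minimum-degree unmatched vertex u,
    # then a minimum-degree unmatched neighbor v; match them; repeat.
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    matched, matching = set(), []
    while True:
        cand = [x for x in adj if x not in matched
                and any(y not in matched for y in adj[x])]
        if not cand:
            break
        u = min(cand, key=lambda x: len(adj[x]))
        v = min((y for y in adj[u] if y not in matched),
                key=lambda y: len(adj[y]))
        matching.append((u, v))
        matched.update((u, v))
        adj[u].discard(v)
        adj[v].discard(u)
    return matching

def max_matching_size(left, edges):
    # Kuhn's augmenting-path algorithm for bipartite maximum matching.
    adj = defaultdict(list)
    for u, v in edges:
        if u in left:
            adj[u].append(v)
        else:
            adj[v].append(u)
    match = {}  # right vertex -> left vertex it is matched to

    def augment(u, seen):
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                if v not in match or augment(match[v], seen):
                    match[v] = u
                    return True
        return False

    return sum(augment(u, set()) for u in left)

# One possible counter-example (illustrative names).  The wrong edge is
# (a, b): picking it strands d, because d's other neighbors k and m get
# claimed by the degree-2 vertices p and q, and c, e, g, i are then four
# left vertices whose only unmatched neighbors are the three vertices
# f, h, j.  A perfect matching does exist: a-d, b-c, e-f, g-h, i-j, k-p, m-q.
edges = [("a", "b"), ("a", "d"), ("b", "c"),
         ("c", "f"), ("c", "h"),
         ("e", "f"), ("e", "h"), ("e", "j"),
         ("g", "f"), ("g", "h"), ("g", "j"),
         ("i", "f"), ("i", "h"), ("i", "j"),
         ("d", "k"), ("d", "m"),
         ("k", "p"), ("k", "q"), ("m", "p"), ("m", "q")]
left = {"a", "c", "e", "g", "i", "k", "m"}

greedy = len(greedy_matching(edges))
best = max_matching_size(left, edges)
print(greedy, best)  # greedy finds 6 edges, but the maximum matching has 7
```

However the degree ties among $p$, $q$ and their neighbors are broken, every run of the greedy ends with 6 edges here, one short of the perfect matching.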