The optimal value function over all doubly stochastic matrices

Let $I = J = \{1,\dots,n\}$. Define set $X \subset \Bbb R^{I \times J}$ as all $n \times n$ doubly stochastic matrices $x = (x_{ij})$ satisfying

$$\sum_{j=1}^n x_{ij} = 1, \quad \forall i $$ $$\sum_{i=1}^n x_{ij} = 1, \quad \forall j $$ $$x_{ij} \geq 0, \quad \forall i,j$$

For each $S\subseteq I\times J$, define the optimal value

$$ V(S)=\max_{x\in X}\sum_{(i,j)\in S} x_{ij}$$

From Birkhoff's theorem, we know that an optimal solution is attained at some permutation matrix. What does $V(S) $ look like (i.e., an explicit formula to compute $V(S)$) and is there any useful property for $V$?

Are you familiar with bipartite matching?

You can represent the problem by drawing two columns of $n$ points and connecting each point in one column to each point in the other column by an edge. In a matching, you select edges so that no two of them share an endpoint. If $x_{ij}$ is an indicator (0/1) variable recording whether edge $\{ i,j \}$ is selected, then your first two conditions say exactly that the selected edges have no endpoints in common.

Any permutation $\sigma$ translates to a perfect matching by setting $x_{i,\sigma (i)}=1$ and the rest of the variables to 0.
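As a minimal sketch of this correspondence (the function name and list encoding of $\sigma$ are my own choices, not from the question):

```python
def permutation_matrix(sigma):
    """0/1 matrix of a permutation, given as a list with sigma[i] = j."""
    n = len(sigma)
    x = [[0] * n for _ in range(n)]
    for i, j in enumerate(sigma):
        x[i][j] = 1  # select edge {i, sigma(i)}
    return x
```

Every row and every column of the resulting matrix sums to 1, so it is doubly stochastic, and the selected edges form a perfect matching.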

Your set $S$ can be interpreted as a subset of edges. This amounts to working in a bipartite graph that isn't necessarily complete. In an optimal solution to your problem you may assume $x_{ij}=0$ whenever $\{ i,j \}$ isn't in $S$: decreasing those variables to $0$ violates no constraint and leaves the objective value unchanged.

So $V(S)$ is the size of a maximum matching in this bipartite graph. Maximum matchings can be computed with a Ford-Fulkerson-type algorithm using augmenting paths.
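To make this concrete, here is a short augmenting-path implementation (Kuhn's algorithm) computing $V(S)$; the function name and the encoding of $S$ as a set of index pairs are my own conventions, not part of the question:

```python
def max_matching(n, S):
    """V(S): size of a maximum matching in the bipartite graph with
    rows I = {0,...,n-1}, columns J = {0,...,n-1}, and edge set S.

    n -- size of each side
    S -- iterable of allowed pairs (i, j)
    """
    # adjacency list: for each row i, the columns j with (i, j) in S
    adj = [[] for _ in range(n)]
    for i, j in S:
        adj[i].append(j)

    match_col = [-1] * n  # match_col[j] = row matched to column j, or -1

    def try_augment(i, visited):
        # Try to match row i, possibly re-routing already-matched rows
        # along an augmenting path.
        for j in adj[i]:
            if j not in visited:
                visited.add(j)
                if match_col[j] == -1 or try_augment(match_col[j], visited):
                    match_col[j] = i
                    return True
        return False

    return sum(try_augment(i, set()) for i in range(n))
```

For example, with $n=2$ and $S=\{(0,0),(0,1),(1,0)\}$ the maximum matching pairs row 0 with column 1 and row 1 with column 0, so $V(S)=2$, while $S=\{(0,0),(1,0)\}$ gives $V(S)=1$ since both rows compete for column 0.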