Matrix optimization with Lagrange multipliers, keeping row and column sums constant.


I have a question regarding the "classic" Lagrange optimization in a special matrix case:

If I need to optimize an objective function with respect to a $K\times N$ matrix $X$ such that the row and column sums of $X$ remain unchanged, how do I formulate that as a Lagrange optimization problem?

The problem can be written as follows:

$$\min_{x_{ki}\ge0} \quad f(X)$$

s.t.

$$h_i = \sum_k x_{ki} - s_i = 0 \quad \forall i$$

$$h_k = \sum_i x_{ki} - t_k = 0 \quad \forall k$$

where $x_{ki}$ are the corresponding elements of $X$.

In order for the Karush-Kuhn-Tucker conditions to apply, the constraint gradients need to be linearly independent (the LICQ constraint qualification). That is clearly not the case here: the $K$th row sum is determined by the column sums and the other $K-1$ row sums, since the constraints force $\sum_i s_i = \sum_k t_k$.
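The rank deficiency can be checked numerically. The sketch below (a small assumed example with $K=3$, $N=4$) stacks the gradients of all $K+N$ equality constraints into a Jacobian and confirms its rank is only $K+N-1$:

```python
import numpy as np

K, N = 3, 4  # small example dimensions (assumed for illustration)

# Each constraint is linear in the entries of X (flattened row-major),
# so its gradient is a fixed 0/1 vector of length K*N.
A = np.zeros((N + K, K * N))
for i in range(N):                        # column-sum constraints h_i
    for k in range(K):
        A[i, k * N + i] = 1.0
for k in range(K):                        # row-sum constraints h_k
    A[N + k, k * N:(k + 1) * N] = 1.0

# The sum of the column-sum rows equals the sum of the row-sum rows
# (both give the all-ones vector), hence one constraint is redundant.
print(np.linalg.matrix_rank(A))  # K + N - 1 = 6, not K + N = 7
```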

So I am wondering: is it enough to remove one row-sum (or column-sum) constraint and then formulate the problem as usual, i.e. $L(X, \mu, \lambda) = f(X) + \sum_{i=1}^N \mu_i h_i + \sum_{k=1}^{K-1}\lambda_k h_k$?
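For what it's worth, this reduced formulation can be tried out with a generic solver. The sketch below is only an illustration under assumed data: it takes $f(X) = \sum_{k,i} x_{ki}^2$ as a stand-in objective, imposes all $N$ column sums but only $K-1$ row sums, and lets `scipy.optimize.minimize` (SLSQP) handle the $x_{ki}\ge 0$ bounds; the dropped row-sum constraint comes out satisfied automatically.

```python
import numpy as np
from scipy.optimize import minimize

K, N = 2, 3                       # assumed small example
t = np.array([4.0, 5.0])          # row sums t_k (assumed data)
s = np.array([3.0, 3.0, 3.0])     # column sums s_i; note sum(s) == sum(t)

def f(x):
    # Stand-in objective: squared Frobenius norm of X.
    return np.sum(x ** 2)

# Equality constraints: all N column sums, but only K-1 row sums,
# since the last row sum is implied by the others.
cons = []
for i in range(N):
    cons.append({"type": "eq",
                 "fun": lambda x, i=i: x.reshape(K, N)[:, i].sum() - s[i]})
for k in range(K - 1):
    cons.append({"type": "eq",
                 "fun": lambda x, k=k: x.reshape(K, N)[k, :].sum() - t[k]})

x0 = np.outer(t, s).ravel() / t.sum()   # feasible start (rank-1 table)
res = minimize(f, x0, bounds=[(0, None)] * (K * N), constraints=cons)
X = res.x.reshape(K, N)

# Even though the K-th row-sum constraint was dropped, X.sum(axis=1)
# still matches t, because it is implied by the remaining constraints.
```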