Doing a Charnes-Cooper transformation with matrices and a zero-one constraint


I need to solve an assignment problem with the following objective function: $${\max} \frac{\displaystyle\sum_{i=1}^m\sum_{j=1}^n h_{ij}\cdot x_{ij}}{\displaystyle\sum_{i=1}^m\sum_{j=1}^n c_{ij}\cdot x_{ij}}$$ Since this is a linear-fractional problem, I want to use the Charnes-Cooper transformation (https://en.wikipedia.org/wiki/Linear-fractional_programming) to obtain a linear problem.
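To make the objective concrete, here is a minimal brute-force sketch that evaluates the ratio over all permutation matrices of a small square instance. The $3\times 3$ data for $h$ and $c$ is entirely hypothetical, chosen only for illustration:

```python
from itertools import permutations

# Hypothetical 3x3 data: h[i][j] = "gain", c[i][j] = "cost" of assigning i to j.
h = [[4, 1, 3],
     [2, 0, 5],
     [3, 2, 2]]
c = [[2, 3, 1],
     [3, 1, 2],
     [1, 2, 3]]
n = 3

def ratio(perm):
    """Fractional objective for the assignment i -> perm[i]."""
    num = sum(h[i][perm[i]] for i in range(n))
    den = sum(c[i][perm[i]] for i in range(n))
    return num / den

best = max(permutations(range(n)), key=ratio)
print(best, ratio(best))  # -> (2, 1, 0) 2.0
```

Brute force is of course only viable for tiny instances; the point of the transformation below is to avoid it.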

My first question: can I apply the transformation even though my decision variables form a matrix instead of a vector? Second, $\alpha$ and $\beta$ would in my case both be zero and disappear, right? If I'm not missing something, the constraint $$ \mathbf{d}^{T} \mathbf{y}+\beta t=1 $$ introduced by the transformation then also disappears (or becomes trivial), because with $\beta = 0$ and $$ \mathbf{y}=\frac{1}{\mathbf{d}^{T} \mathbf{x}+\beta} \cdot \mathbf{x} $$ it reduces to $$ \frac{\mathbf{d}^{T} \mathbf{x}}{\mathbf{d}^{T} \mathbf{x}} = 1. $$ Correct?

Additionally, as stated in the title, I've got the following zero-one constraint: $$ x_{i j} \in\{0,1\} $$ which after the Charnes-Cooper transformation becomes $$ y_{i j} \in\{0, t\} $$ because $$ \mathbf{x}=\frac{1}{t} \mathbf{y} $$
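The substitution can be checked numerically. The sketch below takes a hypothetical 0/1 assignment matrix $x$ and hypothetical costs $c$, sets $t = 1/\sum_{ij} c_{ij} x_{ij}$ and $y = t\,x$, and confirms that every $y_{ij}$ lands in $\{0, t\}$ while the transformed denominator equals 1:

```python
# Hypothetical 0/1 assignment and cost data for illustration.
x = [[0, 0, 1],
     [0, 1, 0],
     [1, 0, 0]]
c = [[2, 3, 1],
     [3, 1, 2],
     [1, 2, 3]]

den = sum(c[i][j] * x[i][j] for i in range(3) for j in range(3))
t = 1 / den
y = [[t * x[i][j] for j in range(3)] for i in range(3)]

# Every y_ij is either 0 or t, and the transformed denominator equals 1.
assert all(y[i][j] in (0, t) for i in range(3) for j in range(3))
assert abs(sum(c[i][j] * y[i][j] for i in range(3) for j in range(3)) - 1) < 1e-12
print(t)  # t = 1/3 for this data
```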

Will I get into trouble solving this because my zero-one constraint now contains a variable instead of just the simple values 0 and 1?

Best answer

The shape of the decision variables (matrix versus vector) does not matter. Yes, $\alpha=\beta=0$ in your case.

The idea of the transformation is that you multiply both numerator and denominator by a variable $t$ so that the denominator becomes 1. You want to $$\text{maximize $\sum_{i,j} h_{i,j}\cdot t\cdot x_{i,j}$ subject to $\sum_{i,j} c_{i,j}\cdot t\cdot x_{i,j} = 1$}.$$

Now introduce $y_{i,j} = t\cdot x_{i,j}$ to linearize both the objective and the constraint: $$\text{maximize $\sum_{i,j} h_{i,j}\cdot y_{i,j}$ subject to $\sum_{i,j} c_{i,j}\cdot y_{i,j} = 1$}$$

Finally, linearize the relationship between $y$ and $x$. We want to enforce: $$y_{i,j}=\begin{cases}t &\text{if $x_{i,j} = 1$}\\0 &\text{if $x_{i,j} = 0$}\end{cases}$$ You can do that as follows, where $M$ is an upper bound on $t$, for example $M=1/\min_{i,j} \{c_{i,j}\}$: \begin{align} 0 \le y_{i,j} &\le M x_{i,j}\\ y_{i,j} - t &\le M (1 - x_{i,j})\\ y_{i,j} - t &\ge -M (1 - x_{i,j}) \end{align}

You mentioned "assignment problem," so I assume you also have linear constraints like this: $$\sum_j x_{i,j} = 1 \quad \text{for all $i$}$$ (or maybe with $i$ and $j$ reversed). In that case, you can use a compact linearization instead: \begin{align} 0 \le y_{i,j} &\le M x_{i,j}\\ \sum_j y_{i,j} &= t \end{align}
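As a quick sanity check of these constraints, the following sketch (on hypothetical $h$ and $c$ data) verifies that for every 0/1 assignment matrix $x$, the intended point $t = 1/\sum_{ij} c_{ij} x_{ij}$, $y = t\,x$ is feasible for both the big-$M$ inequalities with $M = 1/\min_{i,j} c_{i,j}$ and the compact linearization $\sum_j y_{i,j} = t$:

```python
from itertools import permutations

# Hypothetical instance data for illustration.
h = [[4, 1, 3], [2, 0, 5], [3, 2, 2]]
c = [[2, 3, 1], [3, 1, 2], [1, 2, 3]]
n = 3
M = 1 / min(min(row) for row in c)  # upper bound on t
EPS = 1e-12                          # tolerance for float comparisons

for perm in permutations(range(n)):
    # Build the permutation (assignment) matrix x for i -> perm[i].
    x = [[1 if perm[i] == j else 0 for j in range(n)] for i in range(n)]
    t = 1 / sum(c[i][j] * x[i][j] for i in range(n) for j in range(n))
    y = [[t * x[i][j] for j in range(n)] for i in range(n)]
    for i in range(n):
        for j in range(n):
            assert -EPS <= y[i][j] <= M * x[i][j] + EPS          # 0 <= y <= M x
            assert abs(y[i][j] - t) <= M * (1 - x[i][j]) + EPS   # |y - t| <= M (1 - x)
        assert abs(sum(y[i]) - t) <= EPS                         # compact: sum_j y_ij = t
print("all assignments feasible under both linearizations")
```

This only confirms that the linearizations admit the intended points; in practice the resulting MILP would be handed to a solver with the objective $\max \sum_{i,j} h_{i,j} y_{i,j}$.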