Can I use slack variable to transform my optimization problem?


I have an optimization problem, as follows, which is non-convex and hard to solve: \begin{equation*} \begin{aligned} &\displaystyle \min_{\substack{\boldsymbol{\{x_i^k\}}}} && \sum\limits_{i \in \mathcal{M}}\sum\limits_{k\in\mathcal{N}}x_i^k\\ &\text{subject to} &&C1:~~\sum\limits_{i \in \mathcal{M}}\sum\limits_{k\in\mathcal{N}}\log_2(1+\dfrac{x_i^k }{\sum\limits_{j \in \mathcal{M}, j\neq i}a_j^k+\sigma^2})\leq A,\\ & &&C2:~~\sum\limits_{k\in\mathcal{N}}\log_2(1+\dfrac{x_i^k }{\sum\limits_{j \in \mathcal{M}, j\neq i}a_j^k+\sigma^2})\geq B_i,~~~\forall i \in \mathcal{M},\\ & &&C3:~~\sum\limits_{k\in\mathcal{N}} x_i^k \leq \overline{x}_i,~~~\forall i \in \mathcal{M}. \end{aligned} \end{equation*}

The constraint C1 makes the optimization problem non-convex. To solve it, I introduce a slack variable $\zeta_i^k$ to denote $\log_2(1+\dfrac{x_i^k }{\sum\limits_{j \in \mathcal{M}, j\neq i}a_j^k+\sigma^2})$. Since $\log_2(1+\dfrac{x_i^k }{\sum\limits_{j \in \mathcal{M}, j\neq i}a_j^k+\sigma^2})$ does not appear in the objective function, can I use the slack variables $\zeta_i^k$ to transform the problem as follows? \begin{equation*} \begin{aligned} &\displaystyle \min_{\substack{\boldsymbol{\{x_i^k,\zeta_i^k\}}}} && \sum\limits_{i \in \mathcal{M}}\sum\limits_{k\in\mathcal{N}}x_i^k\\ &\text{subject to} &&C1:~~\log_2(1+\dfrac{x_i^k }{\sum\limits_{j \in \mathcal{M}, j\neq i}a_j^k+\sigma^2})\geq \zeta_i^k,~~~\forall i \in \mathcal{M},~~~\forall k \in \mathcal{N},\\ & &&C2:~~\sum\limits_{i \in \mathcal{M}}\sum\limits_{k\in\mathcal{N}}\zeta_i^k\leq A,\\ & &&C3:~~\sum\limits_{k\in\mathcal{N}}\zeta_i^k\geq B_i,~~~\forall i \in \mathcal{M},\\ & &&C4:~~\sum\limits_{k\in\mathcal{N}} x_i^k \leq \overline{x}_i,~~~\forall i \in \mathcal{M},\\ & &&C5:~~x_i^k , \zeta_i^k \geq 0,~~~\forall i \in \mathcal{M},~~~\forall k \in \mathcal{N}. \end{aligned} \end{equation*}
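One sanity check on the proposed transformation: setting $\zeta_i^k$ equal to the actual rate $\log_2(1+\dfrac{x_i^k}{\sum_{j\neq i}a_j^k+\sigma^2})$ makes the new C1 tight, so any point feasible for the original problem remains feasible for the reformulation. A minimal numerical sketch (the instance numbers below are hypothetical, chosen only for illustration):

```python
import math

def rate(x, interference, sigma2=1.0):
    # r_i^k = log2(1 + x_i^k / (sum_{j != i} a_j^k + sigma^2))
    return math.log2(1.0 + x / (interference + sigma2))

# Toy instance (hypothetical numbers): M = 2 users, N = 2 subcarriers.
x = [[0.4, 0.6], [0.5, 0.3]]    # x_i^k
a = [[0.2, 0.1], [0.3, 0.2]]    # a_j^k (nonnegative interference terms)
A = 3.0
B = [0.5, 0.4]
sigma2 = 1.0

# Choose zeta_i^k equal to the achieved rate, so the reformulated C1
# holds with equality.
zeta = [[rate(x[i][k], sum(a[j][k] for j in range(2) if j != i), sigma2)
         for k in range(2)] for i in range(2)]

c2 = sum(map(sum, zeta)) <= A                      # reformulated C2
c3 = all(sum(zeta[i]) >= B[i] for i in range(2))   # reformulated C3
```

With this choice of `zeta`, both `c2` and `c3` hold whenever the original C1 and C2 held for `x`, which is the direction of the transformation the question relies on.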




Suppose you solve the original problem, but without constraint C1. That problem is convex. Moreover, at the optimal solution every constraint in C2 holds with equality; otherwise we could decrease some $x_i^k$ and still have a feasible point with a better objective. (Here I used the fact that the $a_j^k$ are nonnegative.)

Now this solution either satisfies C1 or it does not, but that has nothing to do with $x$: it depends only on whether $\sum_{i} B_i \leq A$ holds.
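The observation above can be checked numerically. As a minimal sketch, assume (for simplicity, and purely as an illustrative special case) one subcarrier per user, with $c_i = \sum_{j\neq i} a_j + \sigma^2$ the fixed interference-plus-noise term of user $i$. The binding C2 constraints then give $x_i = c_i(2^{B_i}-1)$ in closed form, and the achieved total rate equals $\sum_i B_i$ exactly, so C1 of the original problem holds iff $\sum_i B_i \leq A$, regardless of $x$:

```python
import math

def min_power_single_carrier(B, c):
    # At the optimum each C2 constraint binds: log2(1 + x_i/c_i) = B_i,
    # hence x_i = c_i * (2**B_i - 1).
    return [ci * (2.0 ** Bi - 1.0) for Bi, ci in zip(B, c)]

B = [1.0, 2.0]   # per-user rate demands B_i (hypothetical numbers)
c = [1.0, 0.5]   # interference-plus-noise terms c_i (hypothetical numbers)
x = min_power_single_carrier(B, c)

# The achieved total rate equals sum(B) exactly, so the original C1 is
# satisfied if and only if sum(B) <= A, independently of x.
total_rate = sum(math.log2(1.0 + xi / ci) for xi, ci in zip(x, c))
```

Here `total_rate` comes out to `sum(B)` up to floating-point error, which is the answer's point: feasibility of C1 at this optimum is decided entirely by the data $A$ and $B_i$, not by the optimizer's choice of $x$.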