How to "convexify" a non-convex function?


In the following paper

there is a non-convex function [figure 1]. Does that mean I can't find optimal values of $p, q, \alpha, R$ that maximize $\sum R_k$?

The authors then change the variables $p, q$ to $m, s$ and state that the non-convex problem becomes convex, after which they use Lagrange multipliers to find the optimal values. So is it the use of Lagrange multipliers that makes the problem convex? I don't see how merely changing two variables can transform the function from non-convex to convex.


(1) This function is non-convex; $R_k$ is a function of $p, q, \alpha$, and $R = \{R_k\}$.


(2) This function is convex, with $m = \{\alpha_k q_k/2\}$ and $s = \{\alpha_0 p_0, \alpha_k q_k/2\}$; (15b)–(15f) are the constraints.

Best Answer

It seems that they are correct. The simplest form of the problem can be thought of as $$ \max_{\alpha, p} \; \alpha \log(1+p) \quad \text{s.t.} \quad \alpha p \le 1, \quad \alpha, p \ge 0,$$ and they make the change of variable $m = \alpha p$ to obtain $$ \max_{\alpha, m} \; \alpha \log(1+m/\alpha) \quad \text{s.t.} \quad m \le 1, \quad \alpha, m \ge 0.$$ The constraints of this problem are clearly convex (affine inequalities). It only remains to verify that $f(\alpha,m) = \alpha \log(1+m/\alpha)$ is jointly concave in its variables. This is easy to verify using the Hessian $$ \nabla^2 f(\alpha,m) = \frac{1}{(\alpha + m)^2} \begin{pmatrix} -m^2/\alpha & m \\ m & - \alpha \end{pmatrix}. $$ Over $C = \{(\alpha,m) \mid \alpha, m > 0\}$ the matrix above has nonpositive diagonal entries and determinant $0$, so both of its eigenvalues are nonpositive. Hence $\nabla^2 f(\alpha,m)$ is negative semidefinite over $C$, i.e., $f$ is concave over $C$ (indeed, $f$ is the perspective of the concave function $p \mapsto \log(1+p)$, and the perspective operation preserves concavity). Since maximizing a concave function over a convex set is a convex optimization problem, the second problem is convex.
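The Hessian above can be double-checked symbolically. Here is a minimal sketch using SymPy (not part of the original answer), assuming $\alpha, m > 0$; it verifies the closed form, that the determinant is identically zero, and that the trace is nonpositive, which together give negative semidefiniteness for a symmetric $2\times 2$ matrix:

```python
# Symbolic sanity check of the Hessian of f(alpha, m) = alpha*log(1 + m/alpha),
# assuming alpha, m > 0.
import sympy as sp

alpha, m = sp.symbols('alpha m', positive=True)
f = alpha * sp.log(1 + m / alpha)
H = sp.hessian(f, (alpha, m))

# Closed form claimed above: 1/(alpha+m)^2 * [[-m^2/alpha, m], [m, -alpha]]
expected = sp.Matrix([[-m**2 / alpha, m], [m, -alpha]]) / (alpha + m)**2
hessian_matches = all(sp.simplify(e) == 0 for e in (H - expected))

# Determinant is identically zero ...
det_is_zero = sp.simplify(H.det()) == 0
# ... and the trace equals -(alpha^2 + m^2) / (alpha*(alpha + m)^2) < 0.
trace_ok = sp.simplify(H.trace() + (alpha**2 + m**2) / (alpha * (alpha + m)**2)) == 0

print(hessian_matches, det_is_zero, trace_ok)
```

Since the determinant (the product of the two eigenvalues) is zero and the trace (their sum) is negative, one eigenvalue is zero and the other is negative.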

Here is a plot of the function $f$: [plot of $f(\alpha, m)$]

Extension. What they in fact have is more like $$ \max_{\alpha, p, \beta, q} \; \min\{\alpha \log(1+p),\; \beta\log (1+q)\} \quad \text{s.t.} \quad \alpha p \le 1, \quad \beta q \le 1, \quad \alpha, p, \beta, q \ge 0,$$ which they turn into $$ \max_{\alpha, m, \beta, s} \; \min\{\alpha \log(1+m/\alpha),\; \beta\log (1+s/\beta)\} \quad \text{s.t.} \quad m \le 1, \quad s \le 1, \quad \alpha, m, \beta, s \ge 0.$$ The convexity of the second problem follows from the argument above together with the fact that the pointwise minimum of a bunch of concave functions is again concave.
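As a quick numerical spot-check of this claim (a sketch, not from the paper), one can test midpoint concavity of the transformed objective at random pairs of points with all variables positive:

```python
# Numerical spot-check: the objective
#   min(alpha*log(1 + m/alpha), beta*log(1 + s/beta))
# is midpoint-concave over positive variables, consistent with the fact
# that a pointwise minimum of concave functions is concave.
import math
import random

def obj(v):
    alpha, m, beta, s = v
    return min(alpha * math.log(1 + m / alpha),
               beta * math.log(1 + s / beta))

random.seed(1)
violations = 0
for _ in range(10_000):
    u = [random.uniform(0.01, 5.0) for _ in range(4)]
    w = [random.uniform(0.01, 5.0) for _ in range(4)]
    mid = obj([(a + b) / 2 for a, b in zip(u, w)])
    # Concavity requires f(midpoint) >= average of endpoint values.
    if mid < (obj(u) + obj(w)) / 2 - 1e-12:  # small tolerance for float error
        violations += 1

print(violations)  # 0: no midpoint-concavity violations found
```

A random check like this cannot prove concavity, but a single violation would disprove it; none occurs here.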

Update. Here is another trick they are using. Suppose you want to solve the following problem: $$ (P) \quad \max_x \sum_{k=1}^n \min(g_{1,k}(x),g_{2,k}(x)). $$ Let $g_k(x) =\min(g_{1,k}(x),g_{2,k}(x))$. Assuming that $g_{1,k}$ and $g_{2,k}$ are concave functions, so is $g_k$ (check!), and hence the problem above is a convex problem. We can write it in epigraph form: $$ (P') \quad \max_{x, t_1,\dots,t_n} \; \sum_{k=1}^n t_k \quad \text{s.t.} \quad t_k \le \min(g_{1,k}(x),g_{2,k}(x)), \;\text{for all $k=1,\dots,n$}. $$ To see that this is equivalent to the original problem, first optimize over $t_1,\dots,t_n$ (and then over $x$). Maximizing over $t_k$ pushes it up to its upper bound, $t_k = \min(g_{1,k}(x),g_{2,k}(x))$, giving back the original problem $(P)$.

Now, we can equivalently write $(P')$ as \begin{align*} (P'') \quad \quad &\max_{x, t_1,\dots,t_n} \; \sum_{k=1}^n t_k \quad \text{s.t.} \\ &\qquad t_k \le g_{1,k}(x), \quad t_k \le g_{2,k}(x), \;\text{for all $k=1,\dots,n$}. \end{align*} This problem is equivalent to the original problem: if $x^*,t_1^*,\dots,t_n^*$ is an optimal solution of $(P'')$, then $x^*$ is an optimal solution of $(P)$, and we have $t^*_k = \min(g_{1,k}(x^*),g_{2,k}(x^*)) = g_k(x^*)$.
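To make the epigraph trick concrete, here is a small numerical illustration (a sketch with made-up concave functions $g_{1,k}, g_{2,k}$, not the paper's): for a fixed $x$, the inner maximization over $t_1,\dots,t_n$ in $(P'')$ is a linear program whose optimum sets each $t_k$ to $\min(g_{1,k}(x), g_{2,k}(x))$.

```python
# For fixed x, the inner maximization of (P'') over t is a linear program:
#   max sum_k t_k   s.t.  t_k <= g1[k], t_k <= g2[k].
# Its optimum is t_k = min(g1[k], g2[k]), recovering the objective of (P).
# g1 and g2 below are arbitrary concave examples evaluated at x (assumptions).
import math
import numpy as np
from scipy.optimize import linprog

x = 0.7
n = 3
g1 = [math.log(1 + (k + 1) * x) for k in range(n)]  # g_{1,k}(x)
g2 = [math.sqrt((n - k) * x) for k in range(n)]     # g_{2,k}(x)

c = [-1.0] * n                            # maximize sum t_k == minimize -sum t_k
A_ub = np.vstack([np.eye(n), np.eye(n)])  # rows encode t_k <= g1[k], t_k <= g2[k]
b_ub = np.array(g1 + g2)
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * n)

t_star = res.x
t_expected = [min(a, b) for a, b in zip(g1, g2)]
print(np.allclose(t_star, t_expected))  # True
```

The LP pushes each $t_k$ to its tightest upper bound, which is exactly the pointwise minimum, so the two formulations agree.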