Is this a convex optimization problem? How to solve it?


I have the following optimization problem in the variables $\alpha, \beta$ with $0 < \alpha < 1$ and $0 < \beta < 1$.

$$\text{minimize} \quad \frac{\alpha\left(2^{R/\alpha}-1\right)}{K_1\beta} + \frac{(1-\alpha)\left(2^{R/(1-\alpha)}-1\right)}{K_2(1-\beta)}$$

where $K_1 > 0$, $K_2 > 0$, and $R > 0$. For example, $K_1 = 1000$, $K_2 = 8000$, $R = 2$.

Is this a convex optimization problem? How can I verify convexity, and how can I solve the problem?

There is 1 answer below.

What have you tried? If you plot it, it definitely appears strictly convex with a minimum in the interior.
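A quick way to see this yourself, without plotting software, is to scan the objective on a grid over $(0,1)^2$ and check where the smallest value lands. The sketch below (illustrative only, not part of the original answer; it is in Python rather than MATLAB) shows the minimum sits well inside the box:

```python
# Hypothetical grid scan of the objective over (0,1)^2.
K1, K2, R = 1e3, 8e3, 2.0

def f(a, b):
    return (a * (2**(R / a) - 1)) / (K1 * b) \
         + ((1 - a) * (2**(R / (1 - a)) - 1)) / (K2 * (1 - b))

n = 99
# Smallest sampled value and its (alpha, beta) location on a 0.01 grid.
best = min((f(i / 100, j / 100), i / 100, j / 100)
           for i in range(1, n + 1) for j in range(1, n + 1))
print(best)
```

The argmin lands around $(\alpha, \beta) \approx (0.58, 0.67)$, far from the boundary, consistent with an interior minimum.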

The standard approach to prove strict convexity would be to derive the Hessian and show that it is positive definite on the feasible region.
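Deriving the Hessian symbolically is tedious here, but a numerical spot-check is easy. The sketch below (a heuristic check, not a proof, and not from the original answer) approximates the $2\times 2$ Hessian by central finite differences at many random interior points and verifies positive definiteness via the leading principal minors ($f_{\alpha\alpha} > 0$ and $\det H > 0$):

```python
# Numerical spot-check of strict convexity at random interior points.
import random

K1, K2, R = 1e3, 8e3, 2.0

def f(a, b):
    return (a * (2**(R / a) - 1)) / (K1 * b) \
         + ((1 - a) * (2**(R / (1 - a)) - 1)) / (K2 * (1 - b))

def hessian(a, b, h=1e-5):
    # Central finite-difference approximations of the second derivatives.
    faa = (f(a + h, b) - 2 * f(a, b) + f(a - h, b)) / h**2
    fbb = (f(a, b + h) - 2 * f(a, b) + f(a, b - h)) / h**2
    fab = (f(a + h, b + h) - f(a + h, b - h)
           - f(a - h, b + h) + f(a - h, b - h)) / (4 * h**2)
    return faa, fab, fbb

random.seed(0)
for _ in range(1000):
    a, b = random.uniform(0.05, 0.95), random.uniform(0.05, 0.95)
    faa, fab, fbb = hessian(a, b)
    # 2x2 matrix is positive definite iff both leading minors are positive.
    assert faa > 0 and faa * fbb - fab**2 > 0
print("Hessian positive definite at all sampled points")
```

This is consistent with strict convexity; a full proof would check the sign conditions analytically over the whole open square.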

Just about any standard solver strategy should work. I tested it in MATLAB with the modelling toolbox YALMIP (disclaimer: developed by me), using the standard nonlinear solver fmincon, and it easily finds the optimal solution:

sdpvar a b
K1 = 1e3;
K2 = 8e3;
R = 2;
f = a.*(2.^(R./a)-1)./(K1*b) + (1-a).*(2.^(R./(1-a))-1)./(K2*(1-b));
optimize([.001 <= [a b] <= 0.9999],f);
value([a b])

ans =

0.5797    0.6721
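The solver's answer can be cross-checked without any modelling layer. For fixed $\alpha$ the objective has the form $c_1/\beta + c_2/(1-\beta)$ with $c_1 = \alpha(2^{R/\alpha}-1)/K_1$ and $c_2 = (1-\alpha)(2^{R/(1-\alpha)}-1)/K_2$; setting the $\beta$-derivative to zero gives the closed-form minimizer $\beta^* = \sqrt{c_1}/(\sqrt{c_1}+\sqrt{c_2})$ and the reduced value $(\sqrt{c_1}+\sqrt{c_2})^2$, which is then a one-dimensional convex problem in $\alpha$. The sketch below (my own cross-check, not part of the original answer; Python rather than MATLAB) solves the reduced problem by ternary search:

```python
# Cross-check: eliminate beta in closed form, then ternary-search alpha.
from math import sqrt

K1, K2, R = 1e3, 8e3, 2.0

def cs(a):
    c1 = a * (2**(R / a) - 1) / K1
    c2 = (1 - a) * (2**(R / (1 - a)) - 1) / K2
    return c1, c2

def reduced(a):
    # Objective after minimizing over beta in closed form.
    c1, c2 = cs(a)
    return (sqrt(c1) + sqrt(c2))**2

lo, hi = 1e-3, 1 - 1e-3
for _ in range(200):  # ternary search on the convex reduced objective
    m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
    if reduced(m1) < reduced(m2):
        hi = m2
    else:
        lo = m1
a = (lo + hi) / 2
c1, c2 = cs(a)
b = sqrt(c1) / (sqrt(c1) + sqrt(c2))
print(round(a, 4), round(b, 4))  # agrees with the fmincon solution
```

Both routes land on the same interior point, which is what one expects for a strictly convex problem: any local solver finds the unique global minimum.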