I'm wondering if there are analytical approaches to solve these problems (I found them in a book by Stephen Boyd):
minimize $f_0(x_1,x_2)$
subject to
- $2x_1+x_2\ge1$
- $x_1+3x_2\ge1$
- $x_1\ge0$
- $x_2\ge0$
There are five options for $f_0(x_1,x_2)$, but I'm only interested in the following two:
- $f_0(x_1,x_2)=x_1^2+9x_2^2$
- $f_0(x_1,x_2)=\max\{x_1,x_2\}$
My first thought was to set the gradient to zero, but that only works for smooth, unconstrained problems. $f_0(x_1,x_2)=x_1^2+9x_2^2$ is smooth, yet this problem has two linear inequality constraints plus the nonnegativity constraints, so the unconstrained condition $\nabla f_0=0$ does not apply directly. The other function, $f_0(x_1,x_2)=\max\{x_1,x_2\}$, is not even smooth, so the gradient is of no use there.
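For the smooth objective, the standard analytical tool for constrained problems is the Karush-Kuhn-Tucker (KKT) conditions rather than the bare gradient. A sketch, under the guess (which then has to be checked) that only the constraint $x_1+3x_2\ge1$ is active at the optimum:

```latex
% KKT sketch for f_0 = x_1^2 + 9x_2^2, assuming only x_1 + 3x_2 >= 1 is active
\begin{aligned}
\nabla f_0(x) = \lambda\,\nabla(x_1+3x_2)
  &\;\Longrightarrow\; (2x_1,\,18x_2) = \lambda\,(1,\,3),\\
  &\;\Longrightarrow\; x_1 = \tfrac{\lambda}{2},\quad x_2 = \tfrac{\lambda}{6},\\
x_1 + 3x_2 = 1
  &\;\Longrightarrow\; \tfrac{\lambda}{2} + \tfrac{\lambda}{2} = 1
   \;\Longrightarrow\; \lambda = 1,\\
  &\;\Longrightarrow\; (x_1,x_2) = \bigl(\tfrac12,\tfrac16\bigr).
\end{aligned}
```

At $(\tfrac12,\tfrac16)$ the multiplier is $\lambda=1\ge0$, and the remaining constraints hold strictly ($2x_1+x_2=\tfrac76>1$, $x_1,x_2>0$), so the KKT conditions are satisfied.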
Then I decided to try a numerical approach and I wrote this code for the first problem $f_0(x_1,x_2)=x_1^2+9x_2^2$ in Matlab:
fun=@(x)x(1)^2+9*x(2)^2;   % objective f0
A=[-2 -1;-1 -3];           % A*x <= b encodes 2*x1+x2 >= 1 and x1+3*x2 >= 1
b=[-1;-1];
lb=[0,0];                  % x1 >= 0, x2 >= 0
ub=[inf,inf];
x0=[1,1];                  % feasible starting point
Aeq=[];                    % no equality constraints
beq=[];
x = fmincon(fun,x0,A,b,Aeq,beq,lb,ub)
This code returned the optimal point $(x_1,x_2)=(0.5,0.1667)$. I tried to write something similar for the second problem, $f_0(x_1,x_2)=\max\{x_1,x_2\}$, but I was unable to do so with my current knowledge of Matlab.
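One standard trick for the second problem is the epigraph reformulation: minimizing $\max\{x_1,x_2\}$ is equivalent to minimizing a new variable $t$ subject to $x_1\le t$ and $x_2\le t$, which turns the problem into a linear program. Here is a sketch in Python with `scipy.optimize.linprog` (the same idea works with Matlab's `linprog`; the variable ordering $(x_1,x_2,t)$ is my own choice):

```python
from scipy.optimize import linprog

# Epigraph reformulation: minimize t  s.t.  x1 <= t, x2 <= t,
# 2*x1 + x2 >= 1, x1 + 3*x2 >= 1, x1 >= 0, x2 >= 0.
# Decision vector is z = (x1, x2, t); linprog expects A_ub @ z <= b_ub.
c = [0, 0, 1]                      # objective: minimize t
A_ub = [[-2, -1,  0],              # -2*x1 -   x2       <= -1
        [-1, -3,  0],              #   -x1 - 3*x2       <= -1
        [ 1,  0, -1],              #    x1        - t   <=  0
        [ 0,  1, -1]]              #           x2 - t   <=  0
b_ub = [-1, -1, 0, 0]
bounds = [(0, None), (0, None), (None, None)]  # x1, x2 >= 0; t free

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.x)   # roughly [1/3, 1/3, 1/3]
```

The optimal $t$ equals the optimal value of $\max\{x_1,x_2\}$, and the solver returns $(x_1,x_2)\approx(1/3,1/3)$.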
Finally I tried to use a graphical approach to solve the second one and I got this picture:
(figure: 3D representation of $\max\{x_1,x_2\}$ over the feasible region)
Reading off that graph, I got a rough approximation of the optimal point, $(x_1,x_2)\approx(0.334,0.334)$.
I then found the book's solutions to these problems. For the first one, $f_0(x_1,x_2)=x_1^2+9x_2^2$, it gives the optimal point $(x_1,x_2)=(1/2,1/6)$, which is exactly what my Matlab code returned. For the second one, $f_0(x_1,x_2)=\max\{x_1,x_2\}$, it gives $(x_1,x_2)=(1/3,1/3)$, which is very close to my graphical estimate.
So here is my question again: is there an analytical approach to solving these problems? I ask because the book's solutions are exact fractions, which suggests they were derived analytically rather than numerically.
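For the $\max$ objective, at least, a short analytical argument seems possible (my own bound, not taken from the book): since $\max\{x_1,x_2\}\ge x_1$ and $\max\{x_1,x_2\}\ge x_2$, the first constraint yields a lower bound that the point $(1/3,1/3)$ attains:

```latex
% Lower bound from 2x_1 + x_2 >= 1:
3\max\{x_1,x_2\} \;\ge\; 2x_1 + x_2 \;\ge\; 1
\quad\Longrightarrow\quad \max\{x_1,x_2\} \;\ge\; \tfrac13,
```

and $(x_1,x_2)=(1/3,1/3)$ is feasible ($2\cdot\tfrac13+\tfrac13=1$ and $\tfrac13+3\cdot\tfrac13=\tfrac43\ge1$) with objective value exactly $1/3$, so it is optimal.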