Is it possible to split an optimization problem into multiple subproblems if the objective function is a product of univariate functions?


I am a postgraduate electrical engineering student working on an optimization problem. In particular, the objective function of my problem has the form

$\begin{array}{*{20}{c}} {\left( P \right)}&{\mathop {\min }\limits_{x,y,z} }&{{g_1}\left( x \right) \times {g_2}\left( y \right) \times {g_3}\left( z \right)}\\ {}&{s.t}&{{x_L} \le {f_1}\left( x \right) \le {x_U}}\\ {}&{}&{{y_L} \le {f_2}\left( y \right) \le {y_U}}\\ {}&{}&{{z_L} \le {f_3}\left( z \right) \le {z_U}} \end{array}$

Here $x_L$, $y_L$ and $z_L$ are numerical lower bounds; likewise, $x_U$, $y_U$ and $z_U$ are numerical upper bounds.

My question: can I find the optimal decision variables $x, y, z$ of problem $(P)$ by solving the three separate univariate problems $P_1$, $P_2$ and $P_3$ below?

$\begin{array}{*{20}{c}} {\left( {{P_1}} \right)}&{\mathop {\min }\limits_x }&{{g_1}\left( x \right)}\\ {}&{s.t}&{{x_L} \le {f_1}\left( x \right) \le {x_U}} \end{array}$

$\begin{array}{*{20}{c}} {\left( {{P_2}} \right)}&{\mathop {\min }\limits_y }&{{g_2}\left( y \right)}\\ {}&{s.t}&{{y_L} \le {f_2}\left( y \right) \le {y_U}} \end{array}$

$\begin{array}{*{20}{c}} {\left( {{P_3}} \right)}&{\mathop {\min }\limits_z }&{{g_3}\left( z \right)}\\ {}&{s.t}&{{z_L} \le {f_3}\left( z \right) \le {z_U}} \end{array}$

Also, if the answer is "yes", what would be a rigorous justification for this approach?

Note that the $f_i(\cdot)$ and $g_i(\cdot)$ are sometimes not differentiable.

Thank you for your enthusiasm!

Accepted answer:

Without restrictions on the functions $g_1, g_2, g_3$, the general answer is no: you cannot optimize the variables separately.

For a concrete counterexample, take $g_1 = g_2 = g_3 = f_1 = f_2 = f_3$ with common function $g_1(t) = t$ and $$x_L = -1,\ x_U = 1,\ y_L = -1,\ y_U = 1,\ z_L = 1,\ z_U = 2.$$ The joint minimum is attained at $x = -1, y = 1, z = 2$, giving $xyz = -2$. Optimizing the variables separately instead gives $x = -1, y = -1, z = 1$, and hence $xyz = +1$.
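The counterexample above can be checked numerically with a brute-force grid search (a sketch, not part of the original answer; the grids are assumed discretizations of the boxes):

```python
import itertools

def joint_min(xs, ys, zs):
    """Brute-force the joint minimum of x*y*z over the grid product."""
    return min(x * y * z for x, y, z in itertools.product(xs, ys, zs))

# Coarse grids over the feasible intervals from the counterexample.
xs = [i / 10 for i in range(-10, 11)]    # [-1, 1]
ys = [i / 10 for i in range(-10, 11)]    # [-1, 1]
zs = [1 + i / 10 for i in range(0, 11)]  # [1, 2]

joint = joint_min(xs, ys, zs)           # attained at (-1, 1, 2) -> -2.0
separate = min(xs) * min(ys) * min(zs)  # (-1) * (-1) * 1 -> 1.0
print(joint, separate)
```

The separately optimized point is feasible but far from jointly optimal, exactly as the sign-flip argument predicts.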


However, all of these issues go away if $g_1, g_2, g_3 \geq 0$ (or at least non-negative on the feasible set). To see this, let $(x_j, y_j, z_j)$ be an optimal point when the variables are jointly optimized, and let $(x_s, y_s, z_s)$ collect the optimal arguments of the three separate problems. A quick check shows that $(x_s, y_s, z_s)$ is feasible for $(P)$, and therefore, since $(x_j, y_j, z_j)$ attains the joint minimum,
\begin{align} g_1(x_j)g_2(y_j)g_3(z_j) \leq g_1(x_s)g_2(y_s)g_3(z_s). \end{align}
On the other hand, since $x_s, y_s, z_s$ are separately optimal and $g_1, g_2, g_3$ are non-negative,
\begin{align} g_1(x_j)g_2(y_j)g_3(z_j) &\geq g_1(x_s)g_2(y_j)g_3(z_j) & (g_1(x_j) \geq g_1(x_s)) \\ &\geq g_1(x_s)g_2(y_s)g_3(z_j) & (g_2(y_j) \geq g_2(y_s)) \\ &\geq g_1(x_s)g_2(y_s)g_3(z_s) & (g_3(z_j) \geq g_3(z_s)) \end{align}
Combining the two, you actually have
$$ g_1(x_j)g_2(y_j)g_3(z_j) = g_1(x_s)g_2(y_s)g_3(z_s). $$
That is, the separate minimization achieves the same minimum as the joint minimization; in particular, $(x_s, y_s, z_s)$ is an optimal solution of $(P)$.
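The non-negative case can likewise be illustrated by brute force. Below is a sketch with made-up non-negative factors (including a non-differentiable one, matching the question's note); the specific $g_i$ and grids are assumptions for illustration only:

```python
import itertools

# Assumed example factors, all strictly positive on the feasible set.
g1 = lambda x: (x - 0.3) ** 2 + 0.2
g2 = lambda y: abs(y - 0.7) + 0.5      # non-differentiable, still fine
g3 = lambda z: (z - 1.5) ** 2 + 0.1

# Grids standing in for the separable box constraints.
xs = [i / 20 for i in range(-20, 21)]    # [-1, 1]
ys = [i / 20 for i in range(-20, 21)]    # [-1, 1]
zs = [1 + i / 20 for i in range(0, 21)]  # [1, 2]

# Joint minimization over all triples vs. three separate minimizations.
joint = min(g1(x) * g2(y) * g3(z)
            for x, y, z in itertools.product(xs, ys, zs))
separate = min(map(g1, xs)) * min(map(g2, ys)) * min(map(g3, zs))
print(joint, separate)  # equal, as the inequality chain guarantees
```

Because every factor is non-negative, shrinking any one factor can only shrink the product, so the componentwise minimizers are jointly optimal.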


Notes

  1. The key to making this all work is that $g_1,g_2,g_3$ are all non-negative which is what ensures that none of the signs end up flipping in the chain of inequalities above.

  2. The same result can be reached using properties of logarithms if you assume $g_1, g_2, g_3 > 0$, but the argument is a bit annoying if you allow them to equal $0$.

  3. Induction extends this to a product of $N$ non-negative factors.

  4. The constraints being separable is also critical to ensuring that the separate optimization returns a feasible solution.
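Note 2 above can be sketched as follows: when $g_i > 0$, the logarithm is defined and strictly increasing, so each $\log g_i$ has the same minimizer as $g_i$, and minimizing $\sum_i \log g_i$ separably is equivalent to minimizing the product (the example functions and grids are assumptions):

```python
import math

# Assumed strictly positive example factors, so log is defined everywhere.
g1 = lambda x: math.exp((x - 0.3) ** 2)
g2 = lambda y: 1 + abs(y - 0.7)
g3 = lambda z: 2 + (z - 1.5) ** 2

xs = [i / 20 for i in range(-20, 21)]    # [-1, 1]
ys = [i / 20 for i in range(-20, 21)]    # [-1, 1]
zs = [1 + i / 20 for i in range(0, 21)]  # [1, 2]

# argmin of log g_i equals argmin of g_i, since log is strictly increasing.
x_star = min(xs, key=lambda x: math.log(g1(x)))
y_star = min(ys, key=lambda y: math.log(g2(y)))
z_star = min(zs, key=lambda z: math.log(g3(z)))

product = g1(x_star) * g2(y_star) * g3(z_star)
print(x_star, y_star, z_star, product)
```

The log transform turns the product objective into a separable sum, which is why the decomposition into $P_1$, $P_2$, $P_3$ works in the strictly positive case.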