When is minimizing the sum of images of $f$ equivalent to minimizing the sum of independent variables?


I have to admit I am not good at math, but this is a problem I am having trouble with.

What kind of function $f$ can guarantee that

$$\arg\min\sum_{i=1}^Kf(x_i) = \arg\min\sum_{i=1}^Kx_i$$

Thank you. The cost function $f$ is not explicitly defined here, so the problem cannot be solved as a standard optimization problem. The $x_i$ are feasible solutions of a plan. We want to know under what conditions the problem can be reduced to $\min\sum_{i=1}^Kx_i$, which can be solved. There are no explicit constraints between the $x_i$.
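To see why some condition on $f$ is needed at all, here is a small numeric sketch; the candidate plans and the non-monotone cost $f(x)=(x-3)^2$ are made up for illustration, and with them the two minimizers disagree:

```python
# Hypothetical feasible plans, each a tuple (x_1, x_2).
candidates = [(-2.0, 1.0), (0.0, 0.5), (3.0, 3.0)]

# A non-monotone cost: minimized at x = 3, not at small x.
f = lambda x: (x - 3.0) ** 2

# Minimizer of sum x_i vs. minimizer of sum f(x_i).
best_plain = min(candidates, key=lambda xs: sum(xs))
best_f = min(candidates, key=lambda xs: sum(f(x) for x in xs))

print(best_plain)  # smallest raw sum
print(best_f)      # smallest f-sum; differs from best_plain here
```

With these numbers, $\sum x_i$ is smallest for $(-2, 1)$ but $\sum f(x_i)$ is smallest for $(3, 3)$, so the reduction fails for this $f$.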


If the variables $x_i$ are independent (each one can be chosen separately from its own feasible set), all you need is for $f$ to be strictly increasing.

Generally, for any function $f = g \circ h$, if $g$ is strictly increasing, then $f$ and $h$ vary in the same direction. In particular, $f$ and $h$ have the same minimizers and maximizers.
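A quick numeric check of this composition claim; the particular choices $g=\exp$ (strictly increasing) and $h(x)=(x-2)^2$ are arbitrary:

```python
import math

# h has its minimum at x = 2; g = exp is strictly increasing on R.
h = lambda x: (x - 2.0) ** 2
g = math.exp
f = lambda x: g(h(x))  # f = g o h

# Minimize both over the same grid of points -5.0, -4.9, ..., 5.0.
grid = [i / 10.0 for i in range(-50, 51)]
argmin_h = min(grid, key=h)
argmin_f = min(grid, key=f)

print(argmin_h, argmin_f)  # the two minimizers coincide
```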

To understand where this comes from (this is not a proof), consider the single-variable case. The derivative of $f$ with respect to $x$ is $$ f'(x)=g'(h(x))\cdot h'(x). $$ Since $g$ is strictly increasing, $g'(h(x))>0$, so $f'(x)$ and $h'(x)$ are simultaneously positive or negative, i.e., $f$ and $h$ have the same variations. The same logic holds in the multi-variable case, with the gradient in place of the derivative: $\nabla f(x) = g'(h(x))\,\nabla h(x)$.
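Putting the answer together: when the coordinates are independent (every combination of per-coordinate choices is feasible) and $f$ is strictly increasing, both problems pick the same plan, because each coordinate can be minimized on its own. A sketch with made-up per-coordinate feasible sets and $f=\exp$:

```python
import math
import itertools
import random

random.seed(1)

# Hypothetical per-coordinate feasible sets; independence means the
# feasible plans are the full Cartesian product of these sets.
sets = [sorted(random.sample(range(-10, 11), 4)) for _ in range(3)]
candidates = list(itertools.product(*sets))

f = math.exp  # strictly increasing on R

best_plain = min(candidates, key=lambda xs: sum(xs))
best_f = min(candidates, key=lambda xs: sum(f(x) for x in xs))

# Both objectives select the per-coordinate minima, hence the same plan.
print(best_plain == best_f)
```

If the $x_i$ were coupled (only some combinations feasible), this equivalence can break even for strictly increasing $f$, because a large $f(x_i)$ in one coordinate can outweigh savings elsewhere.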