Linearization of a min function


I'm trying to solve an optimization problem with a constraint of the form

$\alpha + \sum^K_{k=1}\frac{x[k]}{\min(f_1(x[k]),\cdots,f_{n_k}(x[k]))} \le \epsilon$, where $x[k] > 0,\, \forall k$ are the optimization variables

My question: Is it correct to assume that this constraint is equivalent to:

$\alpha + \sum^K_{k=1} x[k]\,\max_{j_k}\left\{\frac{1}{f_{j_k}(x[k])}\right\} \le \epsilon \;\Longleftrightarrow\; \alpha + \sum^K_{k=1} \frac{x[k]}{f_{j_k}(x[k])} \le \epsilon \quad \forall\, k, j_k$

where the functions $f_{j_k}(x[k])=\log(1+\beta_{j_k}x[k]),\ \forall j_k,k$ are concave and always positive for $x[k]>0$, and $\alpha,\epsilon > 0$ and $\beta_{j_k}> 0$ for all $j_k$.
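For what it's worth, the first step of the rewriting, $\frac{1}{\min_j f_j(x)} = \max_j \frac{1}{f_j(x)}$ (valid because all $f_j(x)>0$ here), is easy to sanity-check numerically. Below is an illustrative Python sketch with made-up values of $\beta_{j_k}$ and $x[k]$ (not from the actual problem):

```python
import math

# Hypothetical example: K = 2 terms, each with its own set of beta coefficients
betas = [[0.5, 2.0, 1.3], [0.7, 3.1]]
x = [1.5, 0.4]  # optimization variables, x[k] > 0

def f(beta, xk):
    # f_{j_k}(x[k]) = log(1 + beta_{j_k} * x[k]); concave, positive for x[k] > 0
    return math.log(1.0 + beta * xk)

# Original form: x[k] / min_j f_j(x[k])
lhs = sum(xk / min(f(b, xk) for b in bs) for xk, bs in zip(x, betas))

# Rewritten form: x[k] * max_j { 1 / f_j(x[k]) }
rhs = sum(xk * max(1.0 / f(b, xk) for b in bs) for xk, bs in zip(x, betas))

# The two sums agree term by term (up to floating-point rounding)
assert abs(lhs - rhs) < 1e-12
```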

Thanks