I have an optimization problem in the form
$$\text{Minimize}\hspace{1mm}D$$
$$\text{subject to}$$
$$\sigma_1+\sigma_2=\sigma$$
$$\rho_1+\rho_2=\rho$$
$$\epsilon\le\rho_i\le c_i,\quad i=1,2$$
$$\sigma_i\ge 0,\quad i=1,2$$
$$(c_i-\rho_i)D\ge \sigma_i+f_i(c_i-\rho_i),\quad i=1,2$$
I think a feasible solution can be found by bisection search.
Here $\sigma_1$, $\sigma_2$, $\rho_1$ and $\rho_2$ are optimization variables.
For given values of $\sigma$, $\rho$, $c_1$, $c_2$, $f_1$, $f_2$ and $\epsilon$, how can we solve this problem?
For example, $\sigma=25$, $\rho=10$, $f_1=4$, $f_2=5$, $c_1=2.5$, $c_2=3$ and $\epsilon=0.0001$.
Once you fix $D$, your problem becomes a linear programming (LP) feasibility problem in $\sigma_1,\sigma_2,\rho_1,\rho_2$.
Hence one possible approach is to start with a value of $D$, and solve the corresponding LP.
If it is feasible, try a smaller value of $D$; if it is not feasible, try a bigger value of $D$. Bisection is valid here because $c_i-\rho_i\ge 0$, so increasing $D$ only relaxes the last constraint: feasibility is monotone in $D$.
We can first try $D=0$. If it is not feasible, we know the minimal $D$ is positive; if it is feasible, the minimal $D$ is nonpositive.
If the minimal $D$ is positive, you can try $D=2^{i}$, $i=1,2,3,\ldots$ until the LP becomes feasible, then bisect within the last interval to find the best $D$.
If the minimal $D$ is nonpositive, decrease $D$ until the LP becomes infeasible, then bisect.
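The procedure above can be sketched in code. The following is a minimal sketch, assuming SciPy is available; the helper names `feasible` and `min_D` are mine, not from the question. One caveat about the question's example numbers: $\rho=10$ but $c_1+c_2=5.5$, so $\rho_1+\rho_2=\rho$ together with $\rho_i\le c_i$ is infeasible for every $D$; the search below reports that by returning `None`.

```python
# Sketch of the fix-D-then-solve-LP bisection described above.
# Assumes SciPy; variables x = [sigma1, sigma2, rho1, rho2].
from scipy.optimize import linprog

def feasible(D, sigma, rho, c, f, eps):
    """Is there (sigma1, sigma2, rho1, rho2) satisfying all constraints
    for this fixed D?  The last constraint is rewritten linearly as
    sigma_i + (D - f_i) rho_i <= (D - f_i) c_i."""
    A_eq = [[1, 1, 0, 0],           # sigma1 + sigma2 = sigma
            [0, 0, 1, 1]]           # rho1 + rho2 = rho
    b_eq = [sigma, rho]
    A_ub = [[1, 0, D - f[0], 0],
            [0, 1, 0, D - f[1]]]
    b_ub = [(D - f[0]) * c[0], (D - f[1]) * c[1]]
    bounds = [(0, None), (0, None), (eps, c[0]), (eps, c[1])]
    res = linprog([0, 0, 0, 0], A_ub=A_ub, b_ub=b_ub,
                  A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
    return res.status == 0          # 0 = solved (feasible), 2 = infeasible

def min_D(sigma, rho, c, f, eps, tol=1e-6, cap=1e6):
    """Doubling search to bracket the minimal D, then bisection to tol.
    Returns None if no D up to `cap` is feasible."""
    if feasible(0.0, sigma, rho, c, f, eps):
        # Minimal D is nonpositive: decrease D until infeasible.
        hi, lo = 0.0, -1.0
        while feasible(lo, sigma, rho, c, f, eps):
            hi, lo = lo, 2.0 * lo
            if lo <= -cap:
                return hi           # appears unbounded below within the cap
    else:
        # Minimal D is positive: double D until feasible.
        lo, hi = 0.0, 1.0
        while not feasible(hi, sigma, rho, c, f, eps):
            lo, hi = hi, 2.0 * hi
            if hi > cap:
                return None         # no feasible D found up to the cap
    # Invariant: lo infeasible, hi feasible (monotone in D).
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if feasible(mid, sigma, rho, c, f, eps):
            hi = mid
        else:
            lo = mid
    return hi

# With the question's numbers the bound constraints alone rule out every D:
print(min_D(25, 10, [2.5, 3], [4, 5], 1e-4))   # None
# Replacing rho=10 by e.g. rho=4 (my modification) gives a finite answer:
print(min_D(25, 4, [2.5, 3], [4, 5], 1e-4))
```

Each call to `feasible` is a small LP with a zero objective, so the solver only tests feasibility; the whole search costs a logarithmic number of such LPs.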