I would like to find an algorithm that computes the minimum of $f(X,Y)$ (or a close approximation to it):
$$
f(X,Y)=\sum^N_{i=1}{x_it_i}+\sum^N_{i=1}{x_iy_ir_i}
$$
$$
\begin{align}
\text{s.t.}\quad & x_i \in \{0,1\} \tag 1\\
& t_i \ge 0 \tag 2\\
& AX = b \tag 3\\
& C_0-\sum^k_{i=1}{x_id_i}+\sum^{k-1}_{i=1}{x_iy_ir_i}\ge 0 \tag 4\\
& \sum^k_{i=1}{x_id_i}-\sum^k_{i=1}{x_iy_ir_i}\ge 0 \tag 5
\end{align}
$$
The constants:
$$
\begin{align}
C_0 &= \text{constant} \\
R &= \{r_1,\dots,r_N\} \\
D &= \{d_1,\dots,d_N\} \\
T &= \{t_1,\dots,t_N\} \\
B &= \{b_1,\dots,b_M\}
\end{align}
$$
Constraints (4) and (5) each exist in $N$ versions, one for every $k \in \{1,2,\dots,N\}$. I am looking for an algorithm that I can implement myself, without buying an expensive solver such as CPLEX or Gurobi.
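To make the formulation concrete, here is a minimal NumPy sketch of the objective and a feasibility check, assuming $X$ is the binary vector, $Y$ the continuous vector, and the constants are given as arrays. The instance data (`t`, `r`, `d`, `A`, `b`, `C0`) is invented just so the snippet runs; it mirrors the symbols $T, R, D, A, B, C_0$ above.

```python
import numpy as np

# Hypothetical instance data, only so the sketch is runnable.
rng = np.random.default_rng(0)
N, M = 8, 3
t = rng.random(N)                     # T = {t_1, ..., t_N}, t_i >= 0
r = rng.random(N)                     # R = {r_1, ..., r_N}
d = rng.random(N)                     # D = {d_1, ..., d_N}
A = rng.random((M, N))                # coupling matrix in constraint (3)
b = A @ rng.integers(0, 2, N)         # chosen so some binary X satisfies (3)
C0 = 2.0

def objective(x, y):
    """f(X, Y) = sum_i x_i t_i + sum_i x_i y_i r_i."""
    return np.dot(x, t) + np.dot(x * y, r)

def feasible(x, y, tol=1e-9):
    """Check constraints (1), (3), (4), (5) for a candidate (X, Y)."""
    if not np.all(np.isin(x, (0.0, 1.0))):              # (1) x_i in {0, 1}
        return False
    if not np.allclose(A @ x, b, atol=tol):             # (3) A X = b
        return False
    for k in range(1, N + 1):                           # one copy of (4), (5) per k
        lhs4 = C0 - np.dot(x[:k], d[:k]) + np.dot(x[:k-1] * y[:k-1], r[:k-1])
        lhs5 = np.dot(x[:k], d[:k]) - np.dot(x[:k] * y[:k], r[:k])
        if lhs4 < -tol or lhs5 < -tol:
            return False
    return True
```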
Ideas:
- Constraints (2), (4) and (5) could be reframed as log-barrier terms.
- Constraint (1) can be rewritten as $x_i^2-x_i=0$.
- The objective function and constraints (4) & (5) can be rewritten in a quadratic (bilinear) form, as sketched after this list.
- Can this problem be solved with gradient descent (GD), or with Newton's method / Lagrange multipliers? (A rough attempt combining the log-barrier and penalty ideas with gradient descent is also sketched below.)
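For the quadratic-form idea: the objective is bilinear in $(X,Y)$, $f = t^\top x + x^\top \mathrm{diag}(r)\,y$, and the prefix sums in (4) and (5) can be collected with a lower-triangular matrix of ones, so the $N$ copies of each constraint become a single vector inequality. A small sketch (data and the candidate point are invented for illustration):

```python
import numpy as np

# Illustration only: random data and a random candidate (x, y).
rng = np.random.default_rng(2)
N = 5
t, r, d = rng.random(N), rng.random(N), rng.random(N)
x = rng.integers(0, 2, N).astype(float)
y = rng.random(N)
C0 = 10.0

# Objective as a bilinear form:  f = t^T x + x^T diag(r) y
f = t @ x + x @ np.diag(r) @ y

# A lower-triangular matrix of ones collects the prefix sums, so the N
# copies of (4) and (5) become two vector inequalities; the product x*y
# keeps them bilinear rather than linear.
L  = np.tril(np.ones((N, N)))             # sums up to index k
Ls = np.tril(np.ones((N, N)), k=-1)       # sums up to index k-1
g4 = C0 - L @ (x * d) + Ls @ (x * y * r)  # constraint (4): g4 >= 0
g5 = L @ (x * d) - L @ (x * y * r)        # constraint (5): g5 >= 0
print(f, g4 >= 0, g5 >= 0)
```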
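For the log-barrier / penalty / gradient-descent idea, here is a rough sketch of what I have in mind: keep (4) and (5) behind a log-barrier, replace $x_i \in \{0,1\}$ by the penalty $(x_i^2-x_i)^2$, penalize $AX=b$ quadratically, and run plain gradient descent with a finite-difference gradient while the barrier weight $\mu$ is shrunk and the penalty weight $\rho$ is grown. The instance data, the helper names (`slacks`, `phi`, `num_grad`), the $\mu$/$\rho$ schedule, and the step-size rule are all just assumptions to make it run, not something I know to be adequate:

```python
import numpy as np

# Invented instance data, only for illustration.
rng = np.random.default_rng(1)
N, M = 6, 2
t, r, d = rng.random(N), rng.random(N), rng.random(N)
A = rng.random((M, N))
b = A @ rng.integers(0, 2, N)        # so that constraint (3) is satisfiable
C0 = float(d.sum()) + 1.0            # large enough that (4) has interior points

def f(x, y):
    return np.dot(x, t) + np.dot(x * y, r)

def slacks(x, y):
    """Left-hand sides of (4) and (5) for every k = 1..N (must stay > 0)."""
    s = []
    for k in range(1, N + 1):
        s.append(C0 - np.dot(x[:k], d[:k]) + np.dot(x[:k-1] * y[:k-1], r[:k-1]))
        s.append(np.dot(x[:k], d[:k]) - np.dot(x[:k] * y[:k], r[:k]))
    return np.array(s)

def phi(z, mu, rho):
    """f + log-barrier for (4)-(5) + quadratic penalties for (1) and (3)."""
    x, y = z[:N], z[N:]
    s = slacks(x, y)
    if np.any(s <= 0):
        return np.inf                              # outside the barrier's domain
    return (f(x, y)
            - mu * np.sum(np.log(s))
            + rho * np.sum((x**2 - x)**2)          # pushes x_i toward {0, 1}
            + rho * np.sum((A @ x - b)**2))        # pushes toward A X = b

def num_grad(fun, z, eps=1e-6):
    """Central finite differences -- crude but dependency-free."""
    g = np.zeros_like(z)
    for i in range(z.size):
        e = np.zeros_like(z); e[i] = eps
        fp, fm = fun(z + e), fun(z - e)
        g[i] = (fp - fm) / (2 * eps) if np.isfinite(fp) and np.isfinite(fm) else 0.0
    return g

z = np.concatenate([np.full(N, 0.5), np.zeros(N)])   # strictly interior start
mu, rho = 1.0, 1.0
for outer in range(15):
    obj = lambda v: phi(v, mu, rho)
    for _ in range(100):
        g = num_grad(obj, z)
        f0, step = obj(z), 1.0
        while step > 1e-12 and obj(z - step * g) > f0 - 1e-4 * step * (g @ g):
            step *= 0.5                              # backtracking line search
        if step <= 1e-12:
            break                                    # no descent step found
        z = z - step * g
    mu *= 0.5                                        # tighten the barrier
    rho *= 2.0                                       # tighten the penalties

x_round, y_final = np.round(z[:N]), z[N:]
print("relaxed x:", np.round(z[:N], 3))
print("rounded x:", x_round, " y:", np.round(y_final, 3))
print("objective at rounded point:", f(x_round, y_final))
```

I am aware that rounding the relaxed $x$ at the end may break feasibility, and that this whole scheme gives no optimality guarantee; it is only meant to show the direction I am considering.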
Thanks for any ideas/clues.
Best