I have a non-convex constrained optimization problem for which I want to find approximate solutions:
First, define a parameterized function $$g(s_i) = g(s_i;\mathbf{x})= \operatorname{sigmoid}(x_0 + x_1 \cdot s_i) \cdot x_2 + x_3,$$ where $\mathbf{x} = (x_0, x_1, x_2, x_3)$ is the parameter vector to optimize.
The values $0\leq s_i \leq 1$ come from known, fixed data. The sigmoid function is defined as $\operatorname{sigmoid}(x) = \frac{1}{1 + e^{-x}}$.
We also have $1 \leq i \leq n$, where $n$ is a constant, and $C_{sum}$ is a constant satisfying $0 < C_{sum} < n$.
The optimization problem is:
minimize:
$$f(\mathbf{x}) = -\sum_i s_i \cdot g(s_i)$$
constraints:
$$\sum_ig(s_i) = \mathit{C_{sum}}$$
$$x_1 > 0$$
$$x_2 > 0$$
$$0 \leq g(s_i) \leq 1 \quad \text{for all } i$$
We are trying to find the value of $\mathbf{x}$ that minimizes $f(\mathbf{x})$.
We can assume the data contains at most 100 values $s_i$, i.e., $n \leq 100$.
Also, any approximate solution is welcome.
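Since any approximate solution is acceptable, one possible approach is a general-purpose nonlinear solver such as `scipy.optimize.minimize` with SLSQP, which handles both the equality constraint and the inequality constraints directly. The sketch below is illustrative only: the data `s` is synthetic, and the values of `n`, `C_sum`, the starting point `x0`, and the small epsilon used to approximate the strict inequalities $x_1 > 0$, $x_2 > 0$ are all assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 50                       # assumed; the problem only requires n <= 100
s = rng.uniform(0.0, 1.0, n) # synthetic stand-in for the known, fixed data
C_sum = 20.0                 # assumed; must satisfy 0 < C_sum < n

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def g(x, s):
    # g(s_i; x) = sigmoid(x0 + x1 * s_i) * x2 + x3
    return sigmoid(x[0] + x[1] * s) * x[2] + x[3]

def f(x):
    # objective: f(x) = -sum_i s_i * g(s_i)
    return -np.sum(s * g(x, s))

constraints = [
    # equality: sum_i g(s_i) = C_sum
    {"type": "eq", "fun": lambda x: np.sum(g(x, s)) - C_sum},
    # SLSQP inequalities are fun(x) >= 0, so 0 <= g(s_i) <= 1 becomes:
    {"type": "ineq", "fun": lambda x: g(x, s)},
    {"type": "ineq", "fun": lambda x: 1.0 - g(x, s)},
]
# x1 > 0 and x2 > 0 are open constraints; approximate with a small lower bound
bounds = [(None, None), (1e-6, None), (1e-6, None), (None, None)]

x0 = np.array([0.0, 1.0, 0.5, 0.25])  # arbitrary starting point
res = minimize(f, x0, method="SLSQP", bounds=bounds, constraints=constraints)
print(res.x, res.fun)
```

Because the problem is non-convex, SLSQP only finds a local optimum; restarting from several random `x0` values and keeping the best feasible result is a common way to improve the approximation.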