This is a soft question. Most of the links I have found only address optimization with equality constraints via analytical methods such as Lagrange multipliers. Say I want to maximize an objective function $f(\mathbf{\theta})$ (note: $\mathbf{\theta}$ is an $n$-dimensional vector) that is not necessarily concave, subject to equality constraints in which subsets of the parameters sum to 1 and these subsets overlap across constraints. For example, $\theta_1+\theta_2+\theta_3=1$, $\theta_2+\theta_3+\theta_4=1$, $\theta_4+\theta_5+\theta_6=1$, $\dots$ Suppose there are $m$ of these equality constraints. My questions are:
- Are there computational methods suited to this optimization when $m$ is large? Any links or references would be appreciated.
- Generally speaking, will I face any complexity issues when performing the optimization? In your experience, how large can $m$ get before the problem becomes computationally infeasible?
- Will I be able to achieve a global maximum?
I understand that much of this depends on the objective function $f(\mathbf{\theta})$. I do not know the term for classifying $f(\mathbf{\theta})$, but it has the form:
$$f(\mathbf{\theta})=\frac{(\sum^n_{i=1}w_i \theta_i)^2}{n\sum_{i=1}^n(w_i\theta_i)^2}$$
where the $w_i$ are constants.
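To make the setup concrete, here is a minimal sketch of how I would attack a small instance numerically with SciPy's SLSQP solver. Everything here is a placeholder assumption: a toy size $n=6$, the three example constraints above stacked into a matrix $A\theta = b$, random weights $w_i$, and a feasible starting point. It is not a claim about what works at large $m$; it only shows how the overlapping constraints and the ratio objective can be encoded.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical toy instance: n = 6 parameters, m = 3 overlapping constraints.
n = 6
rng = np.random.default_rng(0)
w = rng.uniform(0.5, 2.0, size=n)  # placeholder constants w_i

def f(theta):
    # f(theta) = (sum_i w_i theta_i)^2 / (n * sum_i (w_i theta_i)^2)
    s = w * theta
    return s.sum() ** 2 / (n * (s ** 2).sum())

def neg_f(theta):
    # SciPy minimizes, so maximize f by minimizing -f.
    return -f(theta)

# Overlapping equality constraints from the question, written as A @ theta = b:
# theta1+theta2+theta3 = 1, theta2+theta3+theta4 = 1, theta4+theta5+theta6 = 1.
A = np.array([
    [1, 1, 1, 0, 0, 0],
    [0, 1, 1, 1, 0, 0],
    [0, 0, 0, 1, 1, 1],
], dtype=float)
b = np.ones(A.shape[0])

cons = {"type": "eq", "fun": lambda th: A @ th - b}
theta0 = np.full(n, 1.0 / 3.0)  # feasible start: each row of A sums 3 entries of 1/3

res = minimize(neg_f, theta0, method="SLSQP", constraints=cons)
print(res.success, f(res.x), A @ res.x)
```

Note that by the Cauchy–Schwarz inequality $f(\mathbf{\theta}) \le 1$, with equality when all $w_i\theta_i$ are equal, so the solver's value can be sanity-checked against that bound. Since $f$ is not concave in general, a local solver like this only guarantees a local maximum; multi-start from different feasible points is one cheap hedge.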