Constrained optimization problem with multiple variables.


I am looking for the solution to a constrained maximization problem,

\begin{align} f(w_1,w_2,\dots,w_n) = &\prod_{i=1}^{m}(C_{i1}w_1+C_{i2}w_2+\dots+C_{in}w_n)\notag\\ \text{such that}\;\; &\sum_{j=1}^{n}w_j = 1\notag \end{align}

where $m, n, C_{ij}$ are all constants.

I want to find the $w_j (j=1,\dots,n)$ to maximize $f$.

I have two thoughts:

  1. If $f$ is convex, use gradient descent (or rather gradient ascent, since this is a maximization). I'm not sure whether that works with the constraint on $w$.
  2. $f$ looks similar to a likelihood function from maximum likelihood estimation, so perhaps a related algorithm such as EM applies, but I'm struggling to connect $f$ to that framework.

If neither approach works, is there any other method to solve this problem?
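As a baseline, the problem can be attacked numerically: maximizing $f$ is equivalent to maximizing $\log f = \sum_i \log\big(\sum_j C_{ij} w_j\big)$, which is better behaved than the raw product. A minimal sketch using `scipy.optimize.minimize` with SLSQP and the equality constraint $\sum_j w_j = 1$; the matrix `C` here is invented for illustration, and the nonnegativity bounds on $w$ (which the original problem does not impose) are an assumption to keep each factor positive so the logarithm is defined:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical problem data: m = 4 factors, n = 3 weights (not from the question).
rng = np.random.default_rng(0)
C = rng.uniform(0.5, 2.0, size=(4, 3))
n = C.shape[1]

def neg_log_f(w):
    # Minimizing -log f = -sum_i log(sum_j C_ij w_j) is equivalent to maximizing f.
    return -np.sum(np.log(C @ w))

res = minimize(
    neg_log_f,
    x0=np.full(n, 1.0 / n),                              # start at uniform weights
    method="SLSQP",
    bounds=[(1e-9, None)] * n,                           # assumption: w_j >= 0 keeps C @ w positive
    constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
)
w_opt = res.x
print(w_opt, w_opt.sum())
```

With positive entries of `C` and nonnegative weights, each linear factor stays positive, so the log transform is safe on the feasible set.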

BEST ANSWER

Following the method of Lagrange multipliers, calling

$$P(\omega) = \prod_{i=1}^m\sum_{j=1}^n c_{ij}\omega_j$$

and forming the Lagrangian

$$ L(\omega,\lambda) = P(\omega)+\lambda \left(\sum_{j=1}^n \omega_j-1\right) $$

the stationary points are the solutions to

$$ \cases{ \displaystyle\sum_{k=1}^m\frac{c_{k \nu}}{\sum_{j=1}^n c_{k j}\omega_j}-\mu=0, \quad \nu = 1,\dots,n\\ \displaystyle\sum_{j=1}^n \omega_j-1=0 } $$

with the new multiplier $\mu = -\frac{\lambda}{P(\omega)}$: the first $n$ equations come from setting $\frac{\partial L}{\partial\omega_\nu} = P(\omega)\sum_{k=1}^m\frac{c_{k\nu}}{\sum_{j=1}^n c_{kj}\omega_j} + \lambda = 0$ and dividing through by $P(\omega)$.