How to solve $\max_{\mu} \sum_{i=1}^n \log(x_i \cdot \mu)$ with $\sum _{j=1}^b \mu_j = 1$


$$ \max_{\mu} \sum_{i=1}^n \log(x_i \cdot \mu)\qquad\text{with}\qquad \sum _{j=1}^b \mu_j = 1,\qquad \mu_j \ge 0,\qquad x_{ij} \ge 0 $$ The problem is shown above, where $x_i$ and $\mu$ are vectors of length $b$. The $x_i$ are fixed, and I need to find the optimal $\mu$. I tried to apply the Lagrange multiplier method, but got stuck.

For example, $x_1 = [0, 0.1, 0.2, 0.7]$, $x_2 = [0.1, 0, 0.1, 0.8]$, ..., and $\mu = [0, 0, 0.5, 0.5]$.

In addition, in my context, $\sum_{j=1}^b x_{ij} = 1$ for all $i$. Though I think this condition is not necessary, it may convert the original question into another form. When every $x_i$ is a one-hot vector $[0,0,\ldots,1,\ldots,0,0]$, I solved $\mu_j = \frac{1}{n}\sum_{i=1}^n x_{ij}$ using properties of information entropy.
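This one-hot special case is easy to check numerically. A minimal sketch (the setup below is illustrative, not from the question): when every $x_i$ is one-hot, $x_i \cdot \mu$ picks out a single coordinate of $\mu$, and the empirical frequencies should maximize the objective.

```python
import numpy as np

rng = np.random.default_rng(0)
n, b = 1000, 4

# One-hot rows: x_i = e_{k_i}, so x_i . mu = mu_{k_i}.
k = rng.integers(0, b, size=n)
X = np.eye(b)[k]

def objective(mu):
    return np.sum(np.log(X @ mu))

# Candidate maximizer from the question: mu_j = (1/n) sum_i x_ij.
mu_star = X.mean(axis=0)

# No random feasible point on the simplex should beat it.
for _ in range(200):
    mu = rng.dirichlet(np.ones(b))
    assert objective(mu) <= objective(mu_star) + 1e-9
```

This is exactly maximum-likelihood estimation for a categorical distribution, which is why the empirical frequencies come out.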


There are 2 best solutions below


I guess that $\mu_i\geq 0$ for all $i$. You can reformulate the problem as

$$ \max \sum_{i=1}^n \log( y_i )\\ {\rm s.t.}\\ y_i = x_i\cdot \mu \quad i=1,\ldots,n\\ \sum_{j=1}^b \mu_j = 1\\ \mu_j \ge 0 \quad j=1,\ldots,b $$

Assuming that the $x_i$ are such that $y_i > 0$ for all $i$ (the logarithm requires strict positivity), this is a convex problem — a concave objective maximized over linear constraints — which you can solve with a standard optimizer.
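As a sketch of the standard-optimizer route (assuming SciPy's SLSQP method is acceptable; the data is the example from the question):

```python
import numpy as np
from scipy.optimize import minimize

# Each row is one fixed x_i (example vectors from the question).
X = np.array([[0.0, 0.1, 0.2, 0.7],
              [0.1, 0.0, 0.1, 0.8]])
b = X.shape[1]

def neg_objective(mu):
    # Negate the concave objective so it can be minimized.
    return -np.sum(np.log(X @ mu))

res = minimize(
    neg_objective,
    x0=np.full(b, 1.0 / b),                 # start at the uniform distribution
    method="SLSQP",
    bounds=[(0.0, 1.0)] * b,                # 0 <= mu_j <= 1
    constraints=[{"type": "eq", "fun": lambda mu: mu.sum() - 1.0}],  # sum mu_j = 1
)
mu_opt = res.x
```

For larger instances a disciplined convex-optimization modeler such as CVXPY would also work, since the objective is a sum of logarithms of affine functions of $\mu$.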


There is a closed-form solution for your problem under the following assumptions: $x_i > 0$ and $\mu_i > 0$ for all $i$, and the objective is read coordinatewise, i.e. as $\sum_{i=1}^N \log(x_i \mu_i)$ with scalar products $x_i \mu_i$ rather than dot products.

You can rewrite $\sum_{i=1}^N \log(x_i \mu_i) = \log\left(\prod_{i=1}^N x_i \mu_i\right)$. Using this identity, it is clear that your problem is equivalent to $$ \max\; c \prod_{i=1}^N \mu_i~~~{\rm s.t.} ~~ \sum_{i=1}^N \mu_i = 1,$$ where $c = \prod_{i=1}^N x_i$. The equivalence follows from the monotonicity of the logarithm: the logarithm is maximized exactly when its argument is maximized.

The solution to the equivalent problem is $\mu_i = 1/N$: by the AM–GM inequality, $\prod_{i=1}^N \mu_i \le \left(\frac{1}{N}\sum_{i=1}^N \mu_i\right)^N = N^{-N}$, with equality exactly when all the $\mu_i$ are equal.
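The claim that the uniform distribution maximizes the product $\prod_{i=1}^N \mu_i$ on the simplex can be sanity-checked numerically (a sketch; the dimension and sample count are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 5
uniform = np.full(N, 1.0 / N)

# Random feasible points on the simplex never beat the uniform distribution.
for _ in range(500):
    mu = rng.dirichlet(np.ones(N))
    assert np.prod(mu) <= np.prod(uniform) + 1e-12
```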