$$ \max_{\mu} \sum_{i=1}^n \log(x_i \cdot \mu)\qquad\text{with}\qquad \sum_{j=1}^b \mu_j = 1,\qquad \mu_j \ge 0,\qquad x_{ij} \ge 0 $$ The problem is shown above, where $x_i$ and $\mu$ are vectors of length $b$. The $x_i$ are fixed, and I need to find the optimal $\mu$. I tried to apply the Lagrange multiplier method but got stuck.
For example, $x_1 = [0, 0.1, 0.2, 0.7]$, $x_2 = [0.1, 0, 0.1, 0.8]$, ..., and $\mu = [0, 0, 0.5, 0.5]$.
In addition, in my context, $\sum_{j=1}^b x_{ij} = 1$ for every $i$. Though I think this condition is not strictly necessary, it may allow the problem to be converted into another form. In the special case where every $x_i$ is a one-hot vector $[0,0,\ldots,1,\ldots,0,0]$, I solved it and got $\mu_j = \frac{1}{n}\sum_{i=1}^n x_{ij}$, using properties of information entropy.
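To illustrate that one-hot special case: the objective collapses to $\sum_j c_j \log \mu_j$, where $c_j$ counts the rows with a one in position $j$, and Gibbs' inequality gives the maximizer $\mu_j = c_j/n$. A small plain-Python sketch (the data `X` below is a hypothetical one-hot example, not from the question) checks the column-mean solution against random feasible points:

```python
import math
import random

# Hypothetical one-hot rows: the objective becomes sum_j c_j * log(mu_j),
# maximized over the simplex at mu_j = c_j / n (Gibbs' inequality).
X = [[1, 0, 0, 0],
     [0, 0, 1, 0],
     [0, 0, 1, 0],
     [0, 0, 0, 1]]
n, b = len(X), len(X[0])

def loglik(mu):
    """Objective: sum_i log(x_i . mu)."""
    return sum(math.log(sum(xj * mj for xj, mj in zip(x, mu))) for x in X)

# Candidate solution: column means of X.
mu_star = [sum(x[j] for x in X) / n for j in range(b)]

# Sanity check against random points on the simplex (kept strictly
# positive so every dot product stays away from log(0)).
random.seed(0)
for _ in range(100):
    w = [0.01 + random.random() for _ in range(b)]
    s = sum(w)
    mu = [wi / s for wi in w]
    assert loglik(mu_star) >= loglik(mu) - 1e-12

print(mu_star)  # -> [0.25, 0.0, 0.5, 0.25]
```

This only verifies the claim numerically for one data set; the entropy argument in the question is what proves it in general.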
I assume the constraint is $\mu_j \geq 0$ for all $j$. You can reformulate the problem as
$$ \max \sum_{i=1}^n \log( y_i )\\ \text{s.t.}\\ y_i = x_i\cdot \mu \quad i=1,\ldots,n\\ \sum_{j=1}^b \mu_j = 1\\ \mu_j \ge 0 \quad j=1,\ldots,b $$
Assuming the $x_i$ are such that $y_i > 0$ at the optimum, this is a convex optimization problem (a concave objective maximized over a convex feasible set), so you can solve it with a standard solver.
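Besides a general-purpose solver, this particular problem also admits a simple fixed-point scheme: the multiplicative (EM-style) update for mixture proportions, $\mu_j \leftarrow \frac{1}{n}\sum_i \frac{x_{ij}\,\mu_j}{x_i\cdot\mu}$, which stays exactly on the simplex and never decreases the objective. A plain-Python sketch using the two example rows from the question (the iteration count is an arbitrary choice):

```python
import math

# The two example rows x_1, x_2 from the question.
X = [[0.0, 0.1, 0.2, 0.7],
     [0.1, 0.0, 0.1, 0.8]]
n, b = len(X), len(X[0])

def loglik(mu):
    """Objective: sum_i log(x_i . mu)."""
    return sum(math.log(sum(xj * mj for xj, mj in zip(x, mu))) for x in X)

# Multiplicative update: mu_j <- (1/n) * sum_i x_ij * mu_j / (x_i . mu).
# Summing the update over j shows sum_j mu_j stays exactly 1, and the
# iteration is the EM update for mixture weights, so the objective is
# non-decreasing.
mu = [1.0 / b] * b  # start at the uniform distribution (interior point)
for _ in range(300):
    dots = [sum(xj * mj for xj, mj in zip(x, mu)) for x in X]
    mu = [mu[j] * sum(X[i][j] / dots[i] for i in range(n)) / n
          for j in range(b)]

# For this data all the mass flows to the last coordinate.
print(mu, loglik(mu))
```

For larger instances a dedicated convex solver (e.g. an interior-point method) will be faster and comes with stopping criteria, but the update above is easy to verify by hand.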