Let $x_1>0, x_2>0, \dots, x_n>0$ with $\sum_i x_i=1$, and let $c_i>0$. Prove that
$$ \sum_{i=1}^{n} c_i\log_2 x_i $$
is maximized when $$x_i = \frac{c_i}{\sum_i c_i}.$$
This looks similar to Jensen's inequality, but I could only get this far:
let $f(x)=\log_2 x$, $C=\sum_i c_i$, and $d_i=c_i/C$, so that the weights $d_i$ lie in $(0,1]$ and sum to $1$. Then
$$ C\sum_i d_i f(x_i)\leq Cf\left(\sum_i d_i x_i\right) $$
but the right-hand side of $\leq$ still contains the $x_i$, so this does not finish the proof.
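For what it's worth, the Jensen attempt can be finished by applying the inequality to the ratios $x_i/d_i$ rather than to the $x_i$ themselves; a sketch of the missing step:

```latex
% Weighted Jensen for the concave function \log_2, with weights d_i = c_i/C:
\sum_i d_i \log_2\frac{x_i}{d_i}
  \;\le\; \log_2\!\Big(\sum_i d_i\cdot\frac{x_i}{d_i}\Big)
  \;=\; \log_2\!\Big(\sum_i x_i\Big) \;=\; \log_2 1 \;=\; 0.
% Expanding the left-hand side gives
%   \sum_i d_i \log_2 x_i \le \sum_i d_i \log_2 d_i,
% and multiplying through by C yields
%   \sum_i c_i \log_2 x_i \le \sum_i c_i \log_2 \frac{c_i}{C},
% with equality iff all ratios x_i/d_i are equal, which together with
% \sum_i x_i = \sum_i d_i = 1 forces x_i = d_i = c_i/C.
```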
We define the objective function as $$ f(x) = c_1 \log_2 x_1 + \cdots + c_n \log_2 x_n $$ where $c_i > 0$ for all $i = 1, 2, \ldots, n$.
We define the constraint function as $$ g(x) = x_1 + x_2 + \cdots + x_n - 1 = 0. $$
We form the Lagrangian function as $$ L(x, \lambda) = f(x) - \lambda g(x) = \left[ c_1 \log_2 x_1 + \cdots + c_n \log_2 x_n \right] - \lambda [x_1 + x_2 + \cdots + x_n - 1] $$
The first-order (Lagrange) necessary conditions for optimality are:
(1) ${\partial L \over \partial x_i} = 0$ for $i = 1, 2, \ldots, n$.
(2) ${\partial L \over \partial \lambda} = 0$.
From (1), we get a set of $n$ equations given by $$ {c_i \over x_i \ln 2} - \lambda (1) = 0 \ \ \mbox{for} \ \ i = 1, 2, \ldots, n $$ or $$ x_i = {c_i \over \lambda \ln 2} \ \ \mbox{for} \ \ i = 1, 2, \ldots, n \tag{*} $$
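The derivative used in (1) can be sanity-checked numerically; this sketch (with made-up values for $c_i$ and $x_i$) compares $\frac{\partial}{\partial x_i}\, c_i\log_2 x_i = \frac{c_i}{x_i \ln 2}$ against a central finite difference:

```python
import math

def d_log2_term(c_i, x_i, h=1e-7):
    """Central finite difference of c_i * log2(x_i) with respect to x_i."""
    return (c_i * math.log2(x_i + h) - c_i * math.log2(x_i - h)) / (2 * h)

c_i, x_i = 3.0, 0.4  # hypothetical values for illustration
analytic = c_i / (x_i * math.log(2))  # the derivative claimed in (1)
numeric = d_log2_term(c_i, x_i)
assert abs(analytic - numeric) < 1e-5
```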
From (2), we get $$ x_1+ x_2 + \ldots + x_n = 1 $$
Using (*), we get $$ {c_1 \over \lambda \ln 2} + \cdots + {c_n \over \lambda \ln 2} = 1 $$
Hence, we must have $$ {1 \over \lambda \ln 2} \sum\limits_{i = 1}^n c_i = 1 $$
Hence, $$ \lambda = {1 \over \ln 2} \sum\limits_{i = 1}^n c_i \tag{**} $$
Substituting the value of $\lambda$ from (**) into (*), we get the optimal solution as $$ x_i = {c_i \over \sum\limits_{j = 1}^n c_j} \ \ \mbox{for} \ \ i = 1, 2, \ldots, n $$ Since $f$ is strictly concave and the constraint is linear, this stationary point is indeed the global maximum.
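The closed-form optimum can also be checked numerically. This sketch (with arbitrary example weights $c$) evaluates the objective at $x_i^\* = c_i/\sum_j c_j$ and confirms it beats many random feasible points:

```python
import math
import random

def objective(c, x):
    """The function being maximized: sum of c_i * log2(x_i)."""
    return sum(ci * math.log2(xi) for ci, xi in zip(c, x))

# Example weights (any positive values would do).
c = [1.0, 2.0, 5.0]
C = sum(c)

# Claimed optimum: x_i = c_i / sum_j c_j.
x_star = [ci / C for ci in c]
best = objective(c, x_star)

# Compare against random feasible points (positive entries summing to 1).
random.seed(0)
for _ in range(10_000):
    raw = [random.random() for _ in c]
    s = sum(raw)
    x = [r / s for r in raw]
    assert objective(c, x) <= best + 1e-12
```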