$\sum_{i=1}^{n}(c_i\log_2 x_i)$ is maximized when $x_i = \frac{c_i}{\sum_j c_j}$


Let $x_1>0, x_2>0, \ldots, x_n>0$ with $\sum_i x_i = 1$, and let $c_i > 0$ for all $i$. Prove that

$$ \sum_{i=1}^{n}(c_i\log_2 x_i) $$

is maximized when $$x_i = \frac{c_i}{\sum_j c_j}.$$

This looks like an application of Jensen's inequality, but I could only get this far:

Let $f(x)=\log_2(x)$ and $C=\sum_i c_i$, and set $d_i = c_i/C$ so that the weights $d_i$ lie in $(0,1]$ and sum to $1$. Then by Jensen's inequality (concavity of $f$):

$$ C\sum_i d_i f(x_i)\leq Cf\left(\sum_i d_i x_i\right) $$

But the right-hand side still contains the $x_i$, so this does not give the constant upper bound I need.



Best answer (Lagrange multipliers):

We define the objective function as $$ f(x) = c_1 \log_2 x_1 + \cdots + c_n \log_2 x_n $$ where $c_i > 0$ for all $i = 1, 2, \ldots, n$.

We define the constraint function as $$ g(x) = x_1 + x_2 + \cdots + x_n - 1 = 0. $$

We form the Lagrangian function as $$ L(x, \lambda) = f(x) - \lambda g(x) = \left[ c_1 \log_2 x_1 + \cdots + c_n \log_2 x_n \right] - \lambda [x_1 + x_2 + \cdots + x_n - 1] $$

The Kuhn–Tucker (here simply Lagrange, since the constraint is an equality) necessary conditions for optimality are:

(1) ${\partial L \over \partial x_i} = 0$ for $i = 1, 2, \ldots, n$.

(2) ${\partial L \over \partial \lambda} = 0$.

From (1), we get a set of $n$ equations given by $$ {c_i \over x_i \ln 2} - \lambda (1) = 0 \ \ \mbox{for} \ \ i = 1, 2, \ldots, n $$ or $$ x_i = {c_i \over \lambda \ln 2} \ \ \mbox{for} \ \ i = 1, 2, \ldots, n \tag{*} $$

From (2), we get $$ x_1+ x_2 + \ldots + x_n = 1 $$

Using (*), we get $$ \left( {c_1 \over \lambda \ln 2} + \cdots + {c_n \over \lambda \ln 2} \right) = 1 $$

Hence, we must have $$ {1 \over \lambda \ln 2} \sum\limits_{i = 1}^n c_i = 1 $$

Hence, $$ \lambda = {1 \over \ln 2} \sum\limits_{i = 1}^n c_i \tag{**} $$

Substituting the value of $\lambda$ from (**) into (*), we get the optimal solution as $$ x_i = {c_i \over \sum\limits_{j = 1}^n c_j} \ \ \mbox{for} \ \ i = 1, 2, \ldots, n $$
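As a numerical sanity check of this stationary point (a sketch, not part of the original answer; the weights $c$ below are arbitrary assumed values), we can compare the objective at $x_i = c_i/\sum_j c_j$ against many random feasible points on the simplex:

```python
import math
import random

# Hypothetical weights c_i > 0 (any positive values work)
c = [2.0, 1.0, 3.0, 0.5]
C = sum(c)

def objective(x):
    """f(x) = sum_i c_i * log2(x_i)."""
    return sum(ci * math.log2(xi) for ci, xi in zip(c, x))

# Candidate maximizer from the Lagrange conditions: x_i = c_i / C
x_star = [ci / C for ci in c]
best = objective(x_star)

# Compare against random points on the simplex {x_i > 0, sum_i x_i = 1}
random.seed(0)
for _ in range(10000):
    raw = [random.random() + 1e-9 for _ in c]
    s = sum(raw)
    x = [r / s for r in raw]
    assert objective(x) <= best + 1e-12, "found a better feasible point!"

print("x* =", [round(v, 4) for v in x_star])
```

No random feasible point beats the candidate, consistent with the derivation above.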

Second answer (weighted AM-GM):

We want to maximize $$S=\sum c_i \log x_i=\sum \log x_i^{c_i}=\log \prod x_i^{c_i}$$ (natural logarithm; switching from $\log_2$ only rescales the objective by the constant factor $1/\ln 2$). Because $e^x$ is an increasing function, we can instead maximize $$e^S=\prod x_i^{c_i} = \prod \left( \frac{x_i}{c_i}\right)^{c_i} \prod c_i^{c_i}.$$ Let $C = \sum c_i$. By weighted AM-GM with weights $c_i/C$, and using the fact that $\sum x_i=1$: $$\prod \left( \frac{x_i}{c_i}\right)^{c_i/C} \leqslant \frac{1}{C}\sum c_i \cdot \frac{x_i}{c_i} = \frac{\sum x_i}{C} = \frac{1}{C},$$ hence $$e^S \leqslant \frac{\prod c_i^{c_i}}{C^{\,C}},$$ with equality iff all of the $\frac{x_i}{c_i}$ are equal, and so $x_i=\frac{c_i}{\sum_j c_j}$.
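The AM-GM bound above can be checked numerically (a sketch with arbitrary assumed values of $c$; not part of the answer): for random feasible $x$ the quantity $e^S$ stays below $\prod c_i^{c_i}/C^{C}$, and equality is hit at $x_i = c_i/C$.

```python
import math
import random

c = [1.5, 2.5, 1.0]   # assumed positive weights
C = sum(c)

# Right-hand side of the AM-GM bound: prod c_i^{c_i} / C^C
rhs = math.exp(sum(ci * math.log(ci) for ci in c) - C * math.log(C))

random.seed(1)
for _ in range(5000):
    raw = [random.random() + 1e-9 for _ in c]
    s = sum(raw)
    x = [r / s for r in raw]
    eS = math.exp(sum(ci * math.log(xi) for ci, xi in zip(c, x)))
    assert eS <= rhs + 1e-12          # the bound holds

# Equality at x_i = c_i / C
x_star = [ci / C for ci in c]
eS_star = math.exp(sum(ci * math.log(xi) for ci, xi in zip(c, x_star)))
print("bound:", rhs, " value at x*:", eS_star)
```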

Third answer (Jensen's inequality):

Let $S = c_1 + c_2 + \cdots + c_n$ and $$a_i = \frac{c_i}{S}, \, i = 1, 2, \cdots, n.$$ Then $a_i > 0, \forall i$ and $a_1 + a_2 + \cdots + a_n = 1$.

Since $y\mapsto \log_2 y$ is concave, we have \begin{align*} \sum_{i=1}^n c_i\log_2 x_i &= S \sum_{i=1}^n a_i\log_2 x_i\\ &= S \sum_{i=1}^n a_i \left(\log_2 \frac{x_i}{a_i} + \log_2 a_i\right)\\ &= S \sum_{i=1}^n a_i \log_2 \frac{x_i}{a_i} + S \sum_{i=1}^n a_i \log_2 a_i\\ &\le S \log_2 \left(\sum_{i=1}^n a_i \cdot \frac{x_i}{a_i}\right) + S \sum_{i=1}^n a_i \log_2 a_i\\ &= S \log_2 1 + S \sum_{i=1}^n a_i \log_2 a_i\\ &= S \sum_{i=1}^n a_i \log_2 a_i\\ &= \sum_{i=1}^n c_i \log_2 a_i. \end{align*}

Also, when $x_i = a_i, \forall i$, we have $\sum_{i=1}^n c_i\log_2 x_i = \sum_{i=1}^n c_i \log_2 a_i$.

Thus, the maximum of $\sum_{i=1}^n c_i\log_2 x_i$ is $\sum_{i=1}^n c_i \log_2 \frac{c_i}{\sum_j c_j}$, which is attained when $x_i = \frac{c_i}{\sum_j c_j}$ for $i=1, 2, \ldots, n$.
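To illustrate the conclusion, here is a small numerical check (a sketch with assumed values of $c$): the objective evaluated at $a_i = c_i/S$ dominates its value at random feasible points.

```python
import math
import random

c = [0.7, 1.3, 2.0, 1.0]   # assumed positive weights
S = sum(c)
a = [ci / S for ci in c]   # a_i = c_i / S

def f(x):
    # sum_i c_i * log2(x_i)
    return sum(ci * math.log2(xi) for ci, xi in zip(c, x))

max_value = f(a)           # claimed maximum, attained at x = a

random.seed(2)
for _ in range(5000):
    raw = [random.random() + 1e-9 for _ in c]
    s = sum(raw)
    x = [r / s for r in raw]
    assert f(x) <= max_value + 1e-12   # Jensen bound holds

print("maximum:", max_value, "attained at", [round(v, 4) for v in a])
```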