I am currently reading a paper in which the following statement is treated as obvious (no proof, not even a partial one, is given).
The statement reads as follows (we want to maximize $A$ with respect to the $X_i$):
'' The maximum value, with respect to the $X_i$, of a quantity of the form $A = \sum Y_i \log(X_i)$, subject to the constraint $\sum X_i = X$, is obtained by putting $X_i = \frac{X}{Y} Y_i$, where $Y = \sum Y_i$.
This can be shown directly from the convexity of the logarithm. ''
I am totally lost here; I cannot even seem to differentiate the sum defining $A$ (perhaps someone could start by showing me the differentiation of that sum).
Under the assumption $Y_i \ge 0$, I think this can be derived from Gibbs' inequality. For a pair of discrete distributions $P=\{p_1,\dots,p_n\}$ and $Q=\{q_1,\dots,q_n\}$, the inequality states that \begin{equation} \sum_{i=1}^{n}{p_i \log(p_i)} \ge \sum_{i=1}^{n}{p_i \log(q_i)}. \end{equation} Note that necessarily $X_i > 0$ for $\log(X_i)$ to be defined. Denoting $x_i = \frac{X_i}{X}$ and $y_i = \frac{Y_i}{Y}$, you have that $\{x_1,\dots,x_n\}$ and $\{y_1,\dots,y_n\}$ are discrete distributions (nonnegative entries summing to $1$), so Gibbs' inequality with $p_i = y_i$ and $q_i = x_i$ gives \begin{equation} \sum_{i=1}^{n}{y_i \log(x_i)} \leq \sum_{i=1}^{n}{y_i \log(y_i)}. \end{equation} To connect this to $A$: since $Y_i \log(X_i) = Y y_i \log(x_i X) = Y y_i \log(x_i) + Y y_i \log(X)$, summing over $i$ gives $A = Y \sum_i y_i \log(x_i) + Y \log(X)$, and the term $Y \log(X)$ is a constant, so maximizing $A$ over the $X_i$ is equivalent to maximizing $\sum_i y_i \log(x_i)$. The upper bound above is achieved by $x_i = y_i$, thus $X_i = x_i \cdot X = \frac{Y_i}{Y} \cdot X$.
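Since the question also asks how the differentiation would go, here is a sketch of the same result via a Lagrange multiplier (an alternative route to the Gibbs argument above; it assumes $X_i > 0$ and $Y_i \ge 0$). Introduce the Lagrangian \begin{equation} \mathcal{L}(X_1,\dots,X_n,\lambda) = \sum_{i=1}^{n} Y_i \log(X_i) - \lambda \Big( \sum_{i=1}^{n} X_i - X \Big). \end{equation} Setting $\frac{\partial \mathcal{L}}{\partial X_j} = \frac{Y_j}{X_j} - \lambda = 0$ gives $X_j = \frac{Y_j}{\lambda}$ for every $j$. Summing over $j$ and using the constraint yields $X = \frac{Y}{\lambda}$, i.e. $\lambda = \frac{Y}{X}$, and therefore $X_j = \frac{X}{Y} Y_j$, which matches the claimed maximizer. (Concavity of the logarithm guarantees this stationary point is the maximum.)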
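As a quick numerical sanity check (a sketch with made-up values for the $Y_i$ and $X$; none of these numbers come from the paper), one can compare the claimed maximizer $X_i = \frac{X}{Y} Y_i$ against many random feasible points:

```python
import math
import random

# Hypothetical example values (assumptions, not from the paper):
Y = [1.0, 2.0, 3.0]   # weights Y_i >= 0
X_total = 5.0         # constraint value: sum(X_i) = X

def A(Xs):
    # The objective A = sum(Y_i * log(X_i))
    return sum(y * math.log(x) for y, x in zip(Y, Xs))

# Claimed maximizer: X_i = (X / sum(Y)) * Y_i
Y_total = sum(Y)
X_star = [X_total / Y_total * y for y in Y]
assert abs(sum(X_star) - X_total) < 1e-12  # feasibility

# Compare against many random feasible points (positive, summing to X_total).
random.seed(0)
def random_feasible():
    w = [random.random() for _ in Y]
    s = sum(w)
    return [X_total * wi / s for wi in w]

best_random = max(A(random_feasible()) for _ in range(10000))
print(A(X_star) >= best_random)  # True: no random feasible point beats X_star
```

This does not prove anything, of course, but it is a cheap way to convince yourself the formula is plausible before working through the inequality.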