If I have some integer $C$, for simplicity $C>100$, and I decompose $C$ into two positive integers $a$ and $b$ with $a+b=C$,
then $\sqrt{a}\ln{a} + \sqrt{b}\ln{b}$ is maximized at $a=b=C/2$.
My question is: how can I show this for the case of decomposing $C$ into $k$ integers, where $2 < k < e^2$? That is, that the even decomposition $x_i = C/k$ maximizes $\sum_{i=1}^k \sqrt{x_i} \ln{x_i}$ over all possible decompositions?
I am not sure how to even approach this question, so any help is appreciated.
This follows from Jensen's inequality and the fact that $f(x)=\sqrt x\ln x$ is concave for $x\ge 1$: its second derivative is $f''(x) = -\dfrac{\ln x}{4x^{3/2}} \le 0$. Hence for any decomposition $x_1+\dots+x_k=C$ with $x_i\ge 1$,
$$\frac{1}{k}\sum_{i=1}^k f(x_i) \le f\!\left(\frac{1}{k}\sum_{i=1}^k x_i\right) = f(C/k),$$
so the even split maximizes the sum.
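A quick numerical sanity check of this claim (the values $C=600$, $k=5$ and the random-decomposition helper are my own illustrative choices, not from the question):

```python
import math
import random

def f(x):
    """The function from the question: sqrt(x) * ln(x)."""
    return math.sqrt(x) * math.log(x)

def score(parts):
    """Sum of f over one decomposition of C."""
    return sum(f(p) for p in parts)

def random_decomposition(C, k):
    """Split C into k positive integers via k-1 random cut points."""
    cuts = sorted(random.sample(range(1, C), k - 1))
    return [b - a for a, b in zip([0] + cuts, cuts + [C])]

C, k = 600, 5                     # illustrative values with k | C
even_score = score([C // k] * k)  # the claimed maximizer x_i = C/k

# Jensen predicts no random decomposition beats the even split.
best_random = max(score(random_decomposition(C, k)) for _ in range(10_000))
print(even_score >= best_random)
```

This only spot-checks the inequality, of course; the Jensen argument above is what proves it for every decomposition.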