How do I show that the arithmetic mean is greater than or equal to the geometric mean, given the non-negativity of the KL divergence?
i.e. Given $D(p\|q)=\sum_i p_i \log\left(\frac{p_i}{q_i}\right)\geq 0$, show that $\prod_{i=1}^n x_i^{a_i}\leq\sum_i a_ix_i$ for all $a_i\geq 0$ such that $\sum_i a_i=1$.
I tried substituting $p_i=a_i$ and $q_i=x_i$, but I am only able to get the G.M. on one side, not the A.M. on the other.
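Before digging into the proof, a quick numeric sanity check of the weighted AM-GM inequality itself (a minimal sketch with arbitrary random data, not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.1, 5.0, size=6)   # positive values x_i
a = rng.uniform(size=6)
a /= a.sum()                        # weights a_i >= 0 summing to 1

gm = np.prod(x ** a)                # weighted geometric mean
am = np.sum(a * x)                  # weighted arithmetic mean
print(gm <= am)                     # the claimed inequality
```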
Let's see. Taking logarithms of the target inequality, the LHS becomes $$L=\log \prod_{i=1}^n x_i^{a_i}=\sum a_i \log(x_i)$$
How can this be related to the KL divergence? $a_i$ is (can be considered as) a probability mass function, but $x_i$ or $a_i x_i$ is not. So let's normalize: define $Y= \sum a_i x_i$ and $y_i = a_i x_i /Y$, so that $y_i$ is a pmf.
Then let's add the "desired" factors and see where this leads us:
$$L=\sum a_i \log\left(\frac{x_i}{Y} \frac{ a_i}{a_i} Y\right)=\sum a_i \log(y_i/a_i) + \sum a_i \log(Y) \tag 1 $$
The first term is close to what we need, but we must invert the fraction (which flips the sign of the log):
$$ \sum a_i \log(a_i/y_i) = \sum a_i \log(Y) -L \tag2$$
At last we can apply the KL divergence: the LHS of $(2)$ is $D(a\|y)$, which is non-negative, then...
(I hope you can follow up from here.)
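The derivation above can be checked numerically: with random positive $x_i$ and a random probability vector $a$, identity $(2)$ says $D(a\|y)=\sum a_i\log(Y)-L=\log Y - L$ (using $\sum a_i = 1$), and non-negativity of the KL divergence then gives $L \le \log Y$, i.e. GM $\le$ AM. A minimal sketch, with the variable names taken from the derivation:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0.1, 5.0, size=5)   # positive values x_i
a = rng.dirichlet(np.ones(5))       # probability vector a_i

Y = np.sum(a * x)                   # weighted arithmetic mean
y = a * x / Y                       # normalized pmf y_i = a_i x_i / Y
L = np.sum(a * np.log(x))           # log of the weighted geometric mean

kl = np.sum(a * np.log(a / y))      # D(a || y)
# Identity (2): D(a||y) = log(Y) - L, and D(a||y) >= 0 forces L <= log(Y)
assert np.isclose(kl, np.log(Y) - L)
assert kl >= 0
```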