A simple application of Jensen's inequality


I have been having trouble proving the following inequality. Given $\{\phi_k\}_{k=1}^K$ such that $\sum_k \phi_k = 1$, we have \begin{align} \frac1{\sum_k x_k} \leq \sum_k \frac{\phi_k^2}{x_k}, \end{align} for all $x_k >0$, $k=1,\dots,K$.

This inequality is often used in the majorization-minimization algorithm for non-negative matrix factorization models. My notes (from $5$ years ago) say that it can be proven using Jensen's inequality, but at the moment I cannot figure out how. It seems quite elementary, though.

Briefly, my question is: how can we prove this inequality using Jensen's inequality (or any other approach)?
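For intuition, the claim can be checked numerically on random data. The sketch below (arbitrary seed and sizes, illustrative only, not part of any proof) normalizes a random $\phi$ so that $\sum_k \phi_k = 1$, draws positive $x_k$, and compares the two sides:

```python
# Numerical sanity check of 1/sum(x) <= sum(phi_k^2 / x_k)
# for a random probability vector phi and random positive x.
# (Illustrative only; seed and sizes are arbitrary choices.)
import random

random.seed(0)
K = 5
phi = [random.random() for _ in range(K)]
s = sum(phi)
phi = [f / s for f in phi]                 # normalize so sum(phi) == 1
x = [random.uniform(0.1, 10.0) for _ in range(K)]

lhs = 1.0 / sum(x)
rhs = sum(f * f / xk for f, xk in zip(phi, x))
assert lhs <= rhs + 1e-12                  # the claimed inequality
```

Equality holds exactly when $\phi_k \propto x_k$; for generic random data the inequality is strict.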


Two solutions are given below.

Best answer

Note that, since $x_k>0$ for all $k$, setting $p_k \stackrel{\rm def}{=} \frac{x_k}{\sum_j x_j}$, the $(p_k)_k$ form a probability distribution.

Since $x\mapsto x^2$ is convex, by Jensen's inequality we have $$ \left(\sum_{k} p_k \frac{\phi_k}{x_k}\right)^2 = \mathbb{E}_{k\sim p}\left[\frac{\phi_k}{x_k}\right]^2 \operatorname*{\leq}_{\rm Jensen} \mathbb{E}_{k\sim p}\left[\frac{\phi_k^2}{x_k^2}\right] = \sum_{k} p_k \frac{\phi_k^2}{x_k^2}, $$ i.e., substituting $p_k = x_k/\sum_j x_j$ and multiplying through by $\sum_k x_k$, $$ \frac{1}{\sum_k x_k}=\frac{\left(\sum_{k} \phi_k\right)^2}{\sum_k x_k} \leq \sum_{k} x_k \frac{\phi_k^2}{x_k^2} = \sum_k \frac{\phi_k^2}{x_k}, $$ as sought (the first equality uses $\sum_{k} \phi_k=1$).
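The two steps of this argument can also be checked numerically. The sketch below (random data, illustrative only) verifies both the identity $\mathbb{E}_{k\sim p}[\phi_k/x_k] = 1/\sum_k x_k$ and the Jensen step $\mathbb{E}[Y]^2 \leq \mathbb{E}[Y^2]$ for $Y = \phi_k/x_k$ under $p$:

```python
# Numerically verify the Jensen step used above.
# (Illustrative only; seed and dimensions are arbitrary choices.)
import random

random.seed(1)
K = 4
x = [random.uniform(0.5, 5.0) for _ in range(K)]
phi = [random.random() for _ in range(K)]
s = sum(phi)
phi = [f / s for f in phi]               # sum(phi) == 1

total = sum(x)
p = [xk / total for xk in x]             # p_k = x_k / sum_j x_j

# E_p[phi_k / x_k] collapses to 1 / sum_k x_k because sum(phi) == 1
mean = sum(pk * fk / xk for pk, fk, xk in zip(p, phi, x))
second = sum(pk * (fk / xk) ** 2 for pk, fk, xk in zip(p, phi, x))

assert abs(mean - 1.0 / total) < 1e-9    # the identity
assert mean ** 2 <= second + 1e-12       # Jensen: E[Y]^2 <= E[Y^2]
```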

A second answer

By the Cauchy–Schwarz inequality, $$ 1 = \left(\sum \phi_k\right)^2 = \left(\sum \frac{\phi_k}{\sqrt{x_k}} \sqrt{x_k} \right)^2\leq \sum \frac{\phi_k^2}{x_k} \sum x_k. $$ Dividing both sides by $\sum x_k > 0$ gives the claimed inequality.
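This decomposition can be sanity-checked numerically as well. In the sketch below (random data, illustrative only), with $a_k = \phi_k/\sqrt{x_k}$ and $b_k = \sqrt{x_k}$, the squared inner product equals $1$ and is bounded by the product of squared norms:

```python
# Sanity-check the Cauchy-Schwarz decomposition.
# (Illustrative only; random data, arbitrary seed.)
import math
import random

random.seed(2)
K = 6
phi = [random.random() for _ in range(K)]
s = sum(phi)
phi = [f / s for f in phi]               # sum(phi) == 1
x = [random.uniform(0.1, 10.0) for _ in range(K)]

a = [f / math.sqrt(xk) for f, xk in zip(phi, x)]   # a_k = phi_k / sqrt(x_k)
b = [math.sqrt(xk) for xk in x]                    # b_k = sqrt(x_k)

inner_sq = sum(ai * bi for ai, bi in zip(a, b)) ** 2         # (sum phi)^2 = 1
bound = sum(ai * ai for ai in a) * sum(bi * bi for bi in b)  # sum(phi^2/x) * sum(x)

assert abs(inner_sq - 1.0) < 1e-9
assert inner_sq <= bound + 1e-12         # hence 1/sum(x) <= sum(phi^2/x)
```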