Variation of Jensen's Inequality

I just read a variation of Jensen's Inequality which states:

If $f: \mathbb{R} \rightarrow \mathbb{R}$ is a convex function and $\phi \in \mathcal{L}^1(\mathbb{R}^n)$ with $\phi \geq 0$ and $\int_{\mathbb{R}^n} \phi \, dx = 1$, then:

$$ f \Big( \int_{\mathbb{R}^n} u(x)\phi(x)dx \Big) \leq \int_{\mathbb{R}^n} f(u(x))\phi(x)dx $$

for any $u: \mathbb{R}^n \rightarrow \mathbb{R}$ for which the integral makes sense. [1]

They don't offer a proof, and I was not able to work one out myself. Does anyone have a proof based on the "original" Jensen's inequality?

[1] http://www.win.tue.nl/~mpeletie/Research/Papers/PeletierPlanqueRoeger07.pdf
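As a quick numerical sanity check of the stated inequality (not from the paper; all choices here are illustrative), take $n = 1$, $\phi$ the standard normal density, $u(x) = x + 1$, and the convex $f(t) = t^2$:

```python
import math

# Illustrative 1-D instance: phi is the standard normal density (so it is
# nonnegative and integrates to 1), u(x) = x + 1, and f(t) = t^2 is convex.
def phi(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def u(x):
    return x + 1.0

def f(t):
    return t * t

# Riemann sum over [-8, 8] as a stand-in for the integral over R
# (the Gaussian tails beyond that interval are negligible).
N = 200000
dx = 16 / N
xs = [-8 + k * dx for k in range(N + 1)]

mean_u = sum(u(x) * phi(x) for x in xs) * dx       # ~ integral of u * phi = 1
mean_fu = sum(f(u(x)) * phi(x) for x in xs) * dx   # ~ integral of f(u) * phi = 2

print(f(mean_u) <= mean_fu)  # Jensen: True (1 <= 2)
```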


BEST ANSWER

It is the original Jensen's inequality applied to $\mathbb{R}^n$ endowed with the probability measure $\mu(A):=\int_A\phi(x)\,\mathrm dx$ (the assumptions $\phi \geq 0$ and $\int \phi \, dx = 1$ guarantee that $\mu$ is a probability measure).

ANOTHER ANSWER

Assume, to make things simpler, that $f$ is differentiable.

As $f$ is convex, for each $a$:

$$ f(y) \ge (y-a)f'(a) + f(a) $$

so in particular, as $\phi\ge 0$: $$ \int f(u(x))\phi(x)\, dx \ge \int \left[ (u(x)-a)f'(a) + f(a)\right] \phi(x)\, dx \\= f'(a)\int (u(x)-a) \phi(x)\, dx + f(a) $$

Now choose $a = \int u(x)\phi(x)\, dx$; then $\int (u(x)-a) \phi(x)\, dx = 0$, and the result follows.
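The supporting-line inequality used above can be checked numerically. Here is a small sketch with the illustrative convex choice $f(t) = e^t$ (so $f'(t) = e^t$) and a few sample points, none of which come from the original post:

```python
import math

# Supporting-line (tangent) inequality for a convex differentiable f:
#   f(y) >= (y - a) f'(a) + f(a)   for every y and every a.
# Illustrative choice: f(t) = exp(t), whose derivative is also exp(t).
f = math.exp
fprime = math.exp

a = 0.5
for y in [-2.0, -0.3, 0.5, 1.7, 3.0]:
    tangent = (y - a) * fprime(a) + f(a)
    # small tolerance only to absorb floating-point rounding
    assert f(y) >= tangent - 1e-12

print("supporting-line inequality holds at all sampled points")
```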


In the general case, a convex function on the real line has a right derivative at every point, and you also have the inequality $$ f(y) \ge (y-a) \lim_{h\downarrow 0} \frac{f(a+h)-f(a)}h + f(a), $$ so the proof goes through with $f'(a)$ replaced by the right derivative.
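To illustrate the general case, here is a hypothetical example with the convex but non-differentiable $f(x) = |x|$ at $a = 0$: the one-sided difference quotient settles on the right derivative, and the supporting-line inequality still holds with that slope:

```python
# Right derivative of the convex, non-differentiable f(x) = |x| at a = 0,
# approximated by the one-sided difference quotient (f(a+h) - f(a)) / h.
f = abs
a = 0.0
for h in [1.0, 0.1, 0.01, 0.001]:
    print((f(a + h) - f(a)) / h)  # -> 1.0 for every h: the right derivative is 1

# The supporting-line inequality f(y) >= (y - a) * slope + f(a) with that slope:
slope = 1.0  # right derivative of |x| at 0
for y in [-3.0, -0.5, 0.0, 0.5, 3.0]:
    assert f(y) >= (y - a) * slope + f(a)
```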