I’m trying to prove a version of Jensen’s inequality, but I end up with the wrong result. I’d appreciate any help or comments.
The theorem states: let $\varphi\colon\mathbb{R}^k\to\mathbb{R}$ be convex. Then for any function $g\in L^1(\Omega,\mathbb{R}^k)$, where $\Omega$ is a bounded open subset of $\mathbb{R}^n$, we have $\varphi\left(\int_\Omega g(x)\,d\lambda\right)\le\frac{1}{\lambda(\Omega)}\int_\Omega\varphi(g(x))\,d\lambda$.
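Before starting, here is a quick numerical sanity check of the statement as written (a hypothetical example, not part of my attempted proof): take $\varphi(t)=t^2$ and $g(x)=x$ on $\Omega=(0,2)$, so $\lambda(\Omega)=2$.

```python
# Sanity check of the inequality as stated (hypothetical example):
# phi(t) = t^2 is convex, g(x) = x on Omega = (0, 2), so lambda(Omega) = 2.
phi = lambda t: t * t
g = lambda t: t
a, b, n = 0.0, 2.0, 100_000
M = b - a                                    # lambda(Omega) = 2
dx = M / n
xs = [a + (i + 0.5) * dx for i in range(n)]  # midpoint-rule grid on Omega

int_g = sum(g(x) for x in xs) * dx           # integral of g       ~ 2
int_phi_g = sum(phi(g(x)) for x in xs) * dx  # integral of phi(g)  ~ 8/3

lhs_as_stated = phi(int_g)                   # phi(2) = 4
rhs = int_phi_g / M                          # ~ 4/3
print(lhs_as_stated <= rhs)                  # False: fails as stated

lhs_normalized = phi(int_g / M)              # phi(1) = 1
print(lhs_normalized <= rhs)                 # True: normalized form holds
```

So unless $\lambda(\Omega)=1$, the statement as written already looks suspicious to me, but let me describe my attempt anyway.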
My reasoning is as follows:
(1) Since $\varphi$ is convex, the subdifferential $\partial\varphi(x_0)$ is a nonempty subset of $(\mathbb{R}^k)^*\cong\mathbb{R}^k$ for every $x_0\in\mathbb{R}^k$.
(2) Let $\xi\in\partial\varphi(x_0)$; then $\varphi(y)-\varphi(x_0)\ge\langle\xi,y-x_0\rangle$ for all $y\in\mathbb{R}^k$, and in particular (taking $y=g(x)$) we get $\varphi(g(x))-\varphi(x_0)\ge\langle\xi,g(x)-x_0\rangle$.
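As a concrete check of this subgradient inequality, take $\varphi(y)=\|y\|^2$, whose only subgradient at $x_0$ is $\xi=2x_0$:

$$\varphi(y)-\varphi(x_0)-\langle 2x_0,\,y-x_0\rangle=\|y\|^2-\|x_0\|^2-2\langle x_0,y\rangle+2\|x_0\|^2=\|y-x_0\|^2\ge 0.$$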
(3) Now it seems it's only a matter of choosing the proper $x_0$ and integrating both sides over $\Omega$. Denoting $M:=\int_\Omega d\lambda=\lambda(\Omega)$ and choosing $x_0:=\frac{1}{M}\int_\Omega g\,d\lambda$ makes the right-hand side integrate to zero, $\int_\Omega\langle\xi,g(x)-x_0\rangle\,d\lambda=0$, which seems like the best choice here.
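Spelling this out: since $\xi$ is a fixed vector, linearity of the integral gives

$$\int_\Omega\langle\xi,\,g(x)-x_0\rangle\,d\lambda=\Big\langle\xi,\ \int_\Omega g\,d\lambda-Mx_0\Big\rangle=\langle\xi,\ Mx_0-Mx_0\rangle=0.$$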
(4) Left-hand side: $\int_\Omega\left(\varphi(g)-\varphi(x_0)\right)d\lambda=\int_\Omega\varphi(g)\,d\lambda-\int_\Omega\varphi(x_0)\,d\lambda=\int_\Omega\varphi(g)\,d\lambda-\varphi(x_0)M$.
(5) Combining (3) and (4), $\int_\Omega\varphi(g)\,d\lambda-\varphi(x_0)M\ge 0$, i.e. $\varphi\left(\frac{1}{M}\int_\Omega g\,d\lambda\right)\le\frac{1}{M}\int_\Omega\varphi(g)\,d\lambda$, which looks almost right. Alternatively, plugging $Mg(x)$ in place of $g(x)$ earlier yields $\varphi\left(\int_\Omega g\,d\lambda\right)\le\frac{1}{M}\int_\Omega\varphi(Mg)\,d\lambda$.
So what am I missing? I thought I might reuse the convexity of $\varphi$ to convert the inequalities I can get into the inequality I want, but without additional assumptions on $\Omega$ and $M$ that would yield some superadditivity, I don't see how to progress any further.