Last week I attended the last calculus class of the semester, and several problems were left as exercises. I have solved all of them but one. It reads as follows.
Let $n\ge 2$ and let $\Omega\subset\mathbb{R}^n$ be a nonempty bounded set. Prove that there exists a constant $c>0$ such that for all $u\in L^2(\Omega)$ $$\int_{\Omega} |u|^2 \, dx \le c\int_{\Omega} e^{|u|^2} \, dx.$$ A hint was given to use Hölder's inequality.
I am very confused about this exercise and would like to ask for your help. My attempt so far is to write $$\int_{\Omega} |u|^2 \, dx = \int_{\Omega} e^{\log(|u|^2)} \, dx,$$ but I do not see how to apply Hölder's inequality from here to get the desired result.
Could anyone help?
This is a sketch:
We can use the Maclaurin series of the exponential map: $f(x) = f(0)+\frac{f'(0)}{1!}\,x+\frac{f''(0)}{2!}\,x^2+\cdots$. In this case, $e^y = 1+y+\frac{y^2}{2!}+\cdots$.
Therefore, $$\int_\Omega e^{\lvert{u}\rvert^2} \, dx = \int_\Omega \Big(1+\lvert{u}\rvert^2+\frac{\lvert{u}\rvert^4}{2!}+\cdots\Big)\,dx = \int_\Omega \lvert{u}\rvert^2 \, dx+ \int_\Omega\Big(1+\frac{\lvert{u}\rvert^4}{2!}+\frac{\lvert{u}\rvert^6}{3!}+\cdots\Big)\,dx.$$ Since every term in the second integral is nonnegative, the right-hand side is at least $\int_\Omega \lvert{u}\rvert^2 \, dx$, which proves the desired inequality with $c=1$.
Does it make sense?
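As a quick numerical sanity check of the sketch above (with $c=1$), one can approximate both integrals by a midpoint Riemann sum. The domain $\Omega=(0,1)^2$ and the test function $u(x,y)=xy$ below are my own illustrative choices, not part of the original problem:

```python
import math

# Midpoint Riemann sums on Omega = (0,1)^2 for u(x, y) = x*y
# (an arbitrary bounded test function, chosen for illustration).
N = 200          # grid resolution per axis
h = 1.0 / N      # cell width

lhs = 0.0        # approximates the integral of |u|^2 over Omega
rhs = 0.0        # approximates the integral of e^{|u|^2} over Omega
for i in range(N):
    for j in range(N):
        x = (i + 0.5) * h
        y = (j + 0.5) * h
        u2 = (x * y) ** 2              # |u|^2 at the cell midpoint
        lhs += u2 * h * h
        rhs += math.exp(u2) * h * h

print(lhs, rhs)
# The pointwise bound t <= e^t for t >= 0 guarantees lhs <= rhs.
assert lhs <= rhs
```

Of course this only checks one example; the series argument (or simply the pointwise inequality $t \le e^t$) gives the general statement.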