How do I show the following inequality?
Let $\Omega = (0,1) \subset \mathbb{R}$.
Let $u \in C^1(\overline{\Omega})$ with the property that $\int_0^1 u(x)\,dx = 0$.
Then there exists a constant $C>0$ such that $\|u\|_{L^2(\Omega)} \leq C\|u'\|_{L^2(\Omega)}$.
I was able to write $\int_0^1|u(x)|\,dx \leq \|u'\|_{L^2(\Omega)}\|1\|_{L^2(\Omega)} = \|u'\|_{L^2(\Omega)}(1-0)^{1/2} = \|u'\|_{L^2(\Omega)}$ by using the Cauchy–Schwarz inequality, but I could not proceed any further.
I think that I somehow have to use the mean value theorem for integrals to relate $\int_0^1|u(x)|\,dx$ with $\|u\|_{L^2(\Omega)}$.
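One way that idea could be made precise (a sketch, assuming only the standard mean value theorem for integrals for continuous functions): since $\int_0^1 u(x)\,dx = 0$, there is some $x_0 \in (0,1)$ with $u(x_0) = 0$, and then the fundamental theorem of calculus followed by Cauchy–Schwarz gives $$ |u(x)| = \left|\int_{x_0}^{x} u'(t)\,dt\right| \leq \int_0^1 |u'(t)|\,dt \leq \|u'\|_{L^2(\Omega)}, $$ so squaring and integrating over $x$ would already yield the inequality with $C = 1$.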
An easy proof: take $v=u^2$, so that $v'=2uu'$. Since $u$ is continuous and $\int_0^1 u(t)\,dt=0$, there is some $x_0\in(0,1)$ with $u(x_0)=0$, hence $v(x_0)=0$. The fundamental theorem of calculus between $x_0$ and $x$, followed by Cauchy–Schwarz, gives $$ u^2(x)\leq \int_0^1 2|u(t)||u'(t)|\,dt\leq 2 \| u\|_{L^2}\| u'\|_{L^2}. $$ Integrating over $x$ yields $\|u\|_{L^2}^2 \leq 2\|u\|_{L^2}\|u'\|_{L^2}$, and dividing by $\|u\|_{L^2}$ (the case $u\equiv 0$ being trivial) gives the result with $C=2$.
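For anyone who wants to see the bound in action, here is a quick numerical sanity check (a sketch, not part of the proof; the test function $u(x)=\cos(2\pi x)$, which has zero mean on $(0,1)$, and the grid resolution are my own choices):

```python
import numpy as np

# Check ||u||_{L^2(0,1)} <= 2 ||u'||_{L^2(0,1)} for a zero-mean test function.
# u(x) = cos(2*pi*x) satisfies int_0^1 u(x) dx = 0, u'(x) = -2*pi*sin(2*pi*x).
x = np.linspace(0.0, 1.0, 100_001)
u = np.cos(2 * np.pi * x)
du = -2 * np.pi * np.sin(2 * np.pi * x)

# On a uniform grid over (0,1), sqrt(mean(f^2)) approximates the L^2 norm.
l2 = lambda f: np.sqrt(np.mean(f ** 2))

print(f"||u||_L2  = {l2(u):.6f}")   # exact value: sqrt(1/2)  ~ 0.707107
print(f"||u'||_L2 = {l2(du):.6f}")  # exact value: pi*sqrt(2) ~ 4.442883
assert l2(u) <= 2 * l2(du)          # the inequality with C = 2 holds
```

The printed norms match the exact values $\sqrt{1/2}$ and $\pi\sqrt{2}$, so the ratio $\|u\|_{L^2}/\|u'\|_{L^2} = 1/(2\pi)$ sits comfortably below $C=2$.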