Let $\phi \in L^2(\Omega)$ with $\Omega = (0,1)$, and let $u_{\phi}$ be the solution of the boundary value problem
$$ -u_{\phi}'' + cu_{\phi}' + u_{\phi} = \phi \quad \text{in } \Omega, \qquad u_{\phi}(0) = u_{\phi}(1) = 0. $$
Suppose $c = 0$. How can I show that there exists a constant $C > 0$ such that $\|u_{\phi}\|_{H^2(\Omega)} \leq C \|\phi\|_{L^2(\Omega)}$?
My idea: first derive an inequality for $\|u_{\phi}\|_{H^1(\Omega)}$ instead of $\|u_{\phi}\|_{H^2(\Omega)}$, by multiplying the differential equation by $u_{\phi}$ and integrating over $\Omega$. But I'm not sure how to go further...
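Carrying this out for $c = 0$ (writing $u$ for $u_{\phi}$): multiplying by $u$, integrating over $\Omega$, and integrating by parts using $u(0) = u(1) = 0$ gives
$$ \int_\Omega (u')^2 + \int_\Omega u^2 = \int_\Omega \phi u \le \|\phi\|_{L^2}\|u\|_{L^2} \le \|\phi\|_{L^2}\|u\|_{H^1}, $$
so $\|u\|_{H^1} \le \|\phi\|_{L^2}$. But I don't see how this controls $u''$.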
The problem is how to control the second derivative, i.e. how to show $\|u''\|_{L^2}\le C\|\phi\|_{L^2}$ (writing $u$ for $u_\phi$). For the simpler equation $-u''=\phi$, the estimate comes from squaring the equation and integrating: $\int (u'')^2=\int \phi^2$. We can do the same for your equation $-u''+u=\phi$ (the case $c=0$), except now we have to account for cross terms: $$ \int \big((u'')^2-2u''u+u^2\big)=\int\phi^2, $$ or $$ \int (u'')^2=\int\phi^2+\int 2u''u-\int u^2. $$ The last term is harmless, since $-\int u^2\le 0$ (you could also argue $\int u^2\le \|u\|_{H^1}^2\le C\|\phi\|_{L^2}^2$). So the main question is how to handle $\int u''u$. There are two ways to deal with this term:
1. Since $u(0)=u(1)=0$, integrating by parts gives $\int u''u=-\int (u')^2\le 0$, so $\int (u'')^2\le\int\phi^2$, i.e. $\|u''\|_{L^2}\le\|\phi\|_{L^2}$.
2. It's easy to see that $2ab\le a^2+b^2$, so replacing $(a,b)$ with $(\epsilon^{1/2} a,\epsilon^{-1/2}b)$ gives $2ab\le \epsilon\,a^2+\frac{b^2}{\epsilon}$ for any $\epsilon>0$ (Young's inequality). Therefore, $$ \int 2u''u\le \epsilon\int (u'')^2+\frac{1}{\epsilon}\int u^2. $$ Hence $$ (1-\epsilon)\int (u'')^2\le \int\phi^2+\frac{1}{\epsilon}\int u^2 \quad \text{for all } \epsilon\in(0,1), $$ and taking, say, $\epsilon=\tfrac12$ together with $\int u^2\le C\|\phi\|_{L^2}^2$ gives the result.
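As a quick numerical sanity check (not a proof), you can discretize $-u''+u=\phi$ with a standard three-point finite-difference stencil and verify that $\|u''\|_{L^2} \le \|\phi\|_{L^2}$ holds for a few right-hand sides. This sketch assumes NumPy; the grid size and test functions are arbitrary choices:

```python
import numpy as np

# Finite-difference sanity check (not a proof) of ||u''||_{L^2} <= ||phi||_{L^2}
# for -u'' + u = phi on (0, 1) with u(0) = u(1) = 0.
n = 1000                     # number of interior grid points (illustrative choice)
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)

# Tridiagonal matrix approximating -d^2/dx^2 with Dirichlet boundary conditions.
D2 = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2

ratios = []
for phi in (np.sin(3 * np.pi * x), np.sign(x - 0.5), x * (1.0 - x)):
    u = np.linalg.solve(D2 + np.eye(n), phi)  # discrete version of -u'' + u = phi
    u_pp = -(D2 @ u)                          # discrete second derivative u''
    ratios.append(np.linalg.norm(u_pp) / np.linalg.norm(phi))

print(ratios)  # each ratio should be <= 1
```

The discrete analogue of the integration-by-parts argument holds exactly here: $D_2$ is symmetric positive definite, so $\|\phi\|^2 = \|D_2 u\|^2 + 2u^T D_2 u + \|u\|^2 \ge \|D_2 u\|^2$.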