I am trying to show that the following equation has a unique solution on $[0,1]$: $$ -u''+u=f \quad \text{in } \Omega=(0,1), $$ with the periodic boundary conditions $u(0)=u(1)$ and $u'(0)=u'(1)$.
First, I derived the weak formulation of the equation and obtained $$ a(u,v) = \int_{0}^{1}(u'v'+uv)\,dx $$ and $$ L(v) = \int_{0}^{1}fv\,dx, $$ where $v \in V=\{v \in H^{1}([0,1]) \mid v(0)=v(1)\}$. To apply the Lax–Milgram theorem, I am a little confused about how to show coercivity using the Poincaré inequality. I tried $$ a(v,v)=\int_{0}^{1}(|v'|^2+v^2)\,dx \ge \int_{0}^{1}|v'|^2\,dx \ge \alpha \|v\|_{H^1(\Omega)}^2, $$ but to apply the Poincaré inequality I would need $v \in H_0^1(\Omega)$.
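For reference, a quick sketch of where the weak form comes from: multiplying by $v \in V$ and integrating by parts gives $$ \int_{0}^{1}(-u''+u)v\,dx = \int_{0}^{1}(u'v'+uv)\,dx - \big[u'v\big]_{0}^{1} = \int_{0}^{1}fv\,dx, $$ and the boundary term $\big[u'v\big]_{0}^{1} = u'(1)v(1)-u'(0)v(0)$ vanishes because $u'(0)=u'(1)$ and $v(0)=v(1)$.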
Can anyone let me know how to justify this step?
You don't need Poincaré here: $a(v,v)$ is already the square of the $H^1(0,1)$ norm. There is no reason to drop the $v^2$ term and then try to find a lower bound that includes it again; just leave it there.
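Spelled out, coercivity holds with constant $\alpha = 1$: $$ a(v,v)=\int_{0}^{1}\bigl(|v'|^2+v^2\bigr)\,dx = \|v\|_{H^1(0,1)}^2 \ge 1\cdot\|v\|_{H^1(0,1)}^2 \quad \text{for all } v \in V. $$ Continuity of $a$ follows from the Cauchy–Schwarz inequality, $|a(u,v)| \le \|u\|_{H^1}\|v\|_{H^1}$, and $L$ is bounded since $|L(v)| \le \|f\|_{L^2}\|v\|_{L^2} \le \|f\|_{L^2}\|v\|_{H^1}$, so Lax–Milgram applies directly on the closed subspace $V \subset H^1(0,1)$.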