I am trying to derive a weak formulation of the boundary value problem
$-\epsilon u'' + bu' = f$ for $x \in (0, 1)$, $u(0) = u(1) = 0$,
where $\epsilon \in \mathbb{R}$ with $\epsilon > 0$, $b \in \mathbb{R}$, and $f \in L^2(0, 1)$. I then want to prove
that the problem admits a unique solution using the Lax–Milgram lemma.
So far I have determined:
$$
V=\{v\in H^1(\Omega) \mid v(0)=v(1)=0\},\qquad
a(u,v)=\int_{\Omega} Du\,(bv+\epsilon Dv)\,dx,\qquad
l(v)=\int_{\Omega} fv\,dx,
$$
where $\Omega = (0,1)$.
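(For completeness: this form comes from multiplying the equation by a test function $v \in V$ and integrating by parts, where the boundary term vanishes because $v(0)=v(1)=0$:
$$
\int_0^1 (-\epsilon u'' + b u')\,v\,dx
= \epsilon\int_0^1 u'v'\,dx - \epsilon\,[u'v]_0^1 + b\int_0^1 u'v\,dx
= \int_{\Omega} Du\,(bv+\epsilon Dv)\,dx.)
$$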
However, I'm struggling to show that $a(\cdot\,,\cdot)$ is bounded. Here's what I've done:
By taking the constants out and applying the Cauchy–Schwarz inequality (note $b$ may be negative, so $|b|$ appears), I found
$$|a(u,v)|\leq \|Du\|_{L^2(\Omega)}\left(|b|\,\|v\|_{L^2(\Omega)}+\epsilon \|Dv\|_{L^2(\Omega)}\right).$$
However, I want $|a(u,v)|\leq C\|u\|_{H^1(\Omega)}\|v\|_{H^1(\Omega)}$. Is there something I can use to achieve such a result, or is what I've found wrong? I'm happy to post my working.
Well, you can bound both terms. First,
$$ \|Du\|_{L^2(\Omega)} \leq \sqrt{\|u\|_{L^2(\Omega)}^2+\|Du\|_{L^2(\Omega)}^2} = \|u\|_{H^1(\Omega)}. $$
Second,
$$ |b|\,\|v\|_{L^2(\Omega)}+\epsilon \|Dv\|_{L^2(\Omega)} \leq C \left(\|v\|_{L^2(\Omega)}+ \|Dv\|_{L^2(\Omega)}\right), \qquad C = \max\{|b|,\epsilon\}, $$
and we also have
$$ \|v\|_{L^2(\Omega)}+ \|Dv\|_{L^2(\Omega)} \leq 2 \max\{\|v\|_{L^2(\Omega)},\|Dv\|_{L^2(\Omega)}\} \leq 2 \sqrt{\|v\|_{L^2(\Omega)}^2+\|Dv\|_{L^2(\Omega)}^2} = 2\|v\|_{H^1(\Omega)}. $$
Therefore
$$ |a(u,v)| \leq 2C \|u\|_{H^1(\Omega)} \|v\|_{H^1(\Omega)}. $$
Note that since $\epsilon >0$ we know that $C>0$.
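As a quick numerical sanity check of the final bound (not part of the proof), one can sample random piecewise-linear functions on $(0,1)$ vanishing at the endpoints and verify that the ratio $|a(u,v)| / (\|u\|_{H^1}\|v\|_{H^1})$ never exceeds $2C$. The values $b = 3$, $\epsilon = 0.5$ and the grid size are arbitrary choices of mine, and `a_form`, `h1_norm` are hypothetical helper names:

```python
import numpy as np

# Sanity check of |a(u,v)| <= 2C ||u||_H1 ||v||_H1 on piecewise-linear
# functions vanishing at x = 0 and x = 1. Sample parameters (assumptions):
b, eps = 3.0, 0.5
C = max(abs(b), eps)

n = 200          # number of cells on (0, 1)
h = 1.0 / n
rng = np.random.default_rng(0)

def a_form(u, v):
    """a(u,v) = int Du*(b v + eps Dv) dx; exact for piecewise linears,
    since Du, Dv are piecewise constant and the cell average of v is its midpoint value."""
    Du, Dv = np.diff(u) / h, np.diff(v) / h
    vmid = 0.5 * (v[:-1] + v[1:])
    return np.sum(Du * (b * vmid + eps * Dv)) * h

def h1_norm(u):
    """||u||_H1 with the exact L2 integral of a piecewise-linear function."""
    Du = np.diff(u) / h
    l2sq = (h / 3.0) * np.sum(u[:-1]**2 + u[:-1] * u[1:] + u[1:]**2)
    return np.sqrt(l2sq + np.sum(Du**2) * h)

worst = 0.0
for _ in range(1000):
    u = rng.standard_normal(n + 1); u[0] = u[-1] = 0.0
    v = rng.standard_normal(n + 1); v[0] = v[-1] = 0.0
    worst = max(worst, abs(a_form(u, v)) / (h1_norm(u) * h1_norm(v)))

print(f"worst observed ratio: {worst:.4f}, bound 2C = {2 * C}")
```

Since the quadrature here is exact for piecewise-linear trial functions, the observed ratio must stay below $2C$; it typically sits well below, because the bound is not sharp.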