I'm trying to solve this boundary value problem and need help.
Let $a,b,f \in L^{\infty}((0,1))$. Assume there exists $\gamma > 0$ such that $a(x) \geq \gamma$ for almost every $x \in (0,1)$, and solve $$a(x)u''(x) + b(x)u'(x) = f(x) \quad\forall x \in (0,1)$$ $$u(0)=0,\quad u(1)=1$$ Note: You should be able to use the identity $h''-g'h' = e^g (e^{-g} h')'$.
How should I start? I think the identity is meant to be applied right at the beginning, but I don't see how, since the equation isn't in that form: the identity has no coefficient in front of $h''$, while my equation has $a(x)$ multiplying $u''$.
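For what it's worth, here is as far as I got (this is my own guess at the matching; the choice of $g$ below is not from the problem statement). Since $a(x) \geq \gamma > 0$ almost everywhere, I can divide the equation by $a(x)$:
$$u''(x) + \frac{b(x)}{a(x)}\,u'(x) = \frac{f(x)}{a(x)}.$$
The left side looks like $h'' - g'h'$ with $h = u$, provided $g' = -b/a$, so I would take
$$g(x) = -\int_0^x \frac{b(t)}{a(t)}\,dt.$$
If that is right, the identity turns the equation into
$$\left(e^{-g(x)}\,u'(x)\right)' = e^{-g(x)}\,\frac{f(x)}{a(x)},$$
which I could then integrate twice, using $u(0)=0$ and $u(1)=1$ to fix the two constants of integration. Is this the intended approach?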