I am given $f \in L^2(\Omega)$ and $g \in H^1(\Omega)$. I want to find $u \in H^1(\Omega)$ such that $$-\operatorname{div}(A \nabla u) + \langle b, \nabla u \rangle + cu = f \quad \text{in } \Omega,$$ $$u = g \quad \text{on } \Gamma.$$
In order to find $u$, I thought it would be a good idea to substitute $u = u_0 + g$ and to reformulate the problem as a variational problem for $u_0 \in H^1_0(\Omega)$.
But at the moment I am a bit stuck. Is this "substitution" a good idea?
Yes, this is a good idea.
More precisely, with $u = u_0 + g$ the equation becomes $$-\operatorname{div}(A \nabla u_0) +\langle b, \nabla u_0 \rangle+cu_0=\tilde{f},$$ where $$\tilde{f}=f+\operatorname{div}(A \nabla g) -\langle b, \nabla g \rangle-c g \in H^{-1}(\Omega).$$
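Written out as a sketch (assuming $A$, $b$, $c$ are bounded), testing against $v \in H^1_0(\Omega)$ and integrating by parts (the boundary terms vanish since $v|_\Gamma = 0$) gives the weak form: find $u_0 \in H^1_0(\Omega)$ such that $a(u_0, v) = \ell(v)$ for all $v \in H^1_0(\Omega)$, where
$$a(u_0, v) = \int_\Omega A \nabla u_0 \cdot \nabla v + \langle b, \nabla u_0 \rangle \, v + c \, u_0 v \, dx,$$
$$\ell(v) = \int_\Omega f v \, dx - \int_\Omega A \nabla g \cdot \nabla v + \langle b, \nabla g \rangle \, v + c \, g v \, dx.$$
Note that the term $\operatorname{div}(A \nabla g)$ in $\tilde{f}$ never needs to be evaluated pointwise; it only appears through the pairing $\langle \operatorname{div}(A \nabla g), v \rangle = -\int_\Omega A \nabla g \cdot \nabla v \, dx$, which is well defined for $g \in H^1(\Omega)$.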
This is then a "classical" variational problem in $H^1_0(\Omega)$.
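To see the lifting trick in action, here is a toy numerical sketch (my own illustration, not part of the question): in 1D with $A = I$, $b = 0$, $c = 0$ on $\Omega = (0,1)$, so the problem is $-u'' = f$ with $u(0) = g_0$, $u(1) = g_1$. All names below are hypothetical choices for the demo.

```python
import numpy as np

# Toy 1D case of the lifting trick: solve -u'' = f on (0, 1)
# with u(0) = g0, u(1) = g1, via u = u0 + g, u0 in H^1_0.
n = 100                        # number of interior grid points (arbitrary)
h = 1.0 / (n + 1)
x = np.linspace(h, 1 - h, n)   # interior nodes
f = np.ones(n)                 # sample right-hand side f = 1
g0, g1 = 2.0, 3.0              # sample boundary data

# Lifting: g(x) = g0 + (g1 - g0) x extends the boundary data into Omega.
# Since g is linear, g'' = 0, so here f_tilde = f + g'' = f.
g = g0 + (g1 - g0) * x

# Tridiagonal -d^2/dx^2 with homogeneous Dirichlet conditions.
L = (np.diag(2 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2

# Solve the homogeneous-boundary problem for u0, then add the lifting back.
u0 = np.linalg.solve(L, f)
u = u0 + g

# Sanity check: folding the boundary values directly into the right-hand
# side (the usual finite-difference treatment) gives the same solution.
rhs = f.copy()
rhs[0] += g0 / h**2
rhs[-1] += g1 / h**2
u_direct = np.linalg.solve(L, rhs)
assert np.allclose(u, u_direct)
```

The point of the demo is that the boundary data only changes the right-hand side: $u_0$ is computed in the homogeneous-boundary space, exactly as in the variational formulation above.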