I've been reading "A text book on ordinary differential equations" by Sahir Ahmad & Ambrossi. On page 37 we have the following exercise:
"Explain why $x'+\dfrac{\sin(t)}{e^t+1}x=0$ cannot has solution $x(t)$ such that $x(1)=1$ and $x(2)=-1$."
As far as I know, $f(t,x)$ is continuous everywhere since $e^t\ne -1$ for every $t$, and $\frac{\partial f}{\partial x}$ doesn't depend on $x$, so it is a continuous function of $t$ alone; therefore the existence and uniqueness theorem should apply in a neighborhood of each of those points. Where am I wrong? Can anyone explain? I'm very stuck on this assignment.
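To make the hypotheses explicit (writing the equation in the standard form $x'=f(t,x)$, which the question implicitly assumes):

```latex
f(t,x) = -\frac{\sin t}{e^t+1}\,x,
\qquad
\frac{\partial f}{\partial x}(t,x) = -\frac{\sin t}{e^t+1},
```

and both are indeed continuous on all of $\mathbb{R}^2$, so local existence and uniqueness hold at every point. (Note that $\frac{\partial f}{\partial x}$ is not constant, since it still depends on $t$, but continuity is all the theorem requires.)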
Let $F$ be an antiderivative of the function $ \frac{ \sin t}{e^t+1}.$ Then the general solution of the differential equation is given by
$$x(t)=C e^{-F(t)},$$
where $C \in \mathbb R.$
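For completeness, this formula follows from the standard integrating-factor argument for first-order linear equations:

```latex
x' + \frac{\sin t}{e^t+1}\,x = 0
\;\Longrightarrow\;
\left(e^{F(t)}\,x\right)' = e^{F(t)}\left(x' + \frac{\sin t}{e^t+1}\,x\right) = 0
\;\Longrightarrow\;
e^{F(t)}\,x(t) = C
\;\Longrightarrow\;
x(t) = C\,e^{-F(t)}.
```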
Since $e^{-F(t)}>0$ for every $t$, the solution $x(t)$ has the same sign as $C$ at every point. From $1=x(1)= Ce^{-F(1)}$ we get $C>0.$
But from $-1=x(2)=Ce^{-F(2)}$ we get $C<0.$
Contradiction!
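As a numerical sanity check (a sketch using SciPy's `solve_ivp`, not part of the proof): integrating the equation forward from $x(1)=1$ shows the solution stays strictly positive on $[1,2]$, so it can never reach $-1$.

```python
import numpy as np
from scipy.integrate import solve_ivp

# The ODE in standard form: x' = -sin(t)/(e^t + 1) * x
def rhs(t, x):
    return -np.sin(t) / (np.exp(t) + 1.0) * x

# Integrate from the initial condition x(1) = 1 up to t = 2.
sol = solve_ivp(rhs, (1.0, 2.0), [1.0], rtol=1e-10, atol=1e-12)
x2 = sol.y[0, -1]

# x(2) = e^{-(F(2)-F(1))} > 0: the solution never changes sign,
# so x(2) = -1 is impossible.
print(x2)
```

On $[1,2]$ we have $\sin t>0$, so $F$ is increasing there and $x(2)=e^{-(F(2)-F(1))}$ lies strictly between $0$ and $1$, consistent with what the integration reports.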