Consider the initial value problem $y'(t)=t+\sin(y(t))$ with initial condition $y(2)=1$. Find the largest interval $\mathcal{I}\subset \mathbb{R}$ containing $t_0=2$ such that the problem has a unique solution $y$ on $\mathcal{I}$.
I've been trying to apply the Picard–Lindelöf theorem to this problem. Writing $f(t,y)=t+\sin(y)$, I got $M=\sup_{R}|f(t,y)|=a+2+\sin(b+1)$, where $R=\{(t,y):|t-2|\leq a,\ |y-1|\leq b \}$.
I know that the interval is $\mathcal{I}=[2-\varepsilon, 2 + \varepsilon]$ where $\varepsilon = \min(a, b/M)$. I also know that in order for $\mathcal{I}$ to be the largest possible, $a$ must equal $b/M$.
Since $M$ depends on $a$, I cannot make $a$ too big without $b/M$ becoming too small, so this looks like a maximization problem. I ended up with a very nasty value for $\varepsilon$, so I believe I am doing something wrong.
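To sanity-check the optimization numerically, here is a small sketch. It uses the cruder bound $M \le (2+a)+1 = a+3$ (from $|\sin y|\le 1$) in place of my exact supremum, which is an assumption on my part; with that bound, $b/M$ can be pushed up by enlarging $b$, since $b$ is a free parameter too:

```python
# Sketch: epsilon(a, b) = min(a, b / M) with the crude bound
# M = sup_R |t + sin(y)| <= (2 + a) + 1 = a + 3   (using |sin| <= 1).
# This bound is an assumption standing in for the exact supremum.

def eps(a, b):
    M = a + 3.0              # upper bound on sup_R |f|
    return min(a, b / M)

# For a fixed a, choosing b = a * (a + 3) makes b / M equal to a,
# so epsilon(a, b) = a for every a we try:
for a in [1.0, 5.0, 50.0]:
    b = a * (a + 3.0)
    print(a, b, eps(a, b))   # eps equals a in each case
```

If this sketch is right, nothing forces $\varepsilon$ to stay bounded, which makes me suspect the "largest interval" may be bigger than any rectangle-based estimate suggests.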
Anyone have any hints?