I have the following differential equation:
$u'(t) = 1+\dfrac{u(t)}{t}$ with $u(1) = 2$. This inhomogeneous differential equation has the form $u'(t) = u(t)a(t) + b(t)$ with $a(t) = 1/t$ and $b(t) = 1$, so we can solve it using the formula for variation of parameters.
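For reference, here is a sketch of the variation-of-parameters computation, assuming the standard formula with antiderivative $A(t)=\int_1^t a(s)\,ds$:

$$
A(t) = \int_1^t \frac{ds}{s} = \log t, \qquad
u(t) = e^{A(t)}\!\left(u(1) + \int_1^t e^{-A(s)}\,b(s)\,ds\right)
= t\left(2 + \int_1^t \frac{ds}{s}\right) = t\,(\log t + 2).
$$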
We obtain $u(t) = t(\log (t) +2)$, which implies that the interval of existence for $u(t)$ is $(0, \infty)$. Now I want to use the Picard–Lindelöf theorem to show that this solution is unique.
$\textbf{Question 1:}$ Do I have to calculate $u(t) = t(\log (t) + 2 )$ first?
$\textbf{Question 2:}$ Now I want to show that $f$ in $u'(t) = f(t,u(t))$ is Lipschitz continuous in its second argument. For that I compute $\lvert \partial_u f(t,u) \rvert$ and show that this partial derivative is bounded. I'm not exactly sure how to take the derivative with respect to $u$. Can someone please give me general advice on this and help me complete the calculation?
No, you don't need the solution in hand to prove that the solution is unique.
Here $f(t,u)=1+u/t$, so $\partial_u f(t,u)=1/t$, which is uniformly bounded on $[a,\infty)$ for any $a>0$; hence $f$ is Lipschitz in $u$ there with constant $L = 1/a$. To push things to all of $(0,\infty)$ you need to invoke a local Lipschitz version of Picard–Lindelöf.
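As a quick numerical sanity check (not part of the proof), the following sketch verifies that $u(t)=t(\log t + 2)$ satisfies the ODE via a central difference, and that $|f(t,u)-f(t,v)| \le (1/a)\,|u-v|$ on $[a,\infty)$. The function names are my own choices for illustration:

```python
import math

def u(t):
    # Candidate solution u(t) = t*(log(t) + 2)
    return t * (math.log(t) + 2)

def f(t, u_val):
    # Right-hand side of the ODE: f(t, u) = 1 + u/t
    return 1 + u_val / t

# Check u'(t) = f(t, u(t)) at several points via central differences.
h = 1e-6
for t in [0.5, 1.0, 2.0, 10.0]:
    du = (u(t + h) - u(t - h)) / (2 * h)
    assert abs(du - f(t, u(t))) < 1e-5

# Check the Lipschitz bound |f(t,u) - f(t,v)| <= (1/a)|u - v| on [a, inf).
a = 0.5
for t in [a, 1.0, 5.0]:
    for uu, vv in [(0.0, 1.0), (-3.0, 7.0)]:
        assert abs(f(t, uu) - f(t, vv)) <= (1 / a) * abs(uu - vv) + 1e-12
```

Note that the bound degenerates as $a \to 0$, which is exactly why the global Lipschitz argument fails on $(0,\infty)$ and a local version of the theorem is needed.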