Prove that if $f$ satisfies the Lipschitz condition, then the solutions to $\frac{dx}{dt} = f(x)$ are defined for all $t \in \mathbb R$


Prove that if $f : \mathbb R^n \to \mathbb R^n$ satisfies the Lipschitz condition, then the solutions to $\frac{dx}{dt} = f(x)$ are defined for all $t \in \mathbb R$.

Suppose the function $f$ satisfies the Lipschitz condition, so there is a constant $L>0$ such that for all $x,y \in \mathbb R^n$ the following holds: $$||f(x)-f(y)||\le L||x-y||.$$ The Picard–Lindelöf theorem says that if $f$ is Lipschitz, then for each initial point $x_0 \in \mathbb{R}^n$ there is a local solution to the differential equation $\frac{dx}{dt} = f(x)$ satisfying the initial condition $x(0) = x_0$.
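As a numerical illustration (not part of the proof), the Picard iteration behind the theorem, $x_{n+1}(t) = x_0 + \int_0^t f(x_n(s))\,\mathrm ds$, can be sketched on the hypothetical example $f(x) = x$ (Lipschitz with $L = 1$), whose exact solution is $x(t) = x_0 e^t$; the integrals are approximated with the trapezoidal rule:

```python
import math

def picard_iterates(f, x0, T=1.0, steps=1000, iters=10):
    """Run Picard iteration x_{n+1}(t) = x0 + int_0^t f(x_n(s)) ds
    on a uniform grid over [0, T], using the trapezoidal rule."""
    ts = [T * k / steps for k in range(steps + 1)]
    x = [x0] * (steps + 1)          # x_0(t) = x0 (constant initial guess)
    for _ in range(iters):
        fx = [f(v) for v in x]
        new = [x0]
        acc = 0.0
        for k in range(steps):
            acc += 0.5 * (fx[k] + fx[k + 1]) * (ts[k + 1] - ts[k])
            new.append(x0 + acc)
        x = new
    return ts, x

# Example (assumed for illustration): f(x) = x, x(0) = 1, so x(t) = e^t.
ts, x = picard_iterates(lambda v: v, 1.0)
print(abs(x[-1] - math.e))  # small: iterates approach e^t at t = 1
```

For this $f$ the iterates are exactly the Taylor partial sums of $e^t$, so convergence on a fixed interval is fast; Picard–Lindelöf guarantees such convergence locally for any Lipschitz $f$.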

Since this theorem only gives local existence of a solution, we need to show that the solution can be extended to the entire real line. But I don't know how to do that.



Generally, for a solution of the ODE, where $t$ and $t_0$ are in the maximal interval of existence (wlog $t>t_0$), you can write $$ \begin{aligned} \lVert x(t) \rVert &\leq \lVert x(t_0) \rVert + \int^t_{t_0} \lVert f(x(s)) \rVert ~\mathrm{d}s \\ &\leq \lVert x(t_0) \rVert + \lvert t - t_0 \rvert \lVert f(x(t_0)) \rVert + L\int^t_{t_0} \lVert x(s) - x(t_0) \rVert~\mathrm{d}s \\ &\leq \lVert x(t_0) \rVert + \lvert t - t_0 \rvert \bigl(L\lVert x(t_0) \rVert + \lVert f(x(t_0)) \rVert\bigr) + L\int^t_{t_0} \lVert x(s) \rVert~\mathrm{d}s, \end{aligned} $$ where the second step uses the Lipschitz estimate $\lVert f(x(s)) \rVert \leq \lVert f(x(t_0)) \rVert + L\lVert x(s) - x(t_0) \rVert$ and the third uses the triangle inequality. Grönwall's lemma then tells us: $$ \lVert x(t) \rVert \leq \left( \lVert x(t_0) \rVert + \lvert t - t_0 \rvert (L\lVert x(t_0) \rVert + \lVert f(x(t_0)) \rVert) \right) \exp(L\lvert t - t_0 \rvert) $$ So the norm of the solution always stays below a function that does not blow up in finite time (imagine $t_0$ as a fixed initial time). By the standard escape criterion, a maximal solution whose interval of existence is bounded, say on the right by some finite $T$, must satisfy $\lVert x(t) \rVert \to \infty$ as $t \to T^-$; the bound above rules this out, so the solution extends to the entire real line. Local existence you still get from Picard–Lindelöf.
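As a quick sanity check of the Grönwall bound, take the hypothetical example $f(x) = x$ (Lipschitz with $L = 1$), $x(0) = 1$, $t_0 = 0$, whose exact solution is $x(t) = e^t$; the bound reads $\lVert x(t)\rVert \le (\lVert x_0\rVert + t(L\lVert x_0\rVert + \lVert f(x_0)\rVert))\, e^{Lt}$:

```python
import math

# Assumed example: f(x) = x, Lipschitz constant L = 1, x(0) = 1, t0 = 0.
L = 1.0
x0 = 1.0
f = lambda v: v

def bound(t):
    """Grönwall bound: (|x0| + t*(L|x0| + |f(x0)|)) * exp(L*t)."""
    return (abs(x0) + t * (L * abs(x0) + abs(f(x0)))) * math.exp(L * t)

for t in [0.0, 0.5, 1.0, 2.0, 5.0]:
    xt = x0 * math.exp(t)       # exact solution of x' = x
    assert xt <= bound(t)       # bound holds at every sampled time
    print(t, xt, bound(t))
```

At $t = 0$ the bound is tight ($1 \le 1$), and for $t > 0$ it grows like $(1 + 2t)e^t$, comfortably above $e^t$ but still finite for every finite $t$, which is exactly what the extension argument needs.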