Inequality in the proof of Picard Iterations


Let $f(x,t)$ be continuous in $I:=|t-t_0|\leq\alpha$ and $D:=\|x - x_0\| \leq \beta$. Also, let $f$ satisfy the Lipschitz condition in the region $D \times I$ with some Lipschitz constant $L>0$ (here $\alpha, \beta >0$). Then the initial value problem $$\begin{cases} x'(t) = f(x,t) \\ x(t_0)=x_0\end{cases}$$ has a unique solution on $| t - t_0 | < \delta$.

This is an excerpt from Grimshaw's text (Theorem 1.4). In his proof, uniqueness follows from a previous theorem, so he focuses on existence. For existence he breaks the argument into the following steps, which I will outline:

a) Iterations of the Volterra equation, i.e. $$x(t)=x_0 + \int_{t_0}^t f(x(s),s)\,ds.$$ Doing these iterations we have: $$x^0(t) = x_0$$ $$x^1(t) = x_0 + \int_{t_0}^t f(x^0(s),s)\,ds$$ $$x^2(t) = x_0 + \int_{t_0}^t f(x^1(s),s)\,ds$$ $$\vdots$$ $$x^{n+1}(t) = x_0 + \int_{t_0}^t f(x^n(s),s)\,ds \tag{1}$$ He then proved by induction that each iterate stays in $D$ when $|t-t_0| \leq \delta$, which was fine, so I will omit that bit. The end result is the inequality $$\|x^{n+1}(t) - x_0\| \leq \beta. \tag{2}$$

b) Next, he showed that $\left\{ x^n(t) \right\}_0^{\infty}$ converges uniformly (and absolutely) on $|t - t_0| \leq \delta$; again I will omit this bit of the proof. Call the limit $x(t)$.
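As a concrete illustration (my own, not from Grimshaw), the iteration $(1)$ can be carried out numerically. Here is a minimal sketch for the scalar IVP $x' = x$, $x(0) = 1$, whose Picard iterates are exactly the Taylor partial sums of $e^t$; the function name `picard` and the trapezoid-rule quadrature are choices made for this sketch.

```python
# Numerical Picard iteration for x'(t) = f(x, t), illustrated on
# x' = x, x(0) = 1, whose n-th iterate is the degree-n Taylor
# polynomial of e^t. (Illustrative sketch, not Grimshaw's construction.)
import math

def picard(f, t0, x0, T, n_iter, n_grid=1000):
    """Return grid points and the n_iter-th Picard iterate on [t0, T]."""
    ts = [t0 + (T - t0) * k / n_grid for k in range(n_grid + 1)]
    xs = [x0] * (n_grid + 1)                  # x^0(t) = x0
    for _ in range(n_iter):
        # x^{n+1}(t) = x0 + int_{t0}^t f(x^n(s), s) ds  (trapezoid rule)
        new, acc = [x0], 0.0
        for k in range(1, n_grid + 1):
            h = ts[k] - ts[k - 1]
            acc += 0.5 * h * (f(xs[k - 1], ts[k - 1]) + f(xs[k], ts[k]))
            new.append(x0 + acc)
        xs = new
    return ts, xs

ts, xs = picard(lambda x, t: x, 0.0, 1.0, 1.0, n_iter=12)
print(abs(xs[-1] - math.e))  # small: the iterates approach e^t on [0, 1]
```

Each pass through the loop applies the integral operator once, so after `n_iter` passes `xs` approximates $x^{n}(t)$ up to quadrature error.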

c) Next, we note that $x^n(t)$ is a continuous vector function on $|t-t_0|<\delta$. The uniform limit of continuous vector functions is a continuous vector function. Now comes my question:

it follows from $(2)$ that $$\|x(t)-x_0\| \leq \beta$$ for $|t-t_0|\leq\delta$.

I do not see how this follows. I tried proving the inequality by contradiction, i.e. supposing $\|x(t) - x_0\| > \beta$, but I am having trouble.

d) He shows that $x(t)$ satisfies the Volterra equation; I include this only for completeness and again omit his proof.

Best answer:

For $t \in [t_0 - \delta, t_0 + \delta]$ you have $x^n(t) \rightarrow x(t)$. This implies that

$x^n(t) - x_0 \rightarrow x(t) - x_0$

i.e. that

$||x^n(t) -x_0|| \rightarrow ||x(t) -x_0 || $

Note that since $t$ is fixed, this is just a sequence in $\Bbb R^n$. We'll show that if $y_n \rightarrow y$ (in $\Bbb R^n$ or any normed space) and $||y_n|| \leq C$ for some $C \in \Bbb R$, then $||y|| \leq C$. This will imply what we want with $y_n = x^n(t) - x_0$ and $C = \beta$.

By the triangle inequality,

$||y|| = ||y_n + (y - y_n)|| \leq ||y_n|| + ||y_n - y|| \leq C + ||y_n -y||$

As $n \rightarrow \infty$ the last term goes to zero, while the left-hand side does not depend on $n$; hence $||y|| \leq C$.

This is an example of something you'll see often as you move on, so it's good to get used to it: limits of functions satisfying a non-strict norm inequality also satisfy that inequality. Notice that we only needed pointwise convergence to prove the inequality, not uniform convergence.