Picard Iteration, existence of a solution to an IVP


This is a follow-up question to my previous post: ODE analysis problem

Although it may not be useful, I still place the link above.

Question: An iterated sequence is given:

$x_0(t)=0$

$x_n(t)=\cos(t)+\int_{0}^{t}s^2\sin(s-t)x_{n-1}(s) ds$

Show that for any $T>0$, the sequence of functions $\{x_n(t)\}_{n=1}^{\infty}$ converges uniformly on $[0,T]$ to a limit function $x_{\infty}(t)$, and that this limit is a solution to the IVP $y''(t)+(1+t^2)y(t)=0, y(0)=1, y'(0)=0$.

In a "very simple" example from my notes ($x'(t)=x, x(0)=1$), it is easy to compute the first few terms of the sequence and guess the pattern as $n\rightarrow \infty$. However, this question does not seem to be that simple. I suspect I need to use genuine "analysis" to study the iterated sequence, but I don't know how to start. Can anyone help me?
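For what it's worth, the "very simple" example can be checked numerically. Below is a rough sketch (function names, grid size, and trapezoid quadrature are my own choices) of the Picard iteration $x_n(t)=1+\int_0^t x_{n-1}(s)\,ds$, whose iterates are exactly the Taylor partial sums of $e^t$:

```python
import math

def picard_exp(n_iter=12, T=1.0, N=1000):
    """Picard iteration for x' = x, x(0) = 1, via x_n(t) = 1 + ∫_0^t x_{n-1}(s) ds."""
    ts = [i * T / N for i in range(N + 1)]
    h = T / N
    x = [1.0] * (N + 1)                          # x_0 ≡ 1 (constant initial guess)
    for _ in range(n_iter):
        new, acc = [1.0], 0.0
        for i in range(1, N + 1):
            acc += 0.5 * h * (x[i] + x[i - 1])   # trapezoid rule for the integral
            new.append(1.0 + acc)
        x = new
    return ts, x

ts, x = picard_exp()
# the n-th iterate is the degree-n Taylor polynomial of e^t, so the error is tiny
err = max(abs(xi - math.exp(t)) for t, xi in zip(ts, x))
```

After 12 iterations on $[0,1]$ the deviation from $e^t$ comes only from the Taylor remainder ($\approx e/13!$) plus quadrature error, so `err` is far below $10^{-4}$.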

My thought up till now:

$$\lim_{n\rightarrow\infty} x_n(t)=\cos(t)+\lim_{n\rightarrow\infty}\int_{0}^{t}s^2\sin(s-t)x_{n-1}(s)ds$$ $$\lim_{n\rightarrow\infty} x_n(t)=\cos(t)+\cos(t)\lim_{n\rightarrow\infty}\int_{0}^{t}s^2\sin(s)x_{n-1}(s)ds-\sin(t)\lim_{n\rightarrow\infty}\int_{0}^{t}s^2\cos(s)x_{n-1}(s)ds$$ If I can show the sequence converges uniformly, then I can interchange the limit and the integral on the RHS; since the integrand is continuous, I can then push the limit inside the integrand as well. But I am still stuck on proving the uniform convergence itself.

Is this related to the Picard–Lindelöf existence theorem, which gives the bound $|x_n-x_{n-1}| \leq \frac{KL^{n-1}|t|^n}{n!}$? The books I have read state this theorem for $t\in [-\epsilon, \epsilon]$. Is that relevant here, and how does one extend the result to the interval $[0,T]$?
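That factorial-type decay of the increments can also be observed numerically for the iteration in this question. The following is a sketch (trapezoid quadrature on an arbitrary horizon $T=2$; names and grid size are my own choices) that tracks $\sup_{[0,T]}|x_n-x_{n-1}|$:

```python
import math

T, N, n_iter = 2.0, 200, 18
ts = [i * T / N for i in range(N + 1)]
h = T / N

def picard_step(x):
    """One step: (Px)(t) = cos t + ∫_0^t s^2 sin(s - t) x(s) ds (trapezoid rule)."""
    out = []
    for j, t in enumerate(ts):
        acc = 0.0
        for i in range(1, j + 1):
            f0 = ts[i - 1] ** 2 * math.sin(ts[i - 1] - t) * x[i - 1]
            f1 = ts[i] ** 2 * math.sin(ts[i] - t) * x[i]
            acc += 0.5 * h * (f0 + f1)
        out.append(math.cos(t) + acc)
    return out

x, sups = [0.0] * (N + 1), []          # x_0 ≡ 0 as in the question
for n in range(n_iter):
    nxt = picard_step(x)
    sups.append(max(abs(a - b) for a, b in zip(nxt, x)))
    x = nxt
# since |sin| ≤ 1, the n-th increment is dominated by (T^3/3)^n / n!,
# so sups should shrink super-exponentially
```

At $T=2$ the bound $(8/3)^{17}/17!\approx 5\cdot10^{-8}$ already dominates the last increment, consistent with uniform convergence on $[0,T]$.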

BEST ANSWER

To explore the Lipschitz behaviour of the integral operator $P$, $$(Py)(t)=\cos(t)+\int_0^t\sin(s-t)s^2y(s)\,ds,$$ which acts on continuous functions $y:[0,\infty)\to \Bbb R$, first compute the growth behaviour of a (hypothetical) solution, i.e. a fixed point of $P$. Bounding the trigonometric factors by $1$ gives $$ |y(t)|\le 1+\int_0^ts^2|y(s)|\,ds. $$

Then apply the Grönwall lemma, which gives the bound $|y(t)|\le v(t)$ for this integral inequality, where $v$ is the solution of the corresponding integral equation with equality. That equation is equivalent to the IVP $v'(t)=t^2v(t)$, $v(0)=1$, which has the solution $v(t)=\exp(t^3/3)$.
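As a quick sanity check on this comparison solution (a sketch with an arbitrary horizon $T=2$), one can verify numerically that $v(t)=e^{t^3/3}$ turns the integral inequality into an equality:

```python
import math

def v(t):
    """Grönwall comparison solution v(t) = exp(t^3 / 3)."""
    return math.exp(t ** 3 / 3)

# check that v(t) = 1 + ∫_0^t s^2 v(s) ds along a fine grid
T, N = 2.0, 20000
h = T / N
acc, worst = 0.0, 0.0
for i in range(1, N + 1):
    s0, s1 = (i - 1) * h, i * h
    acc += 0.5 * h * (s0 ** 2 * v(s0) + s1 ** 2 * v(s1))   # trapezoid rule
    worst = max(worst, abs(v(s1) - (1.0 + acc)))
```

The residual `worst` is pure quadrature error and stays far below $10^{-4}$ on this grid.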


Having obtained this growth structure for a solution, consider a modified norm which concentrates on the region close to $0$: $$ \|y\|_3=\sup_{t\ge 0}e^{-t^3}|y(t)|. $$ The relevant supremum is attained somewhere near $0$, since the weight strongly suppresses the values of $y$ at larger $t$.

Then the pointwise difference of two images under $P$ satisfies $$ |(Py)(t)-(Px)(t)|\le \int_0^t s^2\,e^{s^3}\|y-x\|_3\,ds =\frac{e^{t^3}-1}3\,\|y-x\|_3, $$ so multiplying by $e^{-t^3}$ and taking the supremum over $t\ge0$ gives $$ \|Py-Px\|_3\le\sup_{t\ge 0}\frac{1-e^{-t^3}}{3}\,\|y-x\|_3\le\frac13\,\|y-x\|_3, $$ so that $P$ has Lipschitz constant $\frac13$ in this norm. By the Banach fixed-point theorem this contraction on the space of continuous functions (with finite $\|\cdot\|_3$-norm) has a unique fixed point, which is the limit of the Picard iteration $x_{n+1}=P(x_n)$. The constructed norm is equivalent to the supremum norm on any bounded interval $[0,T]$, so the convergence is uniform on such intervals. This justifies exchanging limit and integral, which shows that the fixed point is indeed a solution of the integral equation.
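The contraction estimate can also be checked numerically. Below is a sketch (my own test functions $y(s)=e^s$, $x(s)=\sin s$, trapezoid quadrature on $[0,2]$) comparing $\|Py-Px\|_3$ against $\|y-x\|_3$ on a grid:

```python
import math

T, N = 2.0, 400
ts = [i * T / N for i in range(N + 1)]
h = T / N

def apply_P(f):
    """(Pf)(t) = cos t + ∫_0^t s^2 sin(s - t) f(s) ds, trapezoid rule on the grid."""
    out = []
    for j, t in enumerate(ts):
        acc = 0.0
        for i in range(1, j + 1):
            g0 = ts[i - 1] ** 2 * math.sin(ts[i - 1] - t) * f[i - 1]
            g1 = ts[i] ** 2 * math.sin(ts[i] - t) * f[i]
            acc += 0.5 * h * (g0 + g1)
        out.append(math.cos(t) + acc)
    return out

def norm3(f):
    """Weighted norm ||f||_3 = sup_t e^{-t^3} |f(t)|, restricted to the grid."""
    return max(math.exp(-t ** 3) * abs(fi) for t, fi in zip(ts, f))

y = [math.exp(t) for t in ts]
x = [math.sin(t) for t in ts]
Py, Px = apply_P(y), apply_P(x)
diff = [a - b for a, b in zip(Py, Px)]
yx = [a - b for a, b in zip(y, x)]
ratio = norm3(diff) / norm3(yx)   # should be at most 1/3 by the estimate above
```

For this pair of test functions the observed ratio lands well below $\frac13$, since bounding $|\sin(s-t)|$ by $1$ is quite generous.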