Prove that a function sequence involving recursion and an integral is convergent


Let $f, g : [0, 1] \rightarrow \mathbb R$ be continuous functions. Define $x_n(t) = f(t) + \int_0^t x_{n-1}(s)ds,$ $0 \le t \le 1, n=1,2,3,...$, where $x_0(t)=g(t), 0 \le t \le 1$. I have to show that the sequence $(x_n)$ is uniformly convergent on $[0, 1]$ and its limit is independent of $g$.

What I have done:

  • Since $f, g$ are continuous on $[0, 1]$, by the extreme value theorem $|f|$ and $|g|$ attain maxima on that interval, so there exist constants $F, G$ such that $|f(x)| \le F$ and $|g(x)| \le G$ for all $x \in [0,1]$.
  • Computing the first few iterates from the recursion for $x_n(t)$, I have:

$|x_0(t)| \le |g(t)| \le G$

$|x_1(t)| = |f(t) + \int_0^t x_{0}(s)ds| \le |f(t)| + \int_0^t |x_{0}(s)|ds \le F+Gt$

$|x_2(t)| \le F+Ft+G\frac{t^2}{2}$

$|x_3(t)| \le F+Ft+F\frac{t^2}{2!}+G\frac{t^3}{3!}$

So, $|x_n(t)| \le F \sum_{k=0}^{n-1} \frac{t^k}{k!}+G \frac{t^n}{n!}$. Since the terms of $e^t=\sum_{k=0}^{\infty} \frac{t^k}{k!}$ are nonnegative for $t \ge 0$, this gives the bound $|x_n(t)| \le Fe^t+G \frac{t^n}{n!}$.

  • Where do I go from here? How can I establish the bound $|x_n(t)-x(t)|<\epsilon$ required for uniform convergence?
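Before turning to the proofs, the claim can be checked numerically. The sketch below (assuming NumPy; the particular $f$, $g_1$, $g_2$ are arbitrary illustrative choices, not from the problem) iterates the recursion from two different starting functions $g$ and measures the sup-distance between the results, which should become negligible if the limit is indeed independent of $g$:

```python
import numpy as np

# Discretize [0, 1]; f, g1, g2 are hypothetical continuous functions.
t = np.linspace(0.0, 1.0, 1001)
f = np.cos(3.0 * t)
g1 = np.sin(5.0 * t)
g2 = 10.0 * t**2 - 3.0

def iterate(f, g, n):
    """Apply x_k(t) = f(t) + integral_0^t x_{k-1}(s) ds, n times."""
    x = g.copy()
    dt = t[1] - t[0]
    for _ in range(n):
        # cumulative trapezoidal rule approximates the running integral
        integral = np.concatenate(([0.0], np.cumsum((x[1:] + x[:-1]) / 2.0 * dt)))
        x = f + integral
    return x

xa = iterate(f, g1, 30)
xb = iterate(f, g2, 30)
print(np.max(np.abs(xa - xb)))  # sup-distance between the two iterates; should be negligible
```

This is only evidence, of course: the dependence on $g$ decays like $t^n/n!$, as the bound above suggests, so after 30 iterations the two runs agree to machine precision.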

There are 2 solutions below.

BEST ANSWER

Define sequences of functions $\{F_n\}$ and $\{G_n\}$ by $$F_0(t) = f(t), \ \ F_n(t) = \int_0^tF_{n-1}(s)ds$$ $$G_0(t) = g(t), \ \ G_n(t) = \int_0^tG_{n-1}(s)ds$$ Note that $$x_0(t) = G_0(t)$$ $$x_1(t) = f(t) + \int_0^t g(s)ds = F_0(t) + \int_0^tG_0(s)ds = F_0(t) + G_1(t)$$ Proceeding inductively, if $$x_{n-1}(t) = \sum_{k=0}^{n-2}F_k(t) +G_{n-1}(t)$$ then $$x_n(t) = f(t) +\int_0^t x_{n-1}(s)ds = F_0(t) + \int_0^t \Big[\sum_{k=0}^{n-2}F_k(s) +G_{n-1}(s)\Big]ds \\ = F_0(t) + \sum_{k=0}^{n-2}\int_0^tF_k(s)ds + \int_0^t G_{n-1}(s)ds \\ = F_0(t) + \sum_{k=0}^{n-2} F_{k+1}(t) + G_n(t) \\ = F_0(t) +\sum_{k=1}^{n-1} F_{k}(t) + G_n(t) $$ Thus for all $n \geq 1$, $$x_n(t) = \sum_{k=0}^{n-1}F_k(t) + G_n(t) $$

Let $M_1, M_2$ be the maximum values of $|f|, |g|$ on $[0,1]$ respectively. Then $$\left |F_0(t) \right | \leq M_1$$ $$|F_1(t)| = \left|\int_0^tF_{0}(s)ds\right| \leq M_1t$$ and, by induction, $$|F_n(t)| \leq M_1\frac{t^n}{n!} $$ and similarly $$|G_n(t)| \leq M_2\frac{t^n}{n!} $$ So $\sup_{t \in [0,1]}|G_n(t)| \leq M_2/n!$, and thus $G_n$ converges uniformly to $0$.

Now define $$H_n(t) = \sum_{k=0}^{n-1}F_k(t)$$ Given $\epsilon > 0$, for all sufficiently large $m, n$ with $m \geq n$ we have $$|H_m(t) -H_n(t)| = \left|\sum_{k=n}^{m-1}F_k(t)\right| \leq \sum_{k=n}^{m-1}|F_k(t)| \leq M_1\sum_{k=n}^{m-1}\frac{t^k}{k!} \leq M_1\sum_{k=n}^{m-1}\frac{1}{k!} < \epsilon$$ for all $t \in [0,1]$, since $\sum 1/k!$ converges. Thus $(H_n)$ is uniformly Cauchy, so it converges uniformly to some limit $H$.

Since $x_n(t) = H_n(t) + G_n(t)$ is the sum of two uniformly convergent sequences of functions, it is itself uniformly convergent, with limit $$\lim_{n \to \infty }(H_n + G_n) = H + 0 = H$$ Each $H_n$ is independent of $g$, and $G_n$ converges to $0$, so $x_n$ converges uniformly to a limit independent of $g$.
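As a numerical sanity check of this answer (a sketch assuming NumPy, not part of the argument): the uniform limit $H$ must satisfy the fixed-point equation $H(t) = f(t) + \int_0^t H(s)ds$. For the illustrative choice $f \equiv 1$, differentiating gives $H' = H$ with $H(0) = 1$, so the limit should be $e^t$ no matter which $g$ we start from:

```python
import numpy as np

t = np.linspace(0.0, 1.0, 2001)
dt = t[1] - t[0]

def cumtrapz(y):
    """Running trapezoidal integral of y over t, starting at 0."""
    return np.concatenate(([0.0], np.cumsum((y[1:] + y[:-1]) / 2.0 * dt)))

# f = 1, arbitrary starting g; the limit H solves H = 1 + integral(H),
# i.e. H' = H with H(0) = 1, so H(t) = e^t.
x = np.sin(7.0 * t)  # an arbitrary starting function g
for _ in range(40):
    x = 1.0 + cumtrapz(x)

print(np.max(np.abs(x - np.exp(t))))  # sup-error vs. e^t; limited by discretization
```

The residual is dominated by the trapezoidal discretization error, not by the iteration, which has long since converged.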

SECOND ANSWER

Since $C[0,1]$ is a complete metric space with respect to the sup norm, we can use the contraction mapping principle.

To spell it out, let $T: C[0,1] \to C[0,1]$ map $x(t) \mapsto f(t) + \int_0^t x(s) ds$. $T$ itself isn't a contraction mapping, but $T^2$ is.

Let's prove this. First, work out what $T^2$ does, swapping the order of integration in the double integral: $$T^2(x)(t) = f(t) + \int_0^t f(s)\,ds + \int_0^t\!\!\int_0^s x(u)\,du\,ds \\ = f(t) + \int_0^t f(s)\,ds + \int_0^t\!\!\int_u^t x(u)\,ds\,du \\ = f(t) + \int_0^t f(s)\,ds + \int_0^t (t-u)\,x(u)\,du $$ Hence $$|T^2(x_1)(t) - T^2(x_2)(t)| \leq \int_0^t (t-u)\, |x_1(u) - x_2(u)|\,du \\ \leq \sup_{s \in [0,1]} |x_1(s) - x_2(s)| \int_0^t (t-u)\,du \\ = \sup_{s \in [0,1]} |x_1(s) - x_2(s)| \cdot \frac {t^2} 2.$$ So $$\sup_{t \in [0,1]}|T^2(x_1)(t) - T^2(x_2)(t)| \leq \frac 1 2 \sup_{t \in [0,1]} |x_1(t) - x_2(t) |$$ Thus $T^2$ is a contraction mapping.

By the contraction mapping theorem, $T^2$ has a unique fixed point $x^*$, and the iterates of $T^2$ from any starting point converge to it in the sup norm. Since $(x_0, x_2, x_4, \dots)$ and $(x_1, x_3, x_5, \dots)$ are exactly the $T^2$-iterates of $x_0$ and $x_1$, both subsequences converge uniformly to $x^*$; the limit, being the unique fixed point of $T^2$, is independent of the choice of $g$. Hence the full sequence $(x_0, x_1, x_2, \dots)$ converges uniformly to $x^*$, independently of $g$. (Moreover, $T(x^*)$ is also a fixed point of $T^2$, so by uniqueness $T(x^*) = x^*$: the limit satisfies $x^*(t) = f(t) + \int_0^t x^*(s)ds$.)