A problem on a Volterra integral equation.


How can I prove the following?

Consider the Volterra integral equation $$x(t)=g(t)+\int_{[a,t]}f(s,x(s))\,ds, \quad t \in [a,b],$$ where $g \in C([a,b];\mathbb{R}^{n})$, $f:[a,b]\times \mathbb{R}^{n} \to \mathbb{R}^{n}$ is continuous, and $$\|f(s,x)-f(s,y)\|\leq L\|x-y\|, \quad \forall s \in [a,b],\ x,y \in \mathbb{R}^{n}.$$ Prove that there is a unique solution $x\in C([a,b];\mathbb{R}^{n})$.

My attempt: Consider the sequence of functions $x_{n}: [a,b] \to \mathbb{R}^{n}$, $n=0,1,\ldots,$ defined iteratively by $$x_{0}(t)=g(t), \quad \forall t \in [a,b],$$ $$x_{n+1}(t)=g(t)+\int_{[a,t]}f(s,x_{n}(s))\,ds, \quad t \in [a,b], \quad n=0,1,\ldots$$ Now I need to prove that the sequence $(x_{n})$ converges uniformly.
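One standard way to continue (a sketch of the usual Picard-iteration estimate, not necessarily the intended route): let $M=\max_{t\in[a,b]}\|x_{1}(t)-x_{0}(t)\|$ and prove by induction on $n$ that $$\|x_{n+1}(t)-x_{n}(t)\|\leq M\,\frac{L^{n}(t-a)^{n}}{n!}, \quad t \in [a,b].$$ The inductive step follows from the Lipschitz condition: $$\|x_{n+1}(t)-x_{n}(t)\|\leq \int_{a}^{t}\|f(s,x_{n}(s))-f(s,x_{n-1}(s))\|\,ds \leq L\int_{a}^{t}M\,\frac{L^{n-1}(s-a)^{n-1}}{(n-1)!}\,ds = M\,\frac{L^{n}(t-a)^{n}}{n!}.$$ Therefore $$\sum_{n=0}^{\infty}\max_{t\in[a,b]}\|x_{n+1}(t)-x_{n}(t)\| \leq M\sum_{n=0}^{\infty}\frac{L^{n}(b-a)^{n}}{n!} = M\,e^{L(b-a)} < \infty,$$ so by the Weierstrass M-test the telescoping series $x_{0}+\sum_{n\geq 0}(x_{n+1}-x_{n})$ converges uniformly on $[a,b]$ to some $x\in C([a,b];\mathbb{R}^{n})$. Passing to the limit in the iteration (using uniform convergence and continuity of $f$) shows $x$ solves the integral equation; the same estimate, or Grönwall's inequality, gives uniqueness.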

But I don't know how to continue from there. Initially I thought that this was not a Volterra equation but rather a nonlinear inhomogeneous Fredholm integral equation, in which case existence holds only under additional assumptions (not for every $L$), but I am not sure.
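As a sanity check, the Picard scheme from the attempt can also be run numerically. The sketch below (names `picard_solve`, the grid size, and the test case $g\equiv 1$, $f(s,x)=x$ on $[0,1]$ are my own choices, not from the question) iterates $x_{n+1}=g+\int_a^t f(s,x_n(s))\,ds$ with a cumulative trapezoidal rule; for that test case the exact solution is $x(t)=e^{t}$.

```python
import numpy as np

def picard_solve(g, f, a, b, n_grid=200, n_iter=30):
    """Picard iteration for x(t) = g(t) + int_a^t f(s, x(s)) ds
    on a uniform grid, using a cumulative trapezoidal rule."""
    t = np.linspace(a, b, n_grid)
    x = g(t)  # x_0 = g
    for _ in range(n_iter):
        integrand = f(t, x)
        h = np.diff(t)
        # cumulative trapezoid: approximates int_a^{t[k]} f(s, x_n(s)) ds
        cum = np.concatenate(
            [[0.0], np.cumsum(h * (integrand[:-1] + integrand[1:]) / 2)]
        )
        x = g(t) + cum  # x_{n+1} = g + int_a^t f(s, x_n(s)) ds
    return t, x

# Test case: g(t) = 1, f(s, x) = x on [0, 1]; exact solution exp(t)
t, x = picard_solve(lambda t: np.ones_like(t), lambda s, x: x, 0.0, 1.0)
print(np.max(np.abs(x - np.exp(t))))
```

The printed maximum error is small (of the order of the trapezoidal discretization error), consistent with the uniform convergence the proof is after.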