Consider the Banach space $C[0,1]$ with the uniform norm and the operator given by $(Tf)(x)=x_0+\int_0^xf(t)(1-f(t))\,dt$ for $x\in[0,1]$ and some fixed $x_0\geq0$.
I want to show that there is a number $0<L<1$ such that $|(Tf-Tg)(x)|\le L\lVert f-g\rVert_{\infty}$ for all $f,g\in C[0,1]$.
I have done the following:
$|(Tf-Tg)(x)|=\left|\int_0^x \bigl[f(t)(1-f(t)) - g(t)(1-g(t))\bigr]\,dt\right|$,
I am stuck at this point, since I do not know how to find a good estimate for the integrand.
Can someone help?
What you want to show is not true.
To see this, note that the claimed inequality would imply (by taking $g \equiv 0$, so that $Tg \equiv x_0$) that $|Tf(x) - x_0| \le L\lVert f\rVert_\infty$, i.e. that $|Tf(x)|$ grows at most linearly in $\lVert f\rVert_\infty$. But the integrand is quadratic in $f$, so this cannot hold: take $f \equiv c$ for large $c$.
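To spell out the counterexample: take $g \equiv 0$ and the constant function $f \equiv c$ with $c > 0$, and evaluate at $x = 1$. Then

```latex
(Tf - Tg)(1) = \int_0^1 c(1-c)\,dt = c(1-c),
\qquad
\lVert f - g\rVert_\infty = c,
```

so the claimed inequality would force $|c(1-c)| \le L c$, i.e. $|1-c| \le L < 1$ for every $c > 0$, which already fails at $c = 2$. (The estimate *does* hold if you restrict $T$ to a bounded set, e.g. a closed ball in $C[0,1]$, on which $u \mapsto u(1-u)$ is Lipschitz; but not on all of $C[0,1]$.)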