Prove that $f(t)$ satisfies the following inequality


Let $f(t)$ be a non-negative function which satisfies $$f(t) \leq K_1 + \varepsilon (t-a)+K_2 \int_a^t f(s)\, ds, \qquad a\leq t \leq b, \quad \varepsilon, K_1, K_2 >0.$$ Prove that $$ f(t) \leq K_1 e^{K_2(t-a)}+\frac{\varepsilon}{K_2}\left(e^{K_2(t-a)}-1\right).$$

Attempt:

If we set $u=t-a$, the first inequality can be rewritten as $$ f(u) \leq g(u)+K_2\int_0^{u+a} f(s)\, ds, \quad u\in[0,b-a], $$ where $g(u)=K_1+\varepsilon u$. My suggestion is to use Grönwall's Lemma, which states:

Let $\varphi(t)$ be a continuous function on $[0,T]$. Suppose that there exist $k>0$ and another continuous function $f(t)$ such that:

$$ \varphi(t) \leq f(t) + k\int_0^t \varphi(u)\,du \quad \forall t \in [0,T].$$

Then: $$ \varphi(t) \leq f(t) + k\int_0^t f(\tau)\,e^{k(t-\tau)}\,d\tau \quad \forall t \in [0,T].$$

However, the definite integral we obtained has upper limit $u+a$ instead of $u$, so the lemma cannot be applied directly.

Is there anything we can do to transform it?
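(As a numerical sanity check on the bound to be proved, not a substitute for the proof: equality in the hypothesis differentiates to the ODE $f'(t)=\varepsilon+K_2 f(t)$, $f(a)=K_1$, whose exact solution is precisely the claimed bound, so a numerical integration of that ODE should reproduce it. The constants below are arbitrary sample values.)

```python
import math

# Extremal case of the hypothesis: equality
#   f(t) = K1 + eps*(t - a) + K2 * integral_a^t f(s) ds,
# which differentiates to f'(t) = eps + K2*f(t) with f(a) = K1.
# Its exact solution is K1*e^{K2(t-a)} + (eps/K2)*(e^{K2(t-a)} - 1),
# i.e. the right-hand side of the inequality to be proved.

K1, K2, eps = 2.0, 0.5, 0.3   # sample positive constants (assumptions)
a, b = 0.0, 2.0

def bound(t):
    """Right-hand side of the claimed inequality."""
    g = math.exp(K2 * (t - a))
    return K1 * g + (eps / K2) * (g - 1.0)

# Forward Euler integration of f' = eps + K2*f, f(a) = K1.
n = 200_000
h = (b - a) / n
f = K1
for _ in range(n):
    f += h * (eps + K2 * f)

print(abs(f - bound(b)))  # discrepancy is O(h), i.e. tiny
```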

On BEST ANSWER

Simple calculation error: your integral before setting $u=t-a$ is over an interval of length $t-a$, so after the change of variables $v=s-a$ (a determinant-$1$ transformation) you should still have an integral over an interval of length $t-a=u$.

Let $F(u) = f(u+a) = f(t)$. Then $$ \int_{s=a}^{s=t} f(s) ds = \int_{s-a=0}^{s-a=u} f(s-a+a) ds \overset{v=s-a}= \int_0^u f(v+a) dv = \int_0^u F(v) dv$$ So for $t-a=u \in [0,b-a]$, the inequality reads

$$ F(u) \le K_1 + \varepsilon u + K_2 \int_a^t f(s)\, ds \\ \iff F(u) \le K_1 + \varepsilon u + K_2 \int_0^u F(v)\, dv $$
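From here, a sketch of how the lemma finishes the argument (using the standard form of Grönwall's lemma, with the factor $e^{k(t-\tau)}$ in the integrand): apply it to $\varphi = F$ with comparison function $K_1+\varepsilon u$ and $k=K_2$, then evaluate the resulting integral.

```latex
% Gronwall's lemma applied to F with k = K_2:
\[
F(u) \;\le\; K_1 + \varepsilon u
  + K_2\int_0^u (K_1 + \varepsilon\tau)\,e^{K_2(u-\tau)}\,d\tau .
\]
% The K_1 term integrates directly; the eps*tau term yields, after
% integration by parts:
\[
K_2\int_0^u K_1 e^{K_2(u-\tau)}\,d\tau = K_1\bigl(e^{K_2 u}-1\bigr),
\qquad
K_2\int_0^u \varepsilon\tau\, e^{K_2(u-\tau)}\,d\tau
  = -\varepsilon u + \frac{\varepsilon}{K_2}\bigl(e^{K_2 u}-1\bigr).
\]
% Summing, the K_1 terms and the eps*u terms combine to give
\[
F(u) \;\le\; K_1 e^{K_2 u} + \frac{\varepsilon}{K_2}\bigl(e^{K_2 u}-1\bigr),
\]
% and substituting u = t - a (so F(u) = f(t)) yields the claimed bound.
```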