Let $f:[a,b] \rightarrow [0,\infty)$, and assume that, for some constant $K$, $$f(t) \le K\int_a^t f(s)\, ds \qquad\text{for all $\,t \in [a,b]$.}$$
Let $U(t) = K\int_a^t f(s)\,ds$.
If $U(a) = 0$ and $U(t) \ge 0$ and $U(t)e^{-K(t-a)}$ is a decreasing function, then why is $f(t) = 0$?
This proof from our book seems quite sketchy, but maybe I am missing something obvious?
First, multiplying the hypothesis by $\exp(-Kt)>0$, we obtain $$ \exp(-Kt)\left(f(t)-K\int_a^t f(s)\,ds\right)\le 0. $$ By the product rule and the fundamental theorem of calculus, the left-hand side is exactly a derivative, so this is equivalent to $$ \left(\exp(-Kt)\int_a^t f(s)\,ds\right)'\le 0, $$ and integrating over $[a,t]$ we get $$ \exp(-Kt)\int_a^t f(s)\,ds-\exp(-Ka)\int_a^a f(s)\,ds\le 0. $$ The second term vanishes, and dividing by $\exp(-Kt)>0$ gives $$ \int_a^t f(s)\,ds\le 0, $$ for all $t\ge a$. But $f\ge 0$, so this integral is also $\ge 0$; hence $\int_a^t f(s)\,ds = 0$ for every $t$. The hypothesis then yields $0 \le f(t) \le K\int_a^t f(s)\,ds = 0$, and hence $f\equiv 0$.
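Not part of the proof, but if you want to convince yourself of the key step, here is a quick numerical sanity check of the product-rule identity $\left(e^{-Kt}\int_a^t f\right)' = e^{-Kt}\left(f(t)-K\int_a^t f\right)$. The choices of $f$, $K$, and the evaluation point below are mine, purely illustrative, and unrelated to the original problem (where only $f\equiv 0$ satisfies the hypothesis):

```python
import math

# Illustrative choices (not from the problem): any smooth f >= 0 works here.
K, a = 2.0, 0.0

def f(t):
    return math.sin(t) ** 2 + 1.0

def F(t, n=20000):
    """Composite trapezoid approximation of the integral of f from a to t."""
    h = (t - a) / n
    return h * (0.5 * (f(a) + f(t)) + sum(f(a + i * h) for i in range(1, n)))

def g(t):
    """g(t) = exp(-K t) * integral_a^t f(s) ds, the function shown to be decreasing."""
    return math.exp(-K * t) * F(t)

t, h = 1.3, 1e-5
lhs = (g(t + h) - g(t - h)) / (2 * h)            # central-difference derivative of g
rhs = math.exp(-K * t) * (f(t) - K * F(t))        # the claimed closed form of g'
print(abs(lhs - rhs))                             # should be very small
```

The two sides agree to within the discretization error, which is what the equivalence in the proof asserts pointwise.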