Let $f$ be a non-negative function that satisfies
$$f(t) \leq K_1 + \varepsilon(t-a)+K_2 \int_a^t f(s)\,ds \quad \text{for } a\leq t \leq b,$$ where $K_1,\varepsilon \in \mathbb{R}$ and $K_2 > 0$. Prove:
$$f(t) \leq K_1 e^{K_2(t-a)}+\frac{\varepsilon}{K_2} \left(e^{K_2(t-a)}-1\right).$$
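As a sanity check on the statement (assuming I have copied it correctly): if equality held everywhere in the hypothesis, differentiating would give the linear initial value problem
$$f'(t) = \varepsilon + K_2 f(t), \qquad f(a) = K_1,$$
whose solution is exactly $K_1 e^{K_2(t-a)}+\frac{\varepsilon}{K_2}\left(e^{K_2(t-a)}-1\right)$, so the claimed bound should be sharp.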
My attempt:
Let $$U(t)=K_1 + \varepsilon(t-a)+K_2 \int_a^t f(s)ds, \quad U'(t)=\varepsilon t +K_2f(t)$$
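Two observations I rely on below: the hypothesis says precisely that $f(t)\leq U(t)$ for $a\leq t\leq b$, and $U(a)=K_1$.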
then $$U'(t) \leq \varepsilon t + K_2 U(t) \iff U'(t)-K_2U(t) \leq \varepsilon t.$$
Multiplying by the integrating factor $e^{-K_2(t-a)}$ and integrating,
$$\Big[e^{-K_2(t-a)}U(t)\Big]' \leq \varepsilon t\, e^{-K_2(t-a)} \implies e^{-K_2(t-a)}U(t) \leq \int \varepsilon t\, e^{-K_2(t-a)}\,dt + c = -\frac{\varepsilon}{K_2^2}(K_2t+1)\,e^{-K_2(t-a)}+c,$$
so
$$U(t) \leq -\frac{\varepsilon}{K_2^2}(K_2t+1)+c\,e^{K_2(t-a)} \iff \int_a^t f(s)\,ds \leq \frac{1}{K_2}\left(-K_1-\varepsilon(t-a)-\frac{\varepsilon}{K_2^2}(K_2t+1)+c\,e^{K_2(t-a)}\right),$$
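Even if I pin down the constant using $U(a)=K_1$ (evaluating the integrated inequality at $t=a$ forces $c = K_1+\frac{\varepsilon}{K_2^2}(K_2a+1)$, if I haven't slipped), I get
$$U(t) \leq K_1 e^{K_2(t-a)}-\frac{\varepsilon}{K_2^2}(K_2t+1)+\frac{\varepsilon}{K_2^2}(K_2a+1)\,e^{K_2(t-a)},$$
which still contains $\varepsilon t$-terms that have no counterpart in the bound I'm supposed to prove.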
but from here I can't derive the inequality that has to be proven. Is my $U(t)$ a bad choice, or is the mistake elsewhere in my computation?