Consider the integral $$\frac{1}{r}\int_{0}^{r}f(x)\,dx.$$ My observation, after testing several elementary functions, is that if $f(x)$ is a decreasing function, then $$f(r)\leq\frac{1}{r}\int_{0}^{r}f(x)\,dx\qquad\forall\, r>0.$$ (At $r=0$ the average is undefined, so we take $r$ strictly positive.) For instance, if $f(x)=e^{-x}$ on $[0,1]$, we have $$f(1)=\frac{1}{e}\approx 0.37,\qquad\frac{1}{1}\int_{0}^{1}e^{-x}\,dx=1-\frac{1}{e}\approx 0.63.$$
Is there a concrete way to prove the above inequality?
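Before proving it, a quick numerical sanity check (not a proof) is easy to run: compare $f(r)$ against a midpoint-rule approximation of the average for a few decreasing functions. The helper name `average` and the particular test functions below are my own choices:

```python
import math

def average(f, r, n=10000):
    """Midpoint Riemann sum approximating (1/r) * integral of f over [0, r]."""
    h = r / n
    return sum(f((i + 0.5) * h) for i in range(n)) * h / r

# A few decreasing functions, each paired with an endpoint r > 0.
examples = [
    (lambda x: math.exp(-x), 1.0),   # e^{-x} on [0, 1]
    (lambda x: 1 / (1 + x), 2.0),    # 1/(1+x) on [0, 2]
    (lambda x: -x, 3.0),             # -x on [0, 3]
]

for f, r in examples:
    avg = average(f, r)
    # The claimed inequality: f(r) <= average of f on [0, r].
    print(f"f(r) = {f(r):.4f}  <=  average = {avg:.4f}  ->  {f(r) <= avg}")
```

For the first example this reproduces the numbers in the question: $f(1)\approx 0.3679$ against an average of $\approx 0.6321$.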
Hint:
As the function is decreasing, $f(r)$ is the minimum value of the function on $[0,r]$. The integral you have proposed is the average value of the function on that interval, and the average value of a function can never be less than its minimum. This is exactly the inequality you have written.
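Writing the hint out as a short formal argument (my own formalization of the reasoning above):

```latex
Since $f$ is decreasing, $f(x)\geq f(r)$ for every $x\in[0,r]$. Integrating
this pointwise bound over $[0,r]$ gives
\[
  \int_{0}^{r} f(x)\,dx \;\geq\; \int_{0}^{r} f(r)\,dx \;=\; r\,f(r),
\]
and dividing both sides by $r>0$ yields
\[
  \frac{1}{r}\int_{0}^{r} f(x)\,dx \;\geq\; f(r).
\]
```

Note that only monotonicity (in fact, only $f(x)\geq f(r)$ on $[0,r]$) is used; continuity is not required beyond integrability.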