Inequality involving average value of a function


Consider the integral $$\frac{1}{r}\int_{0}^{r}f(x)dx.$$ My observation, after testing several elementary functions, is that if $f(x)$ is a decreasing function, then $$f(r)\leq\frac{1}{r}\int_{0}^{r}f(x)dx\quad\forall~r>0.$$ For instance, if $f(x)=e^{-x}$ on $[0,1]$, we have $$f(1)=\frac{1}{e}\approx 0.37,\quad\frac{1}{1}\int_{0}^{1}e^{-x}dx=1-\frac{1}{e}\approx 0.63.$$

Is there a concrete way to prove the above inequality?
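Before proving it, the observation is easy to sanity-check numerically. Below is a minimal sketch (the helper `average` and the sample functions are my own illustrative choices, not from the question) that approximates $\frac{1}{r}\int_0^r f(x)\,dx$ with a midpoint rule and compares it against $f(r)$ for a few decreasing functions:

```python
import math

def average(f, r, n=10_000):
    """Approximate (1/r) * integral of f over [0, r] with the midpoint rule."""
    h = r / n
    return sum(f((i + 0.5) * h) for i in range(n)) * h / r

# A few decreasing functions (illustrative choices) and right endpoints r:
examples = [(lambda x: math.exp(-x), 1.0),
            (lambda x: 1 / (1 + x), 3.0),
            (lambda x: 5 - 2 * x, 2.0)]

for f, r in examples:
    avg = average(f, r)
    print(f"f(r) = {f(r):.4f}  <=  average = {avg:.4f}")
    assert f(r) <= avg
```

For $f(x)=e^{-x}$ on $[0,1]$ this reproduces the values quoted in the question: $f(1)\approx 0.37$ and an average of about $0.63$.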



Hint:

As the function is decreasing, $f(r)$ is the minimum value of the function on $[0,r]$. The integral you have proposed is the average value of the function on that interval, and the average value of a function can never be less than its least value. This is exactly the inequality you have written.
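The hint turns into a two-line proof by integrating the pointwise inequality:

$$\text{Since } f \text{ is decreasing, } f(x)\ge f(r) \text{ for every } x\in[0,r]. \text{ Integrating over } [0,r],$$
$$\int_0^r f(x)\,dx \;\ge\; \int_0^r f(r)\,dx \;=\; r\,f(r),$$
$$\text{and dividing by } r>0 \text{ gives } \frac{1}{r}\int_0^r f(x)\,dx \;\ge\; f(r).$$

Note this argument needs only that $f$ is integrable and decreasing, with no differentiability assumption.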


The mean value theorem for integrals states that, for continuous $f$, $$\frac1{b-a}\int_a^bf(x)dx=f(c)\quad\text{for some }c\in(a,b).$$ Notice that if $f$ is a decreasing function, i.e. $$f'(x)\le0\quad\forall x,$$ then for any point $x<b$ we have $f(x)\ge f(b)$. Letting $a=0$, $b=r$ we get $$\frac1r\int_0^rf(x)dx=f(c)\quad\text{for some }c\in(0,r).$$ Since $c<r$ and $f$ is decreasing, $f(c)\ge f(r)$, which is your inequality. The extreme (equality) case is a constant function; this is why the interval for $c$ is taken to be open, since $c\ne0,r$ whenever $f$ is strictly decreasing at any point and is nowhere increasing.