Suppose that the average value of $f$ on every interval $[a,b]$ is equal to $f((a+b)/2)$. Prove that $f''(x) = 0$ for all $x \in \mathbb{R}$.


I understand that $f(x)$ must be linear with a first derivative equal to a constant. I'm just not sure how I can use the mean value property of integrals to show something about $f''(x)$. The hint on this question is to use the fundamental theorem of calculus or Jensen's inequality.
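For intuition, the hypothesis can be checked symbolically: affine functions have this midpoint-average property, while for instance $x^2$ does not (a quick sketch using SymPy; the helper name `midpoint_property_residual` is mine):

```python
import sympy as sp

x, a, b = sp.symbols('x a b', real=True)

def midpoint_property_residual(expr):
    """Average of expr over [a, b] minus its value at the midpoint."""
    avg = sp.integrate(expr, (x, a, b)) / (b - a)
    return sp.simplify(avg - expr.subs(x, (a + b) / 2))

print(midpoint_property_residual(3*x + 5))  # 0: affine functions satisfy the property
print(midpoint_property_residual(x**2))     # nonzero: x**2 fails it
```

For $x^2$ the residual works out to $(a-b)^2/12$, which vanishes only when $a = b$, so the property really does rule out any curvature.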



BEST ANSWER

Assuming $f$ is integrable and twice differentiable (otherwise neither your statement about the average value nor your final statement makes sense*), the hypothesis reads $$\int_a^bf(x)\,\mathrm dx=(b-a)f\left(\frac{a+b}{2}\right)$$

Differentiate both sides w.r.t. $b$, using the Leibniz integral rule (a consequence of the fundamental theorem of calculus) for the LHS:

$$f(b)=\frac{b}{2}f'\left(\frac{a+b}{2}\right)+f\left(\frac{a+b}{2}\right)-\frac{a}{2}f'\left(\frac{a+b}{2}\right)$$

Now set $b=0$ and $a=2x$, so that $\frac{a+b}{2}=x$:

$$f(0)=f(x)-xf'(x)$$

Differentiate both sides w.r.t $x$:

$$0=f'(x)-f'(x)-xf''(x)$$

so $f''(x)=0$ for all $x\neq 0$.

Thus $f$ is linear away from $0$: $f'$ is constant on $(0,\infty)$ and constant on $(-\infty,0)$. Since $f''(0)$ exists, $f'$ is continuous at $0$, so $f'(0)$ must equal both of these constants; hence $f'$ is constant on all of $\mathbb{R}$ and $f''(0)=0$.

*I'm not sure if you're able to prove that $f$ has to be twice differentiable.
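As a cross-check on the intermediate step, the relation $f(0)=f(x)-xf'(x)$ can be handed to a symbolic ODE solver, which confirms that its solutions are exactly the linear (affine) functions (a sketch using SymPy, writing $f(0)$ as a constant $c$):

```python
import sympy as sp

x, c = sp.symbols('x c', real=True)
f = sp.Function('f')

# f(0) = f(x) - x f'(x), with f(0) written as the constant c,
# is a first-order linear ODE; dsolve returns its general solution.
sol = sp.dsolve(sp.Eq(f(x) - x*f(x).diff(x), c), f(x))
print(sol)  # the general solution is affine in x
```

This recovers the same conclusion without differentiating a second time, so it sidesteps the issue at $x = 0$.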

ANOTHER ANSWER

I will assume that $f$ is locally integrable, so that the averages in the hypothesis make sense. Our aim is to prove that $f$ is linear, which is enough to conclude that $f$ is twice differentiable with $f'' \equiv 0$.

Let $x < y$ and $0 < \lambda < 1$ be arbitrary. Set

$$ c = \lambda x + (1-\lambda) y, \qquad a = 2x - c, \qquad b = 2y - c. $$

Then $a < x < c < y < b$ and

$$ \frac{a+b}{2} = (1-\lambda)x + \lambda y, \qquad \frac{a+c}{2} = x, \qquad \frac{c+b}{2} = y. $$

So, applying the hypothesis on each of the intervals $[a,b]$, $[a,c]$, and $[c,b]$, it follows that

\begin{align*} f((1-\lambda)x+\lambda y) &= \frac{\int_{a}^{b} f(t) \, \mathrm{d}t}{b-a} \\ &= \frac{\int_{a}^{c} f(t) \, \mathrm{d}t + \int_{c}^{b} f(t) \, \mathrm{d}t}{b-a} \\ &= \frac{(c-a)f(x) + (b-c)f(y)}{b-a} \\ &= (1-\lambda) f(x) + \lambda f(y). \end{align*}

Since $x < y$ and $\lambda \in (0,1)$ were arbitrary, $f$ agrees on every interval with the linear interpolant of its endpoint values. This proves that $f$ is linear.
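The bookkeeping in this construction is easy to verify symbolically (a sketch using SymPy; the symbols mirror the $x$, $y$, $\lambda$ above):

```python
import sympy as sp

x, y, lam = sp.symbols('x y lam', real=True)

# The construction from the proof.
c = lam*x + (1 - lam)*y
a = 2*x - c
b = 2*y - c

# Midpoint identities used to apply the hypothesis on [a,b], [a,c], [c,b].
assert sp.simplify((a + b)/2 - ((1 - lam)*x + lam*y)) == 0
assert sp.simplify((a + c)/2 - x) == 0
assert sp.simplify((c + b)/2 - y) == 0

# Coefficients in the final convex combination.
assert sp.simplify((c - a)/(b - a) - (1 - lam)) == 0
assert sp.simplify((b - c)/(b - a) - lam) == 0
print("all identities verified")
```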