Let $f(x)$ be a twice differentiable function such that $f(1)=f(0.5)=f(0)=0$ and $f''(x)\neq -\infty$ for all $x\in [0,1]$. Then $f(x)$ cannot be strictly concave on the interval $[0,1]$.
First, remark that some candidate functions develop a singularity at a point of $[0,1]$, which is why the hypothesis on $f''$ matters. For example, the function $f(x)=-e^{\frac{-1}{x(1-x)\ln(2x)}}$ is not a counterexample because $f''(0.5)=-\infty$.
Furthermore, I have tried to apply Wirtinger's inequality, which states:
$$\pi^2\int_{0}^{a}|f(x)|^2dx\leq a^2\int_{0}^{a}|f'(x)|^2dx$$
with $f(0)=f(a)=0$ and some regularity conditions on $f(x)$. The idea is that knowing the behavior of $f'(x)$ constrains the behavior of $f''(x)$.
I recognize there is still some work left to turn this into a proof.
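As a sanity check, the Wirtinger-type inequality above can be verified numerically; the two test functions below are my own choices, not part of the original argument:

```python
# Numerical check of pi^2 * int_0^a |f|^2 dx <= a^2 * int_0^a |f'|^2 dx
# for functions with f(0) = f(a) = 0.
import numpy as np
from scipy.integrate import quad

def wirtinger_holds(f, fprime, a):
    lhs = np.pi**2 * quad(lambda x: f(x)**2, 0, a)[0]
    rhs = a**2 * quad(lambda x: fprime(x)**2, 0, a)[0]
    return lhs <= rhs + 1e-9  # small tolerance for quadrature error

a = 1.0
# Equality case: f(x) = sin(pi x / a)
print(wirtinger_holds(lambda x: np.sin(np.pi*x/a),
                      lambda x: (np.pi/a)*np.cos(np.pi*x/a), a))  # True
# Strict case: f(x) = x(a - x)
print(wirtinger_holds(lambda x: x*(a - x),
                      lambda x: a - 2*x, a))  # True
```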
Finally I have tried to apply the following statement (p.5 in this paper):
$$\frac{f(a)+f(b)}{2}-\frac{1}{b-a}\int_{a}^{b}f(x)dx=\frac{(b-a)^2}{16}\int_{0}^{1}(1-t^2)\Big(f''\Big(\frac{1+t}{2}a+\frac{1-t}{2}b\Big)+f''\Big(\frac{1-t}{2}a+\frac{1+t}{2}b\Big)\Big)dt$$
Unfortunately, this was without success.
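The identity itself does check out numerically; here is a quick verification with the concrete choice $f(x)=x^4$ on $[a,b]=[0,1]$ (my own test case):

```python
# Numerical check of the quoted identity for f(x) = x^4, f''(x) = 12x^2,
# on the interval [a, b] = [0, 1].
from scipy.integrate import quad

f  = lambda x: x**4
f2 = lambda x: 12*x**2   # second derivative of f
a, b = 0.0, 1.0

lhs = (f(a) + f(b))/2 - quad(f, a, b)[0]/(b - a)
rhs = (b - a)**2/16 * quad(
    lambda t: (1 - t**2)*(f2((1 + t)/2*a + (1 - t)/2*b)
                          + f2((1 - t)/2*a + (1 + t)/2*b)),
    0, 1)[0]
print(lhs, rhs)  # both sides equal 0.3
assert abs(lhs - rhs) < 1e-9
```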
Despite all this material, I believe there exists a simple proof of this fact.
So how can one prove it?
Any help is highly appreciated.
Thanks a lot for all your contributions.
I was thinking something along the lines of using Rolle's Theorem. Since $f$ is differentiable and $f(0) = f(0.5) = f(1) = 0$, by Rolle's Theorem $\exists c \in (0, 0.5)$ and $d \in (0.5,1)$ such that $f'(c) = f'(d) = 0$. Now, a differentiable function is strictly concave on $(0,1)$ if and only if $f'$ is strictly decreasing there, and a strictly decreasing function cannot vanish at two distinct points $c < d$. So $f$ cannot be strictly concave on $[0,1]$. (Note that applying Rolle again to $f'$ to get $e \in (c,d)$ with $f''(e) = 0$ is not quite enough on its own: strict concavity does not force $f''(x) < 0$ everywhere, as $f(x) = -x^4$ shows.)
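To see the argument in action, take the concrete (non-concave) function $f(x)=x(x-\tfrac12)(x-1)$, which vanishes at $0$, $\tfrac12$, $1$; this test function is my own choice:

```python
# f(x) = x(x - 1/2)(x - 1) = x^3 - 1.5x^2 + 0.5x vanishes at 0, 1/2, 1.
# Rolle guarantees a critical point in each of (0, 1/2) and (1/2, 1).
import numpy as np

fprime = lambda x: 3*x**2 - 3*x + 0.5   # f'(x)
c, d = sorted(np.roots([3, -3, 0.5]).real)  # the two roots of f'

assert 0 < c < 0.5 < d < 1
# f' vanishes at both c and d, so f' is not strictly decreasing on (0, 1),
# hence f is not strictly concave there.
print(round(c, 4), round(d, 4))  # 0.2113 0.7887
```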
Hope this helps.