I just learned that a continuously differentiable function $f$ is strictly concave if $f(x+z)<f(x)+f'(x)z$ holds strictly for all $x$ and $z\neq 0$. So, this is a sufficient but not necessary condition. In other words, if $f$ is strictly concave, I cannot conclude that $f(x+z)<f(x)+f'(x)z$ holds strictly for all $x$ and $z\neq 0$. But I don't know why.
I just cannot come up with an example where a continuously differentiable, strictly concave $f$ satisfies $f(x+z)=f(x)+f'(x)z$ for some $z\neq 0$ at some $x$.
Could you please help me with this?
Thanks a lot!
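(For what it's worth, a quick sanity check with the strictly concave $f(x)=-x^2$, where $f'(x)=-2x$, shows the strict inequality holding for every $z\neq 0$:

```latex
f(x+z) = -(x+z)^2 = -x^2 - 2xz - z^2
       = f(x) + f'(x)\,z - z^2
       < f(x) + f'(x)\,z \qquad (z \neq 0),
```

since $z^2>0$ whenever $z\neq 0$. So at least for this example, equality never occurs.)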
Hi all,
I would like to thank Kavi for the very patient answer. I guess I misunderstood the theorem: it should be a necessary and sufficient condition, so we do not need the example.
Suppose equality holds with $z>0$. Let $S(y)=\frac{f(y)-f(x)}{y-x}$ for $x<y\leq x+z$. It is well known (and fairly easy to check from the definition of concavity) that $S$ is monotonically decreasing. Its limiting value as $y\to x^+$ is $f'(x)$, and we are assuming that its value at $y=x+z$ is also $f'(x)$. Hence $S$ is constant on $(x,x+z]$, which means $f(y)=f(x)+f'(x)(y-x)$ there, i.e. $f$ is affine on $[x,x+z]$. But that contradicts the fact that $f$ is strictly concave. A similar argument works when $z<0$. Hence the example you are looking for does not exist.
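A quick numerical illustration of this slope argument (just a sketch, using $f=\log$, which is strictly concave on $(0,\infty)$): the chord slope $S(y)$ is strictly decreasing in $y$ and stays strictly below $f'(x)$, so it can never climb back up to equal $f'(x)$ at $y=x+z$.

```python
import math

def f(x):
    # log is strictly concave on (0, inf)
    return math.log(x)

def fprime(x):
    # derivative of log
    return 1.0 / x

def S(x, y):
    """Slope of the chord from (x, f(x)) to (y, f(y))."""
    return (f(y) - f(x)) / (y - x)

x = 1.0
ys = [x + 0.1 * k for k in range(1, 11)]   # sample points y > x
slopes = [S(x, y) for y in ys]

# S is strictly decreasing in y ...
assert all(s1 > s2 for s1, s2 in zip(slopes, slopes[1:]))
# ... and strictly below f'(x) = 1, so equality at y = x + z is impossible
assert all(s < fprime(x) for s in slopes)
```

Of course this only checks one function at finitely many points; the proof above is what rules out equality in general.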