With the condition $$ 0\le f(y)-f(x)-\langle f'(x),y-x\rangle \le \frac{L}{2} \|x-y\|^2, $$ I want to prove that $f'$ is Lipschitz with constant $L$ and that $f$ is convex.
It is easy to see that $f$ is convex from the first inequality. From the second inequality I derived $$ \langle f'(y)-f'(x),y-x\rangle \le L \|y-x\|^2 $$ by exchanging the roles of $x$ and $y$ and adding the two inequalities. However, I don't know what to do next. Could anybody tell me?
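The derived inequality can be sanity-checked numerically. A minimal sketch, using a hypothetical example $f(x)=\frac12 x^\top A x$ with $A$ symmetric positive semidefinite, so $f'(x)=Ax$ and $L=\lambda_{\max}(A)$:

```python
import numpy as np

# Numerical check of the derived inequality
#   0 <= <f'(y) - f'(x), y - x> <= L * ||y - x||^2
# on a hypothetical example: f(x) = 0.5 * x^T A x with A symmetric
# positive semidefinite, so f'(x) = A x and L = lambda_max(A).
rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
A = M @ M.T                      # symmetric positive semidefinite
L = np.linalg.eigvalsh(A)[-1]    # largest eigenvalue of A

grad = lambda x: A @ x

for _ in range(1000):
    x, y = rng.standard_normal(n), rng.standard_normal(n)
    lhs = (grad(y) - grad(x)) @ (y - x)
    rhs = L * np.linalg.norm(y - x) ** 2
    assert 0 <= lhs <= rhs + 1e-9
print("inequality holds on all sampled pairs")
```

Here $\langle f'(y)-f'(x),y-x\rangle=(y-x)^\top A(y-x)$, which is nonnegative and bounded by $\lambda_{\max}(A)\|y-x\|^2$, matching both sides of the inequality.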
Thanks.
EDIT: As it stands, this answer only holds for the 1-dimensional case
So you have
$$ 0\le\langle f'(y)-f'(x),y-x\rangle \leq L \|y-x\|^2\\ \Leftrightarrow\\ 0\le[f'(y)-f'(x)](y-x) \leq L |y-x|^2, $$ and now you do a case study: case $1)$: $y>x$, and case $2)$: $y<x$.
In case $1)$ we get, by dividing by $(y-x)>0$, $$ 0\le[f'(y)-f'(x)](y-x) \leq L |y-x|^2 \\ \Leftrightarrow \\ 0\le f'(y)-f'(x) \leq L |y-x|. $$ Case $2)$ gives the same bound with the signs reversed (dividing by $(y-x)<0$ flips the inequalities), so in both cases $|f'(y)-f'(x)| \le L|y-x|$ and you are done.
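The 1-dimensional conclusion $|f'(y)-f'(x)|\le L|y-x|$ can also be checked numerically. A minimal sketch, using the hypothetical example $f(x)=\log(1+e^x)$, whose derivative is the sigmoid and whose second derivative is bounded by $\tfrac14$, so $L=\tfrac14$:

```python
import numpy as np

# 1-D check of |f'(y) - f'(x)| <= L * |y - x| for the hypothetical
# example f(x) = log(1 + e^x): here f'(x) = 1 / (1 + e^{-x}) (the
# sigmoid) and f''(x) = f'(x) * (1 - f'(x)) <= 1/4, so L = 1/4.
fprime = lambda x: 1.0 / (1.0 + np.exp(-x))
L = 0.25

rng = np.random.default_rng(1)
for _ in range(1000):
    x, y = rng.uniform(-10, 10, size=2)
    assert abs(fprime(y) - fprime(x)) <= L * abs(y - x) + 1e-12
print("Lipschitz bound holds on all sampled pairs")
```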
Best regards