A function $ f $ on an interval $ I $ is said to be convex if for every $ 0 < t < 1 $ and $ x,y \in I $ we have $ f(tx + (1-t)y) \le tf(x) + (1-t)f(y)$.
Prove that $ f $ is convex if and only if for every $ a < c < b $ in $ I $ we have $ \frac{f(c)-f(a)}{c-a} \le \frac{f(b)-f(c)}{b-c}$.
After some failed attempts with messy algebraic manipulations, I managed to come up with a more geometric proof of the "only if" part, but I can't figure out the other direction.
Any hints on proving the "if" part and, possibly, the "only if" part by algebraic manipulations on the inequalities?
$$ a<c<b \Leftrightarrow c=t\cdot a +(1-t)\cdot b \text{ where } t=\frac{b-c}{b-a}\in(0,1) $$
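To check this substitution (a quick verification, using only the formula for $ t $ above): since $ t=\frac{b-c}{b-a} $ we get $ 1-t=\frac{c-a}{b-a} $, hence

$$ t\cdot a+(1-t)\cdot b=\frac{a(b-c)+b(c-a)}{b-a}=\frac{c(b-a)}{b-a}=c, $$

and $ a<c<b $ forces $ 0<t<1 $.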
$$ \dfrac{f(t\cdot a +(1-t)\cdot b)-f(a)}{(1-t)(b-a)}\le \dfrac{f(b)-f(t\cdot a +(1-t)\cdot b)}{t(b-a)} $$
The rest is straightforward manipulation, each step an "if and only if", until you reach the convexity definition.
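For the record, here is one way those manipulations can go (a sketch; every step is reversible since $ b-a>0 $ and $ 0<t<1 $). Writing $ c=t\cdot a+(1-t)\cdot b $ and multiplying both sides of the displayed inequality by $ t(1-t)(b-a)>0 $ gives

$$ t\bigl(f(c)-f(a)\bigr)\le(1-t)\bigl(f(b)-f(c)\bigr). $$

Collecting the $ f(c) $ terms on the left,

$$ f(t\cdot a+(1-t)\cdot b)\le t\,f(a)+(1-t)\,f(b), $$

which is exactly the definition of convexity (with $ x=a $, $ y=b $).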