Are there convex functions for which the Hessian is not defined, but the gradient is defined everywhere?
I was looking at projected gradient descent, as well as Newton's method, for solving optimization problems with linear constraints. To justify using projected gradient descent despite its problems with ill-conditioning, I was looking for cases where a function might be convex and differentiable but not have a Hessian defined everywhere.
Let $f$ be any nondecreasing and continuous function on $\mathbb R$ that has points of nondifferentiability. Then $F(x)=\int_0^x f$ is an example: $F$ is convex because its derivative $f$ (guaranteed by the fundamental theorem of calculus, since $f$ is continuous) is nondecreasing, yet $F''=f'$ fails to exist wherever $f$ is not differentiable. You can get pretty wild with this $f;$ for example there is one whose corresponding $F$ fails to be twice differentiable at each rational.
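A minimal concrete instance of this construction (my own choice of $f$, not part of the answer above): take $f(x)=\max(x,0)$, which is nondecreasing, continuous, and not differentiable at $0$. Then $F(x)=\max(x,0)^2/2$ is convex and differentiable everywhere with $F'=f$, but the second derivative (the 1-D Hessian) does not exist at $0$, as a quick numerical sketch shows:

```python
def f(x):
    # Nondecreasing, continuous, not differentiable at 0.
    return max(x, 0.0)

def F(x):
    # F(x) = integral of f from 0 to x = max(x, 0)^2 / 2.
    return max(x, 0.0) ** 2 / 2.0

h = 1e-6

# F is differentiable at 0 with F'(0) = f(0) = 0:
central = (F(h) - F(-h)) / (2 * h)
print(central)  # close to 0

# But the one-sided difference quotients of F' = f disagree at 0,
# so F''(0) does not exist:
right = (f(h) - f(0)) / h
left = (f(-h) - f(0)) / (-h)
print(right, left)  # right-hand slope 1, left-hand slope 0
```

The same check works for any nondecreasing continuous $f$ with a kink: the gradient of $F$ is globally defined, but the curvature jumps at the kink.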