How to work with subdifferentials in convex optimization problems?


I am trying to build a basic algorithm for convex optimization that can handle non-differentiable functions, but I have a doubt. Everywhere I read, the idea is the same: work with subdifferentials, so instead of testing whether the gradient of the function is close to zero at a point $x_1$, one should test whether the value $0$ is contained in the subdifferential of the function at that point.
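To make the setting concrete, here is a minimal sketch of a subgradient method for $f(x) = \vert x\vert$ (function names and the diminishing step size $1/k$ are my own illustrative choices, not a standard API):

```python
def subgradient_abs(x):
    """Return one element of the subdifferential of f(x) = |x|.

    For x != 0 the subdifferential is the singleton {sign(x)};
    at x = 0 it is the interval [-1, 1], so 0.0 is a valid choice.
    """
    if x > 0:
        return 1.0
    if x < 0:
        return -1.0
    return 0.0

def subgradient_method(x0, steps=200):
    """Iterate x_{k+1} = x_k - (1/k) * g_k with g_k a subgradient at x_k."""
    x = x0
    for k in range(1, steps + 1):
        x -= (1.0 / k) * subgradient_abs(x)  # diminishing step size
    return x

print(subgradient_method(5.0))  # approaches the minimizer x* = 0
```

Note that, unlike gradient descent, the subgradient step need not decrease $f$ at every iteration; convergence relies on the diminishing step sizes.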

I get the mathematical concept, but I have no idea how to code it. For differentiable functions, the stopping criterion is to check whether the gradient is close to $0$, but for a function such as the absolute value $\vert x\vert$, I don't know how to define the stopping criterion.
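For the specific case of $\vert x\vert$, the subdifferential is known in closed form, so the membership test $0 \in \partial f(x)$ can be coded directly as an interval check. A sketch (the tolerance `tol` and function names are hypothetical, chosen to mimic the usual "gradient close to zero" test numerically):

```python
def subdifferential_abs(x, tol=1e-8):
    """Subdifferential of f(x) = |x| as an interval (lo, hi).

    {-1} for x < 0, {+1} for x > 0, and [-1, 1] at x = 0;
    |x| <= tol is treated as x == 0 for numerical purposes.
    """
    if x > tol:
        return (1.0, 1.0)
    if x < -tol:
        return (-1.0, -1.0)
    return (-1.0, 1.0)

def is_stationary(x, tol=1e-8):
    """Stopping test: is 0 contained in the subdifferential at x?"""
    lo, hi = subdifferential_abs(x, tol)
    return lo <= 0.0 <= hi

print(is_stationary(0.0))   # True:  0 is in [-1, 1]
print(is_stationary(0.5))   # False: subdifferential is {1}
```

This only works because $\partial\vert x\vert$ is available analytically; for a general convex function one would instead rely on a criterion that does not require the full subdifferential, such as a small step size combined with tracking the best objective value seen so far.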