I am having some confusion about the calculation of the gradient.

My function is $f(X) = g(X) + \lambda\|X\|_1$, where $g(X)$ is convex and differentiable. I don't understand how the second expression, for the case $X_{ij} = 0$, is derived.
Well, I found the reason. It is the minimum-norm subgradient; see http://www.di.ens.fr/~mschmidt/Documents/2011_OPT_Chapter.pdf.
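
For completeness, here is how that expression presumably comes about (assuming the expression in question is the $X_{ij} = 0$ case of the minimum-norm subgradient). At $X_{ij} = 0$, the subdifferential of $\lambda|X_{ij}|$ is the interval $[-\lambda, \lambda]$, so every subgradient component of $f$ at that coordinate has the form
$$\nabla g(X)_{ij} + v, \qquad v \in [-\lambda, \lambda].$$
The minimum-norm element picks the $v$ that shrinks $\nabla g(X)_{ij}$ toward zero as far as the interval allows:
$$\min_{v \in [-\lambda, \lambda]} \big|\nabla g(X)_{ij} + v\big| \;\;\Longrightarrow\;\; \operatorname{sign}\!\big(\nabla g(X)_{ij}\big)\,\max\!\big(|\nabla g(X)_{ij}| - \lambda,\, 0\big).$$
In particular, the component is $0$ whenever $|\nabla g(X)_{ij}| \le \lambda$, which is exactly the optimality condition for a coordinate sitting at zero.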