I am learning about lasso regression and ran into taking the gradient of the penalty at $0$. I came across subgradients, but I could not understand what the subgradient's value is at $0$.
In lasso regression we use $\mathcal{L}_1$-regularisation, which means adding the absolute values of the weights (coefficients) to the cost. So, in order to run the gradient descent algorithm, we have to take gradients. The gradient of the absolute value function does not exist at $0$. I then read about something called subgradients, but could not understand much about them. For values less than $0$ the subgradient is $-1$, and for values greater than $0$ it is $+1$. But what is the value of the subgradient at $0$ for the function $|x|$?
The function $f(x) = |x|$ has subtangents at $x=0$ with slopes between $-1$ and $1$. Hence, $m$ is a subgradient at $x_0=0$ if $|m|\leq 1$.
To derive this, start from the definition of a subgradient at $x_0$: $m(x-x_0) \leq f(x)-f(x_0)$ for all $x\in \mathbb{R}$.
$$m(x-x_0)\leq f(x)-f(x_0)$$ With $x_0=0$ and $f(x_0)=|0|=0$ this becomes $$mx \leq |x|.$$ $$\text{for } x>0 \text{: } m\leq \tfrac{|x|}{x}= 1 $$ $$\text{for } x<0 \text{: } m\geq \tfrac{|x|}{x}=-1 \quad (\text{dividing by } x<0 \text{ flips the inequality}) $$ $$\text{for } x=0 \text{: } 0 \leq 0 $$ $$\implies m \in [-1,1].$$
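The inequality can be spot-checked numerically. This is not a proof, just a sanity check on a grid of test points; the slopes tested and the grid are arbitrary choices:

```python
# Numeric spot-check of the subgradient inequality
# m*(x - x0) <= f(x) - f(x0) for f(x) = |x| at x0 = 0.

def is_subgradient(m, x0=0.0, xs=None):
    """Check m*(x - x0) <= |x| - |x0| on a grid of test points."""
    if xs is None:
        xs = [i / 10.0 for i in range(-50, 51)]  # grid on [-5, 5]
    # Small tolerance guards against floating-point round-off.
    return all(m * (x - x0) <= abs(x) - abs(x0) + 1e-12 for x in xs)

for m in [-1.5, -1.0, -0.3, 0.0, 0.7, 1.0, 1.5]:
    print(m, is_subgradient(m))
```

Slopes with $|m|\leq 1$ pass, while $m=\pm 1.5$ fail (the inequality breaks at points far from $0$).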
Conversely, we can start from $|m| \leq 1$ and show that every such $m$ is a subgradient by verifying the definition $m(x-x_0) \leq f(x)-f(x_0)$. Assume $|m| \leq 1$; then
$$m(x-x_0)=m(x-0)=mx \leq |mx| = |m||x|\leq 1\cdot |x| =|x|-|0|=f(x)-f(x_0)$$ $$\implies m(x-x_0) \leq f(x) -f(x_0),$$ since $x_0 = 0$.
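To connect this back to your gradient-descent question: any $m \in [-1,1]$ is a valid subgradient at $0$, and in practice the common choice is $m = 0$ (i.e. use $\operatorname{sign}(w)$ with $\operatorname{sign}(0)=0$). A minimal sketch for a 1-D lasso-style objective, where the values $a$, $\lambda$, the step size, and the iteration count are illustrative choices of mine:

```python
# Subgradient descent on f(w) = 0.5*(w - a)**2 + lam*|w|.

def subgrad_abs(w):
    """A subgradient of |w|: sign(w) away from 0, and the common choice 0 at w = 0."""
    return (w > 0) - (w < 0)  # evaluates to 1, -1, or 0

def lasso_subgradient_descent(a=2.0, lam=1.0, lr=0.1, steps=200):
    w = 0.0
    for _ in range(steps):
        g = (w - a) + lam * subgrad_abs(w)  # a subgradient of f at w
        w -= lr * g
    return w

# For this objective the exact minimizer is the soft-threshold
# sign(a) * max(|a| - lam, 0), which equals 1.0 for a = 2, lam = 1.
print(lasso_subgradient_descent())  # close to 1.0
```

Note that subgradient descent is not the state of the art for lasso (coordinate descent and proximal methods converge faster), but it shows how the subgradient at $0$ is actually used.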