A vector $g \in \mathbb{R}^D$ is a subgradient of a function $L$ at a point $w$ if $L(u) \ge L(w) + g^{T}(u-w)$ for all $u$.
It is clear that the subgradient of the absolute value function is $-1$ if $x < 0$, any value in $[-1,1]$ if $x = 0$, and $1$ if $x > 0$. Now I'm asked to compute the subgradient of the mean absolute error $\frac{1}{N}\sum_{n=1}^{N}|y_n - f(x_n)|$, where $f$ is a linear regression function.
How should I proceed to calculate it?
Edit:
I forgot to add that my notes suggest using the "chain rule".
Let $h(x) = \|x\|_1 = |x_1| + \cdots + |x_N|$. Then it can be shown that the subdifferential of $h$ is given by
$$ \partial h(x) = \{g \,:\, \|g\|_\infty \leq 1,\,g^Tx = \|x\|_1\} $$
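As a quick sanity check of this characterization (my own illustration, not part of the original answer): the sign vector $g = \operatorname{sign}(x)$ always belongs to $\partial h(x)$, since each entry has magnitude at most $1$ and $g^Tx = \sum_i |x_i| = \|x\|_1$.

```python
# Numerical check that g = sign(x) satisfies the two conditions
# defining the subdifferential of the l1 norm at x.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(5)
g = np.sign(x)  # one particular subgradient of ||.||_1 at x

assert np.max(np.abs(g)) <= 1                # ||g||_inf <= 1
assert np.isclose(g @ x, np.abs(x).sum())    # g^T x = ||x||_1
```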
Writing the mean absolute error as $e(w) = (1/N)h(Xw - y)$, where I have assumed that the linear model's predictions stack into the vector $Xw$, and observing that $h$ is convex, it follows by subgradient calculus that
$$ \partial e(w) = \frac{1}{N}X^T\partial h(Xw - y) $$
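In particular, picking the sign vector from $\partial h(Xw - y)$ gives the concrete subgradient $g = \frac{1}{N}X^T\operatorname{sign}(Xw - y)$. A minimal numerical sketch (my own, with randomly generated $X$, $y$, $w$) that verifies this $g$ satisfies the subgradient inequality $e(u) \ge e(w) + g^T(u - w)$:

```python
# Verify that g = (1/N) X^T sign(Xw - y) is a subgradient of the
# mean absolute error e(w) = (1/N) ||Xw - y||_1 at a point w.
import numpy as np

rng = np.random.default_rng(1)
N, D = 50, 3
X = rng.standard_normal((N, D))
y = rng.standard_normal(N)

def e(w):
    return np.abs(X @ w - y).mean()  # mean absolute error

w = rng.standard_normal(D)
g = X.T @ np.sign(X @ w - y) / N     # one subgradient of e at w

# Subgradient inequality must hold at every u (small tolerance
# for floating-point rounding).
for _ in range(1000):
    u = rng.standard_normal(D)
    assert e(u) >= e(w) + g @ (u - w) - 1e-12
```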