Optimality check for non-differentiable convex function


I am wondering how to check whether a given point of a convex but non-differentiable function is a minimizer. For instance, suppose I have the following function:

$$ f \left( x \right) = \left\Vert x A - b \right\Vert^{2}_{2} + \lambda \left\Vert x \right\Vert_{1} $$

where both $x$ and $b$ are row vectors. If my calculations are correct, the subdifferential of $f$ with respect to its argument should be the set

$$ \partial f \left( x \right) = \left\lbrace 2 \left( x A - b \right) A^{T} + \lambda \, v \;:\; v_{i} \in \begin{cases} \left\lbrace -1 \right\rbrace, & x_{i} < 0 \\ \left\lbrace 1 \right\rbrace, & x_{i} > 0 \\ \left[ -1, 1 \right], & x_{i} = 0 \end{cases} \right\rbrace $$

Now, if I define $\text{sign} \left( \cdot \right)$ as $$ \text{sign} \left( x_{i} \right) = \begin{cases} -1, & x_{i} < 0 \\ 1, & x_{i} > 0 \\ 0, & x_{i} = 0 \end{cases} $$

wouldn't it be enough to check whether a given $\tilde{x}$ satisfies

$$ \left\Vert 2 \left( \tilde{x} A - b \right) A^{T} + \lambda \, \text{sign} \left( \tilde{x} \right) \right\Vert^{2}_{2} = 0 $$

to conclude that it is a (global) minimizer? (Since $f$ is convex, I assume only minima are relevant here.)
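For concreteness, here is the check I have in mind as a NumPy sketch. The matrix $A$, the vector $b$, $\lambda$, and the candidate $\tilde{x}$ below are placeholder values I made up for illustration; with $A = I$ the problem separates coordinate-wise and $\tilde{x} = (0.75, 0)$ happens to be the minimizer, so the test evaluates to zero here:

```python
import numpy as np

# Toy instance (placeholder values, not from the question above).
A = np.eye(2)
b = np.array([1.0, 0.0])
lam = 0.5
x_tilde = np.array([0.75, 0.0])   # candidate point to test

# Gradient of the smooth part, 2 (x A - b) A^T.
g = 2 * (x_tilde @ A - b) @ A.T

# The proposed test: note that sign(0) = 0 picks out one particular
# element v of the subdifferential at the zero coordinates.
residual = np.linalg.norm(g + lam * np.sign(x_tilde)) ** 2
print(residual)  # 0.0 at this minimizer
```

My uncertainty is precisely about the coordinates where $\tilde{x}_{i} = 0$: the test above fixes $v_{i} = 0$ there, whereas the subdifferential allows any $v_{i} \in [-1, 1]$.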

Thank you for your attention, and please forgive any linguistic errors: English is not my first language.