I am a bit uncertain how one would solve equations that use the sign function, where the sign function is applied componentwise. Here is an example of the type of equation I am looking at:
$$l\,\Sigma^{-1}(\mathbf{c} - \mathbf{x}) - \operatorname{sign}(\mathbf{x}) = \mathbf{0}$$
where $\Sigma \in \mathbb{R}^{p \times p}$ is a symmetric, invertible matrix, $\mathbf{x}, \mathbf{c} \in \mathbb{R}^{p \times 1}$ are vectors, $\mathbf{0} \in \mathbb{R}^{p \times 1}$ is the zero vector, and $l \in \mathbb{R}$ is some real-valued constant.

I do know that in the single-variable case one might consider the definition of $\operatorname{sign}(x)$. Here is an example:
$2x - \operatorname{sign}(x) = 0$.
For this equation we know that:
$\operatorname{sign}(x) = 1$ if $x > 0$,
$\operatorname{sign}(x) = -1$ if $x < 0$,
or $\operatorname{sign}(x) = 0$ if $x = 0$.
This would naturally result in three solutions:
$x = 0$ or $x = \frac{1}{2}$ or $x = -\frac{1}{2}$
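As a quick numerical sanity check (a minimal sketch in Python; `np.sign` uses the same convention as above), each of the three candidates makes the residual vanish:

```python
import numpy as np

# Verify the three candidate roots of 2x - sign(x) = 0
for x in (0.0, 0.5, -0.5):
    residual = 2 * x - np.sign(x)
    print(f"x = {x:+.1f}, residual = {residual}")  # residual is 0.0 in each case
```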
I am a bit confused as to what one would do in the multivariate case. I know one could rewrite the equation as:
$$\mathbf{x} = \mathbf{c} - \frac{1}{l}\,\Sigma\,\operatorname{sign}(\mathbf{x})$$
One could then solve for each component of the vector $\mathbf{x}$ using the single-variable case mentioned above.
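One concrete way to organize that componentwise reasoning (a brute-force sketch in Python, feasible only for small $p$; the function name is my own) is to enumerate every sign pattern $\mathbf{s} \in \{-1, 0, 1\}^p$, form the candidate $\mathbf{x} = \mathbf{c} - \Sigma\,\mathbf{s}/l$, and keep the candidates whose componentwise sign actually equals $\mathbf{s}$:

```python
import itertools
import numpy as np

def solve_sign_equation(Sigma, c, l):
    """Enumerate sign patterns s in {-1, 0, 1}^p and keep the candidates
    x = c - Sigma @ s / l that are self-consistent, i.e. sign(x) == s."""
    p = len(c)
    solutions = []
    for pattern in itertools.product((-1.0, 0.0, 1.0), repeat=p):
        s = np.array(pattern)
        x = c - Sigma @ s / l
        if np.array_equal(np.sign(x), s):
            solutions.append(x)
    return solutions

# Example chosen so each coordinate reduces to 2x - sign(x) = 0
Sigma = -0.5 * np.eye(2)
c = np.zeros(2)
l = 1.0
print(len(solve_sign_equation(Sigma, c, l)))  # prints 9
```

Each surviving pattern corresponds to one branch of the piecewise definition, exactly as in the single-variable example: with $\Sigma = -\tfrac{1}{2}I$, $\mathbf{c} = \mathbf{0}$, $l = 1$, every coordinate satisfies $2x - \operatorname{sign}(x) = 0$ independently, so all $3^2 = 9$ combinations of the single-variable roots survive.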
My question is: is there an alternative method for dealing with these types of equations involving $\operatorname{sign}(\mathbf{x})$, and is there a way to express the solution in closed vector/matrix form?
Thank you for your help, and my apologies for breaking any posting rules. This is my first post.