How to find minimum of L1 norm residuals?


I have the following task:

$$ A \in \mathbb{R}^{m\times n},\ b \in \mathbb{R}^m,\ x \in \mathbb{R}^n $$ $$ \begin{align} \text{minimize} && \|Ax-b\|_1 &= |r_1| + \dots + |r_m| \\ \text{subject to} && r &= Ax - b. \end{align} $$ I found that this can be expressed as: $$ \begin{align} \text{minimize} && &1^Tt \\ \text{subject to} && Ax-t &\le b \\ && -Ax-t &\le -b. \end{align} $$

I don't know how to proceed. I'd like to solve it with gradient descent, but I cannot compute the gradient, since the absolute value is not differentiable at zero. Can someone give me a hint about this formulation? Thank you.



BEST ANSWER

As the other answer mentions, linear programming is the standard approach to L1-norm minimization: your second formulation is exactly a linear program, which any LP solver can handle. However, LP solvers can be awkward to set up if you are unfamiliar with them, and they may be slow on large problems.
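To make the LP route concrete, here is a minimal sketch (not from the original post) that feeds the reformulation from the question directly to SciPy's `linprog`, stacking the variables as $z = (x, t)$; the random $A$ and $b$ are placeholders for your data.

```python
import numpy as np
from scipy.optimize import linprog

# Placeholder problem data; substitute your own A and b.
rng = np.random.default_rng(0)
m, n = 20, 3
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# Variables z = [x, t], x in R^n free, t in R^m nonnegative.
# minimize 1^T t  subject to  Ax - t <= b  and  -Ax - t <= -b.
c = np.concatenate([np.zeros(n), np.ones(m)])
A_ub = np.block([[A, -np.eye(m)],
                 [-A, -np.eye(m)]])
b_ub = np.concatenate([b, -b])

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * n + [(0, None)] * m)
x = res.x[:n]
# At the optimum, the objective 1^T t equals ||Ax - b||_1.
print(res.fun, np.abs(A @ x - b).sum())
```

At the solution each $t_i$ is tight against $|a_i^T x - b_i|$, which is why the LP objective coincides with the L1 residual norm.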

If you insist on a gradient-descent-style method, consider iteratively reweighted least squares (IRLS): repeatedly solve a weighted least-squares problem in which each residual is weighted by the inverse of its current magnitude. This will not give an exact answer the way an LP solver does, but it is often sufficient in practice. It does, however, suffer from numerical stability issues as residuals approach zero, which can cause oscillation between iterations.

There are further approaches worth considering; this report seems to give a good overview of the different solution methods, though I know less about them.


This is a linear programming problem.

One particular algorithm for solving linear programs is the simplex method.
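For instance (a minimal sketch, assuming the LP reformulation from the question), SciPy's `linprog` exposes the HiGHS dual-simplex solver via `method="highs-ds"`:

```python
import numpy as np
from scipy.optimize import linprog

# Small placeholder instance of the L1 LP reformulation.
rng = np.random.default_rng(1)
A = rng.standard_normal((10, 2))
b = rng.standard_normal(10)
m, n = A.shape

c = np.concatenate([np.zeros(n), np.ones(m)])
A_ub = np.block([[A, -np.eye(m)],
                 [-A, -np.eye(m)]])
b_ub = np.concatenate([b, -b])

# "highs-ds" selects the HiGHS dual-simplex algorithm.
res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * n + [(0, None)] * m,
              method="highs-ds")
print(res.fun)
```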