Least Absolute Deviation (LAD) with Non Negative Constraint


I would like to solve the following:

$$ \begin{align} \text{minimize} & \quad & \left\| A x - b \right\|_{1} \\ \text{subject to} & \quad & x \succeq 0 \end{align} $$

What kind of toolkit should I use to solve this problem?
I know this can be turned into a Linear Programming problem. Can that be done by just adding another constraint?

1 Answer

The problem is given by:

$$ \begin{align} \text{minimize} & \quad & \left\| A x - b \right\|_{1} \\ \text{subject to} & \quad & x \succeq 0 \end{align} $$

The easiest (yet one of the slowest) methods to solve this is the Projected Subgradient Method.

The objective $ \left\| A x - b \right\|_{1} $ is not differentiable, but a subgradient is given by $ {A}^{T} \operatorname{sgn} \left( A x - b \right) $.
The projection onto the non-negative orthant is given by $ {x}_{+} = \max \left( x, 0 \right) $ (element-wise).
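Combining the two, a single projected subgradient iteration with a diminishing step size $ \frac{t}{\sqrt{k}} $ reads:

$$ {x}^{k + 1} = \left( {x}^{k} - \frac{t}{\sqrt{k}} {A}^{T} \operatorname{sgn} \left( A {x}^{k} - b \right) \right)_{+} $$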

Here is the code (MATLAB):

vX = zeros(size(mA, 2), 1); % Initialize at the origin

for ii = 1:numIterations
    % Subgradient step with diminishing step size stepSize / sqrt(ii)
    vX = vX - ((stepSize / sqrt(ii)) * mA.' * sign(mA * vX - vB));
    % Project onto the non-negative orthant
    vX = max(vX, 0);
end

objVal = norm(mA * vX - vB, 1);

disp([' ']);
disp(['Projected Gradient Solution Summary']);
disp(['The Optimal Value Is Given By - ', num2str(objVal)]);
disp(['The Optimal Argument Is Given By - [ ', num2str(vX.'), ' ]']);
disp([' ']);
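For readers without MATLAB, here is a sketch of the same projected subgradient loop in Python/NumPy. The function name, the random test instance, and the best-iterate tracking are my additions, not part of the original answer:

```python
import numpy as np

def lad_nonneg_subgrad(A, b, step_size=1.0, num_iters=5000):
    """Projected subgradient method for min ||Ax - b||_1 s.t. x >= 0."""
    x = np.zeros(A.shape[1])
    best_x = x.copy()
    best_obj = np.linalg.norm(A @ x - b, 1)
    for k in range(1, num_iters + 1):
        g = A.T @ np.sign(A @ x - b)          # subgradient of ||Ax - b||_1
        x = x - (step_size / np.sqrt(k)) * g  # diminishing step size
        x = np.maximum(x, 0.0)                # project onto the non-negative orthant
        obj = np.linalg.norm(A @ x - b, 1)
        if obj < best_obj:                    # subgradient steps are not monotone,
            best_x, best_obj = x.copy(), obj  # so keep the best iterate seen
    return best_x, best_obj

# Illustrative random instance
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 4))
b = rng.standard_normal(30)
x_hat, obj_val = lad_nonneg_subgrad(A, b)
print(obj_val)
```

Because subgradient iterations do not decrease the objective monotonically, returning the best iterate seen (rather than the last one) is the standard practice.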

The full code (Including verification against CVX) can be found in my Mathematics Exchange Q1385984 GitHub Repository.

Remarks:

  • You can certainly transform this problem into another form (such as a Linear Programming problem) and solve it more efficiently, but that requires an LP solver.
  • Other methods, based on the $ \operatorname{Prox} $ operator or on Primal-Dual splitting, are also applicable and efficient in this case.
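To make the LP reformulation concrete: introducing an auxiliary vector $ t $ with $ {t}_{i} \geq \left| {\left( A x - b \right)}_{i} \right| $ (the standard epigraph trick) turns the problem into a linear program. A minimal sketch in Python with SciPy's `linprog` (the random $ A $, $ b $ are illustrative, not from the original post):

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative random instance
rng = np.random.default_rng(0)
m, n = 20, 5
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# Variables z = [x; t]: minimize sum(t)
# s.t.  A x - t <= b,  -A x - t <= -b  (i.e. t >= |A x - b|),  x >= 0, t free.
c = np.concatenate([np.zeros(n), np.ones(m)])
A_ub = np.block([[A, -np.eye(m)],
                 [-A, -np.eye(m)]])
b_ub = np.concatenate([b, -b])
bounds = [(0, None)] * n + [(None, None)] * m
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
x_opt = res.x[:n]
print(res.fun)  # equals ||A x_opt - b||_1 at the optimum
```

So the answer to the question above is: yes, the non-negativity requirement enters as simple bound constraints on $ x $, on top of the inequality constraints that linearize the L1 objective.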