The ordinary least squares (OLS) method is very useful: it gives the solution to the problem
$$ \arg \min_{x} {\left\| A x - b \right\|}_{2}^{2} $$
Now, if the problem is the same, but the $1$-norm is used instead:
$$ \arg \min_{x} {\left\| A x - b \right\|}_{1} $$
Is there a known solution (approximate or exact) to this problem? Is there a time-efficient algorithm to compute the optimum? I've read about the Theil-Sen estimator, which should do the trick in dimension $2$, and about some multidimensional extensions of it, but their running time grows hugely with the dimension; I don't think I'd get a solution within a year that way.
The problem is given by:
$$ \arg \min_{x} {\left\| A x - b \right\|}_{1} $$
Introduce an auxiliary vector $ t $ that bounds the residual $ r = A x - b $ elementwise; the problem can then be rewritten as:
$$\begin{align*} \arg \min_{x, t} \quad & \boldsymbol{1}^{T} t \\ \text{subject to} \quad &-t \preceq A x - b \preceq t \end{align*}$$
This is a linear program in the variables $ \left( x, t \right) $, so it can be solved by any Linear Programming solver. At the optimum $ {t}_{i} = \left| {\left( A x - b \right)}_{i} \right| $, hence the objective $ \boldsymbol{1}^{T} t $ equals $ {\left\| A x - b \right\|}_{1} $.
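As a minimal sketch of the LP reformulation above, here is how it could look with SciPy's `linprog` (the helper name `l1_regression` and the toy data are my own; the stacked variable is $ z = \left[ x; t \right] $):

```python
import numpy as np
from scipy.optimize import linprog

def l1_regression(A, b):
    """Solve min_x ||A x - b||_1 via the LP reformulation.

    Stack z = [x; t] and minimize 1^T t subject to
    -t <= A x - b <= t, written as the two inequality blocks
    A x - t <= b  and  -A x - t <= -b.
    """
    m, n = A.shape
    c = np.concatenate([np.zeros(n), np.ones(m)])   # objective: sum of t
    I = np.eye(m)
    A_ub = np.block([[A, -I],
                     [-A, -I]])                     # stacked inequality blocks
    b_ub = np.concatenate([b, -b])
    bounds = [(None, None)] * n + [(0, None)] * m   # x free, t >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:n]

# Toy example: the L1 fit shrugs off a single corrupted entry of b.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [1.0, -1.0],
              [1.0, 2.0]])
x_true = np.array([2.0, -1.0])
b = A @ x_true
b[2] += 10.0                                        # one gross outlier
x_hat = l1_regression(A, b)
```

Note the robustness: with this data the $ {L}_{1} $ solution still recovers `x_true` exactly despite the outlier, whereas the least-squares solution would be pulled away from it.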
You may see some more related approaches in my answer to Mathematics Q1639716 - How Can $ {L}_{1} $ Norm Minimization with Linear Equality Constraints (Basis Pursuit / Sparse Representation) Be Formulated as Linear Programming?