This link discusses different ways of formulating classic LAD (least absolute deviations) regression as a linear program. The classic way of writing LAD regression ($y = X \beta + r$) as a linear program is \begin{equation} \min_{r, \beta} \sum_i r_i \end{equation} \begin{equation} \text{subject to } -r \leq y - X \beta \leq r \end{equation}
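To make the primal formulation concrete, here is a minimal sketch using `scipy.optimize.linprog` (the choice of solver and the synthetic data are my own assumptions, not from the linked page). The decision vector stacks $\beta$ and $r$, and the two-sided constraint is split into two one-sided blocks:

```python
import numpy as np
from scipy.optimize import linprog

# Synthetic data (assumption, for illustration only).
rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + 0.1 * rng.normal(size=n)

# Variables z = [beta (p entries), r (n entries)]; minimize sum(r).
c = np.concatenate([np.zeros(p), np.ones(n)])

# -r <= y - X beta <= r  splits into:
#   X beta - r <= y   and   -X beta - r <= -y
I = np.eye(n)
A_ub = np.block([[X, -I], [-X, -I]])
b_ub = np.concatenate([y, -y])

# beta is free; r is nonnegative.
bounds = [(None, None)] * p + [(0, None)] * n

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
beta_hat = res.x[:p]
print(beta_hat)
```

At the optimum each $r_i$ equals $|y_i - x_i^T \beta|$, so the objective is the sum of absolute residuals.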
It is mentioned that the dual problem is \begin{equation} \max_{d} y^T d \end{equation} \begin{equation} \text{subject to } X^T d = 0 \text{ and } -1 \leq d_i \leq 1 \end{equation}
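The dual can be set up the same way; a sketch, again assuming `scipy.optimize.linprog` on synthetic data (`linprog` minimizes, so the objective $y^T d$ is negated):

```python
import numpy as np
from scipy.optimize import linprog

# Synthetic data (assumption, for illustration only).
rng = np.random.default_rng(1)
n, p = 40, 2
X = rng.normal(size=(n, p))
y = X @ np.array([2.0, -1.0]) + 0.2 * rng.normal(size=n)

# Dual: max y^T d  s.t.  X^T d = 0,  -1 <= d_i <= 1.
dual = linprog(-y,
               A_eq=X.T, b_eq=np.zeros(p),
               bounds=[(-1, 1)] * n,
               method="highs")

# By strong duality, -dual.fun equals the primal optimum,
# i.e. the sum of absolute residuals at the LAD fit.
print(-dual.fun)
```

Note that $\beta$ does not appear anywhere in this program, which is exactly why recovering it is not obvious.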
My question is simple: if you solve the dual problem (which can be much faster computationally), how exactly do you recover $\beta$?