Solving $l_{\infty}$ projection estimate using linear programming


Given a set $D$ of data points $(y_k, x_k)$, the $l_\infty$ projection estimate is defined as

$$\phi_p(D)=\arg\min_{\theta}\ \max_{(y_k,x_k)\in D}|y_k-\phi_k^{'}\theta|$$

where $y_k$ is a scalar, $\phi_k^{'}=[x_k^{'}\ \ 1]$ is a $1\times2$ row vector, and $\theta$ is a $2\times 1$ vector. How can linear programming be used to solve this problem? I don't see any constraints in this problem. Any help will be appreciated!

1 Answer

Something like: $$\begin{align} \min \>& L\\ \text{s.t.}\>& -L \le y_k-\phi_k^{'}\theta \le L \quad\forall k\end{align}$$ where $L$ is an additional decision variable (it can be declared either free or non-negative: the constraints automatically force $L \ge 0$).
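As a concrete sketch, the formulation above can be solved with `scipy.optimize.linprog` by stacking the decision vector as $z = [\theta_1,\ \theta_2,\ L]$ and rewriting each absolute-value constraint as two linear inequalities, $\phi_k^{'}\theta - L \le y_k$ and $-\phi_k^{'}\theta - L \le -y_k$. The data arrays `x` and `y` below are made-up values purely for illustration:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical example data (roughly y = 2x plus noise) -- not from the question.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.1, 1.9, 4.2, 5.8, 8.1])

n = len(x)
Phi = np.column_stack([x, np.ones(n)])  # each row is phi_k' = [x_k, 1]

# Decision vector z = [theta_1, theta_2, L]; the objective is just L.
c = np.array([0.0, 0.0, 1.0])

# Two inequality rows per data point:
#   phi_k' theta - L <= y_k    and    -phi_k' theta - L <= -y_k
A_ub = np.vstack([
    np.hstack([Phi, -np.ones((n, 1))]),
    np.hstack([-Phi, -np.ones((n, 1))]),
])
b_ub = np.concatenate([y, -y])

# theta is free; L is left free here since the constraints force L >= 0 anyway.
res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None), (None, None), (None, None)])
theta, L = res.x[:2], res.x[2]
print("theta =", theta, " max |residual| =", L)
```

At the optimum, $L$ equals the largest absolute residual $\max_k |y_k - \phi_k^{'}\theta|$, which is exactly the quantity the $l_\infty$ estimate minimizes.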