Bound on the solution of a constrained least squares problem


Let $A \in \mathbb{R}^{m \times n}$ be a full row rank matrix and $b \in \mathbb{R}^m$. I want to show that there exists $K>0$ depending only on $m, n, A, b$ such that, for any vector $\alpha \in \mathbb{R}^n_{\ge 0}$, the optimization problem $$ \min_{x\in \mathbb{R}^n} \frac{1}{2}\|Ax-b\|^2,\\ \text{s.t.}\ \ -\alpha\le x \le \alpha $$ has a solution $x^*$ such that $\|x^*\|\le K$.

By $-\alpha\le x \le \alpha$, I mean that for any $i=1,\cdots,n$, we have $-\alpha_i\le x_i \le \alpha_i$. Any hint is appreciated.


The minimum-norm least squares solution to $\min_x \| Ax -b \|_{2}^{2}$ is the pseudoinverse solution

$x^{\dagger}=A^{\dagger}b.$

Since $A$ has full row rank, $A^{\dagger}=A^{T}(AA^{T})^{-1}$ and $Ax^{\dagger}=b$, so $x^{\dagger}$ attains the unconstrained minimum value of $0$.
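As a quick numerical sanity check (the matrix and vector below are made-up illustrative data), NumPy's `pinv` gives $x^{\dagger}$ directly, and full row rank means the residual is exactly zero:

```python
import numpy as np

# Made-up full-row-rank example with m = 2 < n = 3.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])
b = np.array([3.0, 2.0])

# Minimum-norm least squares solution via the pseudoinverse.
x_dag = np.linalg.pinv(A) @ b

# Full row rank => Ax = b is consistent, so x_dag solves it exactly.
print(np.allclose(A @ x_dag, b))  # prints True
print(np.max(np.abs(x_dag)))      # L = ||x_dag||_inf
```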

If every component of $\alpha$ is greater than or equal to $L=\| x^{\dagger} \|_{\infty}$, then $x^{\dagger}$ lies inside the box, so it is an optimal solution of the constrained problem, with $\|x^{\dagger}\|_{2}\le\sqrt{n}\,L$.

If every component of $\alpha$ is less than $L$, then the entire box is contained in a Euclidean ball of radius $\sqrt{\sum_{i}\alpha_{i}^{2}}\le\sqrt{n}\,L$, so every feasible point, and in particular every optimal solution, satisfies the same bound.

In both of these cases, an optimal solution to the box-constrained problem lies within a Euclidean ball of radius $K=\sqrt{n}\,L$.
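The two regimes can be checked numerically. Below is a minimal projected-gradient sketch for the box-constrained problem (a generic solver, not part of the argument above; the matrix and vector are made-up illustrative data):

```python
import numpy as np

# Made-up full-row-rank example with m = 2 < n = 3.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])
b = np.array([3.0, 2.0])
n = A.shape[1]

x_dag = np.linalg.pinv(A) @ b   # pseudoinverse solution
L = np.max(np.abs(x_dag))       # L = ||x_dag||_inf
K = np.sqrt(n) * L              # the claimed radius

def box_lsq(alpha, iters=5000):
    """Projected gradient descent on 0.5*||Ax - b||^2 over [-alpha, alpha]."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / Lipschitz constant of the gradient
    x = np.zeros(n)
    for _ in range(iters):
        x -= step * (A.T @ (A @ x - b))     # gradient step
        x = np.clip(x, -alpha, alpha)       # project back onto the box
    return x

# Regime 1: every alpha_i >= L, so x_dag is feasible and optimal.
x1 = box_lsq(np.full(n, 2 * L))
print(np.linalg.norm(x1) <= K)   # prints True

# Regime 2: every alpha_i <= L, so the whole box sits in the ball of radius K.
x2 = box_lsq(np.full(n, 0.5 * L))
print(np.linalg.norm(x2) <= K)   # prints True
```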

Furthermore, no bound smaller than $\|x^{\dagger}\|_{2}$ is possible: once every component of $\alpha$ is at least $L$, the constrained optimal solutions are exactly the unconstrained minimizers inside the box, and $x^{\dagger}$ is the one of minimum norm, so no optimal solution has norm below $\|x^{\dagger}\|_{2}$.