Least squares with variable bounds


Consider the following variant of the least squares problem

$$\begin{array}{ll} \underset{}{\text{minimize}} & \|y - X \beta\|_{2}^{2}\\ \text{subject to} & \beta \in \left[0,1 \right]^{n}, \end{array}$$

where $X \in \mathbb{R}^{m \times n}$, $y \in \mathbb{R}^{m}$, and $X$ has full column rank. Without the restriction on $\beta$, the unique solution is given by $$ \beta^{0} = (X^{T}X)^{-1}X^{T}y. $$ Is there also a closed-form solution for the restricted problem? If not, what is the best approach to tackle it? I think one option is the proximal gradient method. My idea is to first solve the unconstrained version (i.e., project $y$ onto the column space of $X$), giving the vector $a = X\beta^{0}$, and then find the vector $\tilde\beta$ that solves $$\begin{array}{ll} \underset{}{\text{minimize}} & \|a - X \beta\|_{2}^{2}\\ \text{subject to} & \beta \in \left[0,1 \right]^{n}. \end{array}$$ Or is the solution given by

$\tilde\beta_{i} = \left\{ \begin{array}{ll} 0 & \beta_{i}^{0} < 0 \\ 1 & \beta_{i}^{0} > 1 \\ \beta_{i}^{0} & \beta_{i}^{0} \in [0,1]? \end{array} \right. $

Here $\beta^{0}$ denotes the solution of the unconstrained least squares problem given above.
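For what it's worth, the proximal gradient method mentioned above is easy to sketch for this problem: since the constraint set $[0,1]^{n}$ is a box, the proximal operator is just componentwise clipping, so each iteration is a gradient step followed by a projection. Here is a minimal NumPy sketch (the function name, step size choice, and random test data are my own, purely illustrative):

```python
import numpy as np

def box_constrained_ls(X, y, n_iter=5000):
    """Projected gradient descent for min ||y - X b||_2^2 s.t. b in [0,1]^n."""
    n = X.shape[1]
    # The gradient 2 X^T (X b - y) is Lipschitz with constant 2 ||X||_2^2,
    # so a step size of 1/L guarantees convergence.
    L = 2.0 * np.linalg.norm(X, 2) ** 2
    b = np.zeros(n)
    for _ in range(n_iter):
        grad = 2.0 * X.T @ (X @ b - y)
        b = np.clip(b - grad / L, 0.0, 1.0)  # gradient step, then project onto [0,1]^n
    return b

# Illustrative data (assumed): compare clipping the unconstrained solution
# against the projected gradient iterate.
rng = np.random.default_rng(0)
X = rng.standard_normal((8, 3))
y = rng.standard_normal(8)
beta0 = np.linalg.lstsq(X, y, rcond=None)[0]   # unconstrained solution
beta_clip = np.clip(beta0, 0.0, 1.0)           # componentwise clipping
beta_pg = box_constrained_ls(X, y)
print(np.sum((y - X @ beta_clip) ** 2), np.sum((y - X @ beta_pg) ** 2))
```

Since the clipped vector $\tilde\beta$ is always feasible, the projected gradient solution can never have a larger objective value, which gives a quick numerical way to check whether the two coincide on a given instance.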