Least-squares regression with an arbitrary entry of the solution constrained to be $0$


Suppose you are given data matrices $X \in \mathbb{R}^{d \times n}$ and $Y \in \mathbb{R}^{d \times n}$, and you are interested in regressing $X$ onto $Y$:

$$A = \arg\min_A \|AX - Y\|_F^2$$

where $A$ is a $d \times d$ matrix. This is essentially a matrix version of canonical least squares, where we solve for a vector $x$ via $x = \arg\min_x \|Hx - b\|_2^2$.
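The unconstrained matrix problem above can be solved directly; here is a minimal sketch (variable names are illustrative) using NumPy, which exploits the fact that $\min_A \|AX - Y\|_F^2$ is the same as $\min_B \|X^T B - Y^T\|_F^2$ with $B = A^T$:

```python
import numpy as np

# Illustrative data: X, Y of shape (d, n), as in the problem statement.
rng = np.random.default_rng(0)
d, n = 4, 50
X = rng.standard_normal((d, n))
Y = rng.standard_normal((d, n))

# lstsq solves min_B ||X^T B - Y^T||_F^2, so B = A^T.
B, *_ = np.linalg.lstsq(X.T, Y.T, rcond=None)
A = B.T  # shape (d, d)

# Sanity check against the normal-equation solution A = Y X^T (X X^T)^{-1},
# valid here because X has full row rank.
A_normal = Y @ X.T @ np.linalg.inv(X @ X.T)
assert np.allclose(A, A_normal)
```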

If I want to constrain say the $(i,j)$ index of $A$ to be $0$ is there a way to do this? For example, say I want to add the constraints:

  • $A_{12} = 0$
  • $A_{56} = 0$

how would I go about doing this in the matrix setting? I know that in the vector setting we can augment $H$ with an additional row equal to the unit vector $e_i^T$ that selects the $i$th element of $x$, and append a $0$ to $b$; with a large weight on that row, this (approximately) forces $x_i = 0$.
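For concreteness, a sketch of the vector-setting trick just described (names and the weight value are illustrative assumptions): appending a heavily weighted unit-vector row $w\,e_i^T$ to $H$ and a $0$ to $b$ drives $x_i$ toward $0$, approaching a hard constraint as $w \to \infty$.

```python
import numpy as np

rng = np.random.default_rng(1)
m, p = 30, 5
H = rng.standard_normal((m, p))
b = rng.standard_normal(m)

i = 2    # index of x to pin to (approximately) zero
w = 1e8  # large penalty weight; w -> inf approaches a hard constraint

e_i = np.zeros(p)
e_i[i] = 1.0

# Augment H with the weighted unit-vector row and b with a matching 0.
H_aug = np.vstack([H, w * e_i])
b_aug = np.append(b, 0.0)

x, *_ = np.linalg.lstsq(H_aug, b_aug, rcond=None)
# x[i] is now numerically ~0 while the other coordinates still fit Hx ≈ b.
```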

Alternatively, how would I go about "encouraging" the corresponding entry of $A$ to be close to $0$?
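One hedged sketch of such a soft penalty (the helper and weight below are hypothetical, not a standard routine): since $\|AX - Y\|_F^2$ decouples over the rows of $A$, row $i$ solves its own vector least-squares problem $\min_a \|X^T a - Y_{i,:}\|_2^2$, so penalizing $A_{ij}$ is just the augmented-row trick applied to that one row.

```python
import numpy as np

def regress_with_soft_zero(X, Y, i, j, lam=1e6):
    """Solve min_A ||AX - Y||_F^2 + lam * A[i, j]^2 row by row.

    Illustrative sketch: each row r of A solves min_a ||X^T a - Y[r]||^2;
    for row i we append the weighted row sqrt(lam) * e_j^T and a 0 target,
    which penalizes A[i, j]^2 with weight lam.
    """
    d = X.shape[0]
    A = np.zeros((d, d))
    for r in range(d):
        H, b = X.T, Y[r]
        if r == i:
            e = np.zeros(d)
            e[j] = 1.0
            H = np.vstack([H, np.sqrt(lam) * e])
            b = np.append(b, 0.0)
        A[r], *_ = np.linalg.lstsq(H, b, rcond=None)
    return A

rng = np.random.default_rng(2)
X = rng.standard_normal((4, 40))
Y = rng.standard_normal((4, 40))
A = regress_with_soft_zero(X, Y, i=1, j=2)
# A[1, 2] is driven close to 0; increasing lam pushes it closer.
```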