Non-negative linear least squares with sparse solution vector


Does anyone know the term or algorithm for an NNLS problem in which the solution is constrained to contain only $p$ non-zero values?

For example, if the standard linear least squares problem is:

$\mathbf{X}\boldsymbol{\beta} = \mathbf{y}$

$\hat{\boldsymbol{\beta}} = (\mathbf{X}^T\mathbf{X})^{-1}\mathbf{X}^T\mathbf{y}$

We then place the constraint that $\hat{\beta} \ge 0$, which creates the non-negative linear least squares (NNLS) problem. The wonderful thing is that there are several methods that solve this problem quite easily, in Python, MATLAB, etc.
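For instance, a minimal sketch in Python using SciPy's `scipy.optimize.nnls` (the synthetic design matrix and coefficients below are just illustrative):

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic example: y is generated from a known non-negative beta.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 5))
beta_true = np.array([1.0, 0.0, 2.0, 0.0, 0.5])
y = X @ beta_true

# Solve min ||X b - y||_2 subject to b >= 0.
beta_hat, rnorm = nnls(X, y)
```

Since the data here are noiseless and `beta_true` is already non-negative, `beta_hat` recovers it exactly and the residual norm `rnorm` is essentially zero.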


However, if we place the additional constraint that $\beta$ must have only $p$ values that are non-zero ($\beta$ is sparse), what is this type of problem named?

I am interested in this as it would be useful to select the $p$ most useful predictors in the set rather than all of them.

I thought of something similar to PCA, but this appears to be quite different.

I would assume that there is a simple way to implement this in various programming languages such as Python, but I am uncertain what to search for.


BEST ANSWER

For the general regularized case: Tikhonov regularization (ridge regression). For variable selection and shrinkage: the LASSO, whose $\ell_1$ penalty is the standard convex surrogate for a limit on the number of non-zero coefficients; combined with the $\beta \ge 0$ constraint it is often called the non-negative LASSO. The exact constraint that $\beta$ have $p$ non-zero entries ($\|\beta\|_0 = p$) is known as best-subset or cardinality-constrained selection.