How to minimize the $0$-"norm" with quadratic constraint?


I have a vector

$$y = Ax + n$$

where vectors $x, y, n$ are $25 \times 1$, and matrix $A$ is $25 \times 25$ and near-orthogonal (actually, it's a part of the DFT matrix). Also, $x$ is sparse and has only $5$ non-zero elements. $A$ and $y$ are known, $n$ is Gaussian noise, and I need to recover $x$ as accurately as possible.

How can I solve this problem? Complexity is not a concern; the only consideration is accuracy. Thanks in advance.


Suppose you know the sparsity pattern of $x$ (exhaustive enumeration over all $\binom{25}{5} = 53130$ patterns is feasible here). Then the problem reduces to $y = Ax + n$ where $x$ is $5 \times 1$, $y$ and $n$ are $25\times 1$, and $A$ is $25\times 5$. My assertion is that the best $x$ is the one that maximizes the likelihood of $n$:
$$\max_{n,x} \left\{ (2\pi)^{-\frac{k}{2}}(\det\Sigma)^{-\frac{1}{2}} e^{-\frac{1}{2}\left(n-\mu\right)^T\Sigma^{-1}\left(n-\mu\right)} : y = Ax+n \right\},$$
or, equivalently (substituting $n = y - Ax$), the one that minimizes the negative log-likelihood:
$$\min_{x} \left\{ \frac{k}{2}\log(2\pi) + \frac{1}{2}\log\det\Sigma + \frac{1}{2}\left(y-Ax-\mu\right)^T\Sigma^{-1}\left(y-Ax-\mu\right) \right\}.$$
Assuming that $n\sim N(0,I)$ (so $\mu = 0$ and $\Sigma = I$), this simplifies to
$$\min_{x} \left\{ \|Ax-y\|^2 \right\}.$$
So, just do least-squares estimation of the linear model $y=Ax$ for each candidate sparsity pattern, and select the pattern for which $\|Ax-y\|$ is as small as possible.
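The enumerate-supports-then-least-squares procedure above can be sketched in a few lines of Python. This is a minimal illustration (the function name `recover_sparse` and its interface are my own, not from the question); it brute-forces every size-$k$ support, solves least squares on the corresponding columns of $A$, and keeps the support with the smallest residual:

```python
import itertools
import numpy as np

def recover_sparse(A, y, k):
    """Maximum-likelihood recovery of a k-sparse x from y = A x + n
    with n ~ N(0, I): try every support of size k, solve least squares
    restricted to those columns, and keep the smallest residual."""
    n_cols = A.shape[1]
    best_res, best_x = np.inf, None
    for support in itertools.combinations(range(n_cols), k):
        cols = list(support)
        A_S = A[:, cols]                     # 25 x k submatrix of A
        x_S, *_ = np.linalg.lstsq(A_S, y, rcond=None)
        res = np.linalg.norm(A_S @ x_S - y)  # ||A_S x_S - y||
        if res < best_res:
            best_res = res
            best_x = np.zeros(n_cols)
            best_x[cols] = x_S               # embed back into full vector
    return best_x
```

For the stated sizes this loops over $\binom{25}{5} = 53130$ supports, each requiring one $25 \times 5$ least-squares solve, which is entirely tractable when accuracy matters more than speed.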