Let $\lambda > 0$ and let $x^*$ be a minimizer for the optimization problem $$ \text{minimize} \quad \frac12 \| Ax - b \|_2^2 + \lambda \|x \|_1. $$ Here $A$ is an $m \times n$ matrix, $b \in \mathbb R^m$ and the optimization variable is $x \in \mathbb R^n$.
Under what conditions is it true that there exists $\gamma > 0$ such that $x^*$ is also a minimizer for the optimization problem $$ \tag{1}\text{minimize} \quad \frac12 \| Ax - b \|_2^2 + \gamma \|x \|_0. $$ Here $\| x\|_0$ is defined to be the number of non-zero components of $x$.
Also, I would like to know under what conditions there exists $\gamma > 0$ such that $x^*$ has the same sparsity pattern as some minimizer of problem (1).
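To get a feel for the two problems, here is a small numerical sketch (my own illustrative code, not from any reference; the function names `lasso_ista` and `l0_min` and all parameter values are arbitrary choices). It solves the $\ell_1$ problem by proximal gradient descent (ISTA) and the $\ell_0$ problem by brute-force enumeration over supports, then compares the resulting sparsity patterns:

```python
import itertools
import numpy as np

def lasso_ista(A, b, lam, steps=5000):
    """Minimize 0.5*||Ax-b||^2 + lam*||x||_1 by proximal gradient (ISTA)."""
    x = np.zeros(A.shape[1])
    t = 1.0 / np.linalg.norm(A, 2) ** 2          # step size 1/L, L = ||A||_2^2
    for _ in range(steps):
        z = x - t * (A.T @ (A @ x - b))          # gradient step on smooth part
        x = np.sign(z) * np.maximum(np.abs(z) - t * lam, 0.0)  # soft-threshold
    return x

def l0_min(A, b, gamma):
    """Minimize 0.5*||Ax-b||^2 + gamma*||x||_0 by enumerating all supports."""
    m, n = A.shape
    best_x, best_val = np.zeros(n), 0.5 * b @ b  # x = 0 is the baseline
    for k in range(1, n + 1):
        for S in itertools.combinations(range(n), k):
            idx = list(S)
            xS, *_ = np.linalg.lstsq(A[:, idx], b, rcond=None)
            r = A[:, idx] @ xS - b
            val = 0.5 * r @ r + gamma * k
            if val < best_val:
                best_val = val
                best_x = np.zeros(n)
                best_x[idx] = xS
    return best_x

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 5))
x_true = np.array([1.5, 0.0, 0.0, -2.0, 0.0])
b = A @ x_true                                   # noiseless observations

x1 = lasso_ista(A, b, lam=0.1)
x0 = l0_min(A, b, gamma=0.05)
print("l1 support:", np.nonzero(np.abs(x1) > 1e-6)[0])
print("l0 support:", np.nonzero(np.abs(x0) > 1e-6)[0])
```

On well-conditioned random instances like this one the two supports typically agree; the interesting question is exactly when that is guaranteed.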
Equivalence between the $l_0$ and $l_1$ problems has been studied in several papers. Most of the work concentrates on strong equivalence: the $l_0$ and $l_1$ problems are said to be strongly equivalent if the $l_0$ problem has a unique solution which coincides with the unique solution of the $l_1$ problem.
To date, several criteria for strong equivalence of the $l_0$ and $l_1$ problems are known, among them mutual coherence, the restricted isometry property (RIP), the null space property (NSP), and the range space property (RSP).
My favourite is the range space property. The matrix $A^T$ is said to satisfy the range space property of order $K$ if for any disjoint subsets $S_1, S_2$ of $\{1, \dots, n\}$ with $|S_1| + |S_2| \leq K$, there is a vector $\eta \in \mathcal{R}(A^T)$ such that $$ \eta_i = 1 \ \forall i \in S_1, \quad \eta_i = -1 \ \forall i \in S_2, \quad |\eta_i| \leq 1 \text{ otherwise.} $$
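For small matrices this definition can be checked mechanically: each choice of $(S_1, S_2)$ is a linear feasibility problem in $y$, with $\eta = A^T y$. A sketch using SciPy's `linprog` (the function name `has_rsp` and the example matrices are my own illustrative choices):

```python
import itertools
import numpy as np
from scipy.optimize import linprog

def has_rsp(A, K):
    """Check the range space property of order K for A^T.

    Enumerating a support S of size <= K together with a +/-1 sign pattern
    is equivalent to enumerating disjoint pairs (S1, S2) with |S1|+|S2| <= K.
    Each case is a feasibility LP in y, where eta = A^T y.
    """
    m, n = A.shape
    AT = A.T                                  # row j of AT gives eta_j = AT[j] @ y
    for size in range(1, K + 1):
        for S in itertools.combinations(range(n), size):
            for signs in itertools.product([1, -1], repeat=size):
                rest = [j for j in range(n) if j not in S]
                A_eq = AT[list(S), :]         # eta_i = +/-1 on S1 and S2
                b_eq = np.array(signs, dtype=float)
                if rest:                      # |eta_j| <= 1 off the support
                    A_ub = np.vstack([AT[rest, :], -AT[rest, :]])
                    b_ub = np.ones(2 * len(rest))
                else:
                    A_ub, b_ub = None, None
                res = linprog(np.zeros(m), A_ub=A_ub, b_ub=b_ub,
                              A_eq=A_eq, b_eq=b_eq,
                              bounds=[(None, None)] * m)
                if not res.success:           # some (S1, S2) has no certificate
                    return False
    return True

# A square invertible matrix has R(A^T) = R^n, so the RSP trivially holds;
# a matrix whose range misses a coordinate direction fails already at order 1.
print(has_rsp(np.eye(3), 2))
print(has_rsp(np.array([[1.0, 0.0]]), 1))
```

The number of LPs grows combinatorially in $K$, so this is only practical for toy sizes, but it is a direct transcription of the definition above.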
Full details can be found in the slides: Yun-Bin Zhao, "Efficiency of $l_1$-minimization for $l_0$-minimization problems: Analysis via the Range Space Property", 2013/2014.