Vector density as a function of sparsity


Many works define the density $\phi(x)$ of a vector $x \in \mathbb{R}^d$ as $$ \phi(x) = \frac{||x||_1^2}{d||x||_2^2}. $$
Can we say that this quantity decreases as sparsity increases? For example, suppose $f_s(x)$ returns a vector that agrees with $x$ on at most $s$ coordinates and is zero elsewhere. Can we show that $\phi(f_s(x))$ is increasing in $s$?


No: for the given $f_s(x)$, $\phi(f_s(x))$ can decrease in $s$. Take $x = (1,1,5,0,\cdots,0)$, so that $f_2(x) = (1,1,0,\cdots,0)$ and $f_3(x) = (1,1,5,0,\cdots,0)$. Then $\phi(f_2(x)) = \frac{4}{2d} = \frac{2}{d}$, while $\phi(f_3(x)) = \frac{49}{27d} < \phi(f_2(x))$.
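A quick numerical check of this counterexample (a minimal sketch using NumPy; the choice $d = 10$ is arbitrary, any $d \ge 3$ works the same way):

```python
import numpy as np

def density(x):
    """phi(x) = ||x||_1^2 / (d * ||x||_2^2), as defined in the question."""
    x = np.asarray(x, dtype=float)
    d = x.size
    return np.sum(np.abs(x)) ** 2 / (d * np.sum(x ** 2))

d = 10  # arbitrary ambient dimension
f2 = np.zeros(d); f2[:2] = [1, 1]      # f_2(x): keep two coordinates
f3 = np.zeros(d); f3[:3] = [1, 1, 5]   # f_3(x): keep three coordinates

print(density(f2))  # 2/d = 0.2
print(density(f3))  # 49/(27*d) ≈ 0.1815 — smaller, despite one more nonzero
```

Adding a large-magnitude coordinate inflates $||x||_2^2$ faster than $||x||_1^2$, which is why the density can drop even as the support grows.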