Linear independence based on support


Let $S:\mathbb{R}^N\to\{0,1\}^N$ be defined as $$S_i(x)=\begin{cases} 1&\text{ if }x_i\neq0\\ 0&\text{ otherwise}. \end{cases}$$ If $S(x_1)$ and $S(x_2)$ are linearly independent, is it guaranteed that $x_1$ and $x_2$ are also linearly independent? How do we extend this to $N$ vectors?

I can intuitively see this for small vectors. I also wrote a short MATLAB script to check it numerically (the snippet below uses $N=100$, $m=25$), and the failure count stays at zero.

count = 0;                        % trials where the masked matrix loses rank
N = 100;                          % ambient dimension
m = 25;                           % number of vectors
for i = 1:10000
    S = double(randn(N,m) > 0);   % random 0/1 support patterns (columns)
    X = randn(N,m);               % random Gaussian entries
    D = S.*X;                     % vectors whose supports are the columns of S
    if rank(D) < rank(S)          % does D have smaller rank than its support matrix?
        count = count + 1;
    end
end
count
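The same experiment can be mirrored in NumPy (Python rather than MATLAB, purely as an illustrative sketch). Generically, a matrix with random entries attains the structural rank of its support pattern, which is at least the rank of the 0/1 support matrix itself, so the count should stay at zero:

```python
import numpy as np

rng = np.random.default_rng(0)
N, m, trials = 100, 25, 200
count = 0
for _ in range(trials):
    S = (rng.standard_normal((N, m)) > 0).astype(float)  # random 0/1 support patterns
    X = rng.standard_normal((N, m))                      # random Gaussian entries
    D = S * X                                            # supports given by S
    # almost surely rank(D) equals the structural rank of the
    # pattern, which is >= rank(S), so this branch never fires
    if np.linalg.matrix_rank(D) < np.linalg.matrix_rank(S):
        count += 1
print(count)  # 0
```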

How do we formally prove the result?


There are 2 best solutions below

BEST ANSWER

I interpret the question as: if $\{ S(x_1), S(x_2)\}$ is linearly independent, is it true that $\{x_1, x_2\}$ is linearly independent?

Suppose $c_1 x_1 + c_2 x_2=0$.

If $c_1=0, c_2 \ne 0$, then $x_2=0$, and hence $S(x_2)=0$ and we can't have $\{S(x_1), S(x_2)\}$ linearly independent. Similarly if $c_1\ne 0, c_2=0$.

Suppose that $c_1 \ne 0, c_2 \ne 0$. Then $x_1 = -\frac{c_2}{c_1}x_2$, so $x_{1j}=0 \iff x_{2j}=0$, and hence $S(x_1)=S(x_2)$, which again contradicts the linear independence of $\{S(x_1), S(x_2)\}$.

Hence we must have $(c_1, c_2)=(0,0)$; that is, $\{x_1, x_2\}$ is linearly independent.

Edit:

The result is not true for $3$ vectors.

Consider $x_1 =\begin{bmatrix} 1 \\ 0 \\ -1\end{bmatrix}, x_2 =\begin{bmatrix} 1 \\ 1 \\ 0\end{bmatrix}, x_3 =\begin{bmatrix} 0 \\1 \\ 1\end{bmatrix}$, they are linearly dependent since $x_1=x_2-x_3$.

However $\{S(x_1), S(x_2), S(x_3)\}$ is linearly independent.
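This counterexample is easy to verify numerically; here is a quick NumPy check (Python rather than the question's MATLAB, purely for illustration):

```python
import numpy as np

x1 = np.array([1, 0, -1])
x2 = np.array([1, 1, 0])
x3 = np.array([0, 1, 1])
X = np.column_stack([x1, x2, x3])
S = (X != 0).astype(int)          # columns are S(x1), S(x2), S(x3)

print(np.linalg.matrix_rank(X))   # 2: x1 = x2 - x3, so the vectors are dependent
print(np.linalg.matrix_rank(S))   # 3: the supports are independent
```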

Edit 2:

It is not true even if we impose the condition that $x_i \ge 0$ entrywise: keeping $x_2$ and $x_3$ above, change $x_1$ in the previous counterexample to $x_1 = x_2 + x_3$ to obtain a counterexample.
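The nonnegative variant checks out the same way (again a NumPy sketch):

```python
import numpy as np

x2 = np.array([1, 1, 0])
x3 = np.array([0, 1, 1])
x1 = x2 + x3                      # (1, 2, 1): entrywise nonnegative, dependent by construction
X = np.column_stack([x1, x2, x3])
S = (X != 0).astype(int)          # columns are S(x1), S(x2), S(x3)

print(np.linalg.matrix_rank(X))   # 2: the vectors are dependent
print(np.linalg.matrix_rank(S))   # 3: the supports are independent
```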

ANOTHER ANSWER

Let $x=(x_1,\ldots,x_N)$ and $y=(y_1,\ldots,y_N)$ be vectors in $\mathbb R^N$. If $S(x)$ and $S(y)$ are linearly independent, then $$ a\sum_{i=1}^N x_i\mathsf 1_{\{x_i\ne 0\}} + b\sum_{i=1}^N y_i\mathsf 1_{\{y_i\ne 0\}} = 0\tag 1 $$ has only the trivial solution $a=b=0$. It follows readily that $$ a\sum_{i=1}^N x_i + b\sum_{i=1}^N y_i = 0 $$ has only the trivial solution $a=b=0$, since this is the same equation as $(1)$, but with some extra zero terms on the left-hand side.