Let $S:\mathbb{R}^N\to\{0,1\}^N$ be defined as $$S_i(x)=\begin{cases} 1&\text{ if }x_i\neq0\\ 0&\text{ otherwise}. \end{cases}$$ If $S(x_1)$ and $S(x_2)$ are linearly independent, is it guaranteed that $x_1$ and $x_2$ are also linearly independent? How do we extend this to $N$ vectors?
I can see this intuitively for small vectors. I wrote a small MATLAB script to check it for $N=5,6,\dots,10$, and it works (the version below uses $N=100$ and $m=25$).
count = 0;   % number of failures observed
N = 100;     % ambient dimension
m = 25;      % number of vectors
for i = 1:10000
    S = double(randn(N, m) > 0);   % random 0/1 support patterns
    X = randn(N, m);               % random Gaussian entries
    D = S .* X;                    % columns of D have support pattern S
    if rank(D) < rank(S)           % a failure of the conjectured implication
        count = count + 1;
    end
end
count
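For anyone without MATLAB, here is a rough NumPy equivalent of the same experiment (a sketch: `numpy.linalg.matrix_rank` plays the role of `rank`, and the sizes and trial count are scaled down):

```python
import numpy as np

rng = np.random.default_rng(0)
count = 0                  # failures: support matrix "more independent" than data
N, m, trials = 20, 10, 2000
for _ in range(trials):
    S = (rng.standard_normal((N, m)) > 0).astype(float)  # random 0/1 support patterns
    X = rng.standard_normal((N, m))                      # random Gaussian entries
    D = S * X                                            # columns of D have support pattern S
    if np.linalg.matrix_rank(D) < np.linalg.matrix_rank(S):
        count += 1
print(count)
```

With continuous random entries the masked matrix $D$ generically attains the maximum rank possible for its support pattern, and $S$ itself is one matrix with that support, so `count` comes out $0$.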
How do we formally prove the result?
I interpret the question as: if $\{ S(x_1), S(x_2)\}$ is linearly independent, is it true that $\{x_1, x_2\}$ is linearly independent?
Suppose $c_1 x_1 + c_2 x_2=0$.
If $c_1=0, c_2 \ne 0$, then $x_2=0$, and hence $S(x_2)=0$ and we can't have $\{S(x_1), S(x_2)\}$ linearly independent. Similarly if $c_1\ne 0, c_2=0$.
Suppose now that $c_1 \ne 0$ and $c_2 \ne 0$. Then $x_1 = -\frac{c_2}{c_1}x_2$, so $x_{1j}=0 \iff x_{2j}=0$, and hence $S(x_1)=S(x_2)$, which again contradicts the linear independence of $\{S(x_1), S(x_2)\}$.
Hence we must have $(c_1, c_2)=(0,0)$; that is, $\{x_1, x_2\}$ is linearly independent.
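As a numerical sanity check of the argument's contrapositive: whenever $x_1, x_2$ are dependent with both coefficients nonzero, they are nonzero multiples of one another and so share a support. A small NumPy sketch (the helper `support` is just $S$ above; names are mine):

```python
import numpy as np

def support(x):
    """S(x): 1 where the entry is nonzero, 0 elsewhere."""
    return (x != 0).astype(int)

rng = np.random.default_rng(1)
for _ in range(1000):
    x1 = rng.standard_normal(8)
    x1[rng.random(8) < 0.5] = 0.0        # give x1 a random support pattern
    x2 = rng.uniform(0.5, 2.0) * x1      # a dependent partner: x2 = c * x1, c != 0
    # nonzero multiples share the same support, so S(x1) = S(x2):
    assert np.array_equal(support(x1), support(x2))
```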
Edit:
The result is not true for $3$ vectors.
Consider $x_1 =\begin{bmatrix} 1 \\ 0 \\ -1\end{bmatrix}, x_2 =\begin{bmatrix} 1 \\ 1 \\ 0\end{bmatrix}, x_3 =\begin{bmatrix} 0 \\1 \\ 1\end{bmatrix}$, they are linearly dependent since $x_1=x_2-x_3$.
However $S(x_1)=\begin{bmatrix} 1 \\ 0 \\ 1\end{bmatrix}, S(x_2) =\begin{bmatrix} 1 \\ 1 \\ 0\end{bmatrix}, S(x_3) =\begin{bmatrix} 0 \\1 \\ 1\end{bmatrix}$, and $\{S(x_1), S(x_2), S(x_3)\}$ is linearly independent.
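The counterexample is easy to verify numerically; assuming NumPy, we can check that the data matrix has rank $2$ while the support matrix has full rank $3$:

```python
import numpy as np

x1 = np.array([1.0, 0.0, -1.0])
x2 = np.array([1.0, 1.0, 0.0])
x3 = np.array([0.0, 1.0, 1.0])

assert np.allclose(x1, x2 - x3)                  # x1 = x2 - x3: dependent
Ssup = (np.column_stack([x1, x2, x3]) != 0).astype(int)
rank_sup = np.linalg.matrix_rank(Ssup)
print(rank_sup)  # 3: the support vectors are linearly independent
```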
Edit 2:
It is not true even if we impose the condition that $x_i \ge 0$. Keeping $x_2, x_3$ from the previous counterexample, change $x_1$ to $x_1 = x_2+x_3 = \begin{bmatrix} 1 \\ 2 \\ 1\end{bmatrix}$: the vectors are still dependent, yet $S(x_1)=\begin{bmatrix} 1 \\ 1 \\ 1\end{bmatrix}$, $S(x_2)$, $S(x_3)$ remain linearly independent.
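The same NumPy-style check (a sketch, with variable names chosen here) confirms the nonnegative counterexample:

```python
import numpy as np

x2 = np.array([1.0, 1.0, 0.0])
x3 = np.array([0.0, 1.0, 1.0])
x1 = x2 + x3                                     # = [1, 2, 1], all entries >= 0

# the three vectors are dependent (x1 = x2 + x3):
assert np.linalg.matrix_rank(np.column_stack([x1, x2, x3])) == 2
# ... but their 0/1 support vectors are linearly independent:
Ssup = (np.column_stack([x1, x2, x3]) != 0).astype(int)
assert np.linalg.matrix_rank(Ssup) == 3
```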