Let $M$ be a boolean matrix, and let $M'$ be the matrix formed by replacing all the 0's of $M$ with $-1$. If $r = rank(M)$, then why is $r-1 \leq rank(M') \leq r$?
I saw this result in a paper I was reading, and the author wrote that it was obvious. What I know is that if $v$ and $w$ are complementary boolean vectors (that is, $w$ is obtained from $v$ by flipping every entry, so in particular $v \cdot w = 0$), then $v' = -w'$, where $v'$ and $w'$ denote the vectors obtained by replacing the 0 entries with $-1$. So complementary boolean vectors become scalar multiples of each other under this transformation, which can shrink the rank by 1. It is also easy to show that a linearly independent set cannot contain two distinct pairs of complementary boolean vectors.
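For concreteness, this observation is easy to check numerically. Here is a quick numpy sketch (the variable names are just for illustration):

```python
import numpy as np

# A boolean vector and its complement (every entry flipped)
v = np.array([1, 0, 1, 0])
w = 1 - v  # complement of v, so in particular v . w = 0

# Replace the 0 entries with -1: x -> 2x - 1 maps 1 -> 1 and 0 -> -1
v_prime = 2 * v - 1
w_prime = 2 * w - 1

# Complementary boolean vectors become negatives of each other
print(np.array_equal(v_prime, -w_prime))  # True
```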
Given this, I've tried to prove that any linearly dependent set of $\pm 1$ vectors contains two vectors $v, w$ with $v = -w$, but I haven't been able to prove this either.
Is this the right approach? Is there some fact about rank that does indeed make this obvious?
This "obvious" result is incorrect as stated. In general, we have
$$ r - 1 \leq \operatorname{rank}(M') \leq r + 1. $$

To see why, note that we can write
$$ M' = 2M - J, $$
where $J$ is the all-ones matrix with the same shape as $M$; in particular, $J$ has rank $1$. Applying the general inequality $\operatorname{rank}(A + B) \leq \operatorname{rank}(A) + \operatorname{rank}(B)$ in both directions gives
$$ \operatorname{rank}(2M - J) \leq \operatorname{rank}(2M) + \operatorname{rank}(-J) = r + 1, $$
$$ \operatorname{rank}(2M) \leq \operatorname{rank}(2M - J) + \operatorname{rank}(J) \implies \operatorname{rank}(2M - J) \geq r - 1. $$

Both bounds are attained:
$$ M = \begin{pmatrix}1&0\\0&1\end{pmatrix} \implies M' = \begin{pmatrix}1&-1\\-1&1\end{pmatrix}, \quad \operatorname{rank}(M') = \operatorname{rank}(M) - 1, $$
$$ M = \begin{pmatrix}1&0\\0&0\end{pmatrix} \implies M' = \begin{pmatrix}1&-1\\-1&-1\end{pmatrix}, \quad \operatorname{rank}(M') = \operatorname{rank}(M) + 1. $$
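If you want to convince yourself numerically, here is a short numpy sketch checking both equality examples and the bound on random boolean matrices (the helper names `flip` and `rank` are my own):

```python
import numpy as np

def rank(A):
    return np.linalg.matrix_rank(A)

def flip(M):
    # Replace the 0's of M with -1's; equivalently M' = 2M - J
    return 2 * M - np.ones_like(M)

# Lower bound attained: the rank drops by 1
M1 = np.array([[1, 0], [0, 1]])
print(rank(M1), rank(flip(M1)))  # 2 1

# Upper bound attained: the rank increases by 1
M2 = np.array([[1, 0], [0, 0]])
print(rank(M2), rank(flip(M2)))  # 1 2

# Spot-check r - 1 <= rank(M') <= r + 1 on random boolean matrices
rng = np.random.default_rng(0)
for _ in range(1000):
    M = rng.integers(0, 2, size=(4, 5))
    r, rp = rank(M), rank(flip(M))
    assert r - 1 <= rp <= r + 1
```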