I'm trying to prove that if random variables $X_1, \dots, X_n$ are pairwise independent and none of them is almost surely constant, then they are linearly independent.
I tried to do it by contradiction: suppose some nontrivial linear combination $\sum_i c_i X_i$ vanishes almost surely, and WLOG $c_1 \neq 0$. Then $X_1 = \frac{\sum_{i>1} c_i X_i}{-c_1}$, and at least one other $c_i \neq 0$, because otherwise $P(X_1 = 0) = 1$, contradicting that $X_1$ is not almost surely constant. So $X_1$ is determined by a linear combination of the other variables. But I don't know how to get from that to showing that $X_1$ fails to be independent of some particular $X_i$. I thought about fixing the values of all the variables other than $X_1$ and one $X_i$ with $c_i \neq 0$; then $X_1$ is conditionally dependent on $X_i$, and this holds for every choice of the fixed values. But I don't think that implies $X_1$ and $X_i$ are not (unconditionally) independent.
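For reference, here is the contradiction hypothesis written out as displayed equations (assuming "linearly dependent" means a nontrivial combination that vanishes almost surely):

```latex
% Hypothesis for contradiction: a nontrivial vanishing combination,
% with the nonzero coefficient taken to be c_1 WLOG.
\[
  \sum_{i=1}^{n} c_i X_i = 0 \quad \text{a.s.},
  \qquad (c_1, \dots, c_n) \neq (0, \dots, 0),
  \qquad \text{WLOG } c_1 \neq 0,
\]
% Rearranging expresses X_1 through the remaining variables:
\[
  X_1 = -\frac{1}{c_1} \sum_{i=2}^{n} c_i X_i \quad \text{a.s.}
\]
```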