Consider a $2^n \times 2^n$ binary matrix $M$, whose cells span from $M_{0, 0}$ to $M_{2^n-1, 2^n-1}$. $M_{i, j}$ is $1$ if and only if $i \mid j = i + j$ (here $\mid$ denotes the bitwise OR operation). Equivalently, if you view each index $i$ and $j$ as a subset of an $n$-element set (via its binary representation), then $M_{i, j} = 1$ if and only if the two subsets represented by $i$ and $j$ are disjoint.
For example, when $n=2$, the matrix looks as follows: $$ \left[ \begin{array}{c|cccc} & 0 & 1 & 2 & 3\\ \hline 0 & 1 & 1 & 1 & 1\\ 1 & 1 & 0 & 1 & 0\\ 2 & 1 & 1 & 0 & 0\\ 3 & 1 & 0 & 0 & 0\\ \end{array} \right] $$ (note that the matrix is annotated with its row and column indices -- the actual matrix is the bottom-right $4 \times 4$ part).
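To make the definition concrete, here is a short Python sketch (the helper name `disjointness_matrix` is mine, not from the book) that builds $M$ for a given $n$:

```python
def disjointness_matrix(n):
    """Build the 2^n x 2^n 0/1 matrix with entry (i, j) = 1 iff i & j == 0.

    i | j == i + j holds exactly when i and j share no set bits, i.e.
    when the subsets encoded by i and j are disjoint.
    """
    size = 2 ** n
    return [[1 if (i & j) == 0 else 0 for j in range(size)]
            for i in range(size)]

# The n = 2 case reproduces the 4 x 4 matrix shown above.
for row in disjointness_matrix(2):
    print(row)
```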
It is implied in the book Communication Complexity (Kushilevitz and Nisan) that the rank of this matrix $M$ is $2^n$. But I don't think this is trivial -- how do you prove this?
You can prove this by induction on $n$. Let $P_n$ denote the power set of $\{0, 1, \ldots, n-1 \}$.
Order the subsets by the integers that encode them; equivalently, compare sets lexicographically after writing each set in descending order. E.g. for $n=3$, $$ \varnothing \prec \{0\} \prec \{1\} \prec \{1,0\} \prec \{2\} \prec \{2,0\} \prec \{2,1\} \prec \{2,1,0\}. $$
Notice that the sets in the first half of the list don't contain $n-1$; hence they are exactly the sets of $P_{n-1}$. Furthermore, there is a one-to-one correspondence between the first half of the list ($P_{n-1}$) and the second half ($P_n \setminus P_{n-1}$), given by $$ X \;\longleftrightarrow\; X \cup \{n-1\}. $$
The following claim is crucial and the proof is straightforward.
Claim: For any $X, Y \in P_{n-1}$, $$ X \cap Y = \varnothing \;\Longleftrightarrow\; X \cap (Y \cup \{n-1\}) = \varnothing. $$ (Both directions are immediate because $n-1 \notin X$.)
The consequence of the claim is that the $2^n \times 2^n$ matrix $M_n$ indicating disjointness in $P_n$ is actually a $2 \times 2$ block matrix with equal-sized $2^{n-1} \times 2^{n-1}$ blocks, three of which are identical; the bottom-right block is zero because any two sets in $P_n \setminus P_{n-1}$ both contain $n-1$ and hence intersect. $$ M_n = \left[ \begin{array}{c|c} M_{n-1} & M_{n-1} \\ \hline M_{n-1} & 0 \end{array} \right] $$
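The block structure can be checked directly, since the ordering above is just the natural integer ordering of the encodings -- a sketch in Python (with numpy, and a helper `M` of my own naming):

```python
import numpy as np

def M(n):
    # Entry (i, j) is 1 iff the bit sets of i and j are disjoint.
    size = 2 ** n
    return np.array([[(i & j) == 0 for j in range(size)]
                     for i in range(size)], dtype=int)

n = 4
h = 2 ** (n - 1)          # half the side length
Mn, Mprev = M(n), M(n - 1)

# Three blocks equal M_{n-1}; the bottom-right block is all zeros.
assert (Mn[:h, :h] == Mprev).all()
assert (Mn[:h, h:] == Mprev).all()
assert (Mn[h:, :h] == Mprev).all()
assert (Mn[h:, h:] == 0).all()
```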
By the inductive hypothesis, the columns of $M_{n-1}$ are linearly independent (as $M_{n-1}$ has full rank). Now suppose we had a linear relation among the columns of $M_n$, with coefficient vector $a$ on the left half of the columns and $b$ on the right half. The bottom row of blocks gives $M_{n-1}\,a = 0$, so $a = 0$ by the hypothesis; the top row of blocks then gives $M_{n-1}\,b = 0$, so $b = 0$ as well. Only the trivial relation exists, so we may conclude that $M_n$ has full rank, too.
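As a sanity check of the full-rank claim, one can compute the rank over the reals numerically for small $n$ (a sketch with numpy; `M` is my own helper name):

```python
import numpy as np

def M(n):
    # Entry (i, j) is 1 iff the bit sets of i and j are disjoint.
    size = 2 ** n
    return np.array([[(i & j) == 0 for j in range(size)]
                     for i in range(size)], dtype=int)

# The rank should equal 2^n, i.e. the matrix is nonsingular.
for n in range(1, 6):
    assert np.linalg.matrix_rank(M(n)) == 2 ** n
```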
By the way, the matrix is a cropped piece of Sierpinski's triangle, which may be seen by interpreting the $(i,j)$-entry of $M_n$ as $$ \binom{i+j}{i} \bmod{2}: $$ by Kummer's theorem, $\binom{i+j}{i}$ is odd exactly when the binary addition of $i$ and $j$ involves no carries, i.e. exactly when $i$ and $j$ are disjoint as bit sets.
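A quick Python sketch confirming that the parity of $\binom{i+j}{i}$ agrees with the disjointness condition for small indices:

```python
from math import comb

size = 2 ** 4
for i in range(size):
    for j in range(size):
        # No binary carries in i + j  <=>  i & j == 0  <=>  C(i+j, i) is odd.
        assert ((i & j) == 0) == (comb(i + j, i) % 2 == 1)
```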