Linear independence of indicator functions


Question

Let $X$ be a finite set, let $\mathbf{F}$ be a field. Let $\mathcal{A}$ be a subset of the power set of $X$, i.e. $\mathcal{A}\subseteq\mathcal{P}(X)$. Let $1_A:X\to \mathbf{F}$ denote the indicator function of $A\subseteq X$. When is $\{1_A:A\in\mathcal{A}\}$ linearly independent in the vector space $\mathbf{F}^X$?

Potential difficulty

Unfortunately, this seems to depend on the characteristic of $\mathbf{F}$. For example, if $X=\{a,b,c\}$ and $\mathcal{A}$ consists of all two-element subsets of $X$, then $\{1_A:A\in\mathcal{A}\}$ is dependent when $\mathrm{char}(\mathbf{F})=2$, but independent otherwise.

Partial answer

All I can say so far is that $\mathcal{A}$ cannot contain an inclusion-exclusion family, by which I mean a collection of the form $$\{A,B,A\cap B, A\cup B\}\text{ or}$$ $$\{A,B,C,A\cap B, A\cap C, B\cap C, A\cap B\cap C, A\cup B \cup C\}\text{ and so on.}$$
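This condition is necessary because of the pointwise inclusion-exclusion identity for indicator functions: for any $A,B\subseteq X$, $$1_{A\cup B} = 1_A + 1_B - 1_{A\cap B},$$ which one verifies case by case ($x$ in both sets, in exactly one, or in neither). So if $\mathcal{A}$ contains all of $A$, $B$, $A\cap B$, $A\cup B$, then $$1_A + 1_B - 1_{A\cap B} - 1_{A\cup B} = 0$$ is a nontrivial dependence; since the coefficients are $\pm 1$, it holds over every field, regardless of characteristic.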

Answer

First, notice that the indicator functions of singleton sets always form a basis for the space; e.g. $1_{\{a\}},1_{\{b\}},1_{\{c\}}$ is a basis for $\mathbf{F}^X$ when $X=\{a,b,c\}$, regardless of $\mathbf{F}$. Now say that $X = \{x_1,\dots,x_n\}$.

For a function $f \in \mathbf{F}^X$, we can think of it as the column vector $\begin{bmatrix} f(x_1)\\ \vdots \\ f(x_n) \end{bmatrix}$. Under this identification, if $e_i$ is the $i^{\text{th}}$ standard basis vector of $\mathbf{F}^n$, then $e_i$ corresponds to $1_{\{x_i\}}$.

Using this basis, a set of indicator functions is linearly independent exactly when the corresponding column vectors are linearly independent.

So, we can put these vectors into a matrix and row reduce to test for independence. Taking your example, the corresponding column vectors are $ \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix}$ and $ \begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix}$. To handle all characteristics at once, we reduce working only with integers (never multiplying a row by a non-unit constant). Putting the vectors into the matrix $\begin{bmatrix} 1 & 1 & 0 \\ 1 & 0 & 1 \\ 0 & 1 & 1 \end{bmatrix}$, integer row operations bring it to the echelon form $\begin{bmatrix} 1 & 1 & 0 \\ 0 & -1 & 1 \\ 0 & 0 & 2 \end{bmatrix}$. The pivots are $1, -1, 2$, so the vectors are linearly independent exactly when $2 \neq 0$: in characteristic $2$ the last pivot becomes $0$ and the vectors are linearly dependent, but in every other characteristic they are linearly independent.
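To mechanize this test, here is a small Python sketch (the helper names `indicator_vector` and `rank_mod_p` are mine, not anything from the answer) that builds the indicator vectors for a family of subsets and computes their rank over the prime field $\mathbf{F}_p$ by Gaussian elimination mod $p$:

```python
from itertools import combinations

def indicator_vector(A, X):
    """0/1 vector of the indicator function of subset A, over the ordered ground set X."""
    return [1 if x in A else 0 for x in X]

def rank_mod_p(rows, p):
    """Rank of an integer matrix over F_p (p prime), by Gaussian elimination mod p."""
    M = [[v % p for v in row] for row in rows]
    rank = 0
    for col in range(len(M[0]) if M else 0):
        # find a pivot in this column at or below the current rank
        piv = next((r for r in range(rank, len(M)) if M[r][col] % p != 0), None)
        if piv is None:
            continue
        M[rank], M[piv] = M[piv], M[rank]
        inv = pow(M[rank][col], -1, p)  # modular inverse exists since p is prime
        M[rank] = [(v * inv) % p for v in M[rank]]
        # clear the column above and below the pivot
        for r in range(len(M)):
            if r != rank and M[r][col] != 0:
                M[r] = [(a - M[r][col] * b) % p for a, b in zip(M[r], M[rank])]
        rank += 1
    return rank

# the example from the question: all 2-element subsets of {a, b, c}
X = ['a', 'b', 'c']
family = [set(A) for A in combinations(X, 2)]
vectors = [indicator_vector(A, X) for A in family]

for p in (2, 3, 5):
    r = rank_mod_p(vectors, p)
    print(f"p={p}: rank {r} of {len(vectors)}")
```

For this family the rank drops to $2$ only at $p=2$, matching the hand computation above: the family is dependent in characteristic $2$ and independent over $\mathbf{F}_3$, $\mathbf{F}_5$, etc.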