Let $A$ be a nonempty set and let $V$ be the collection of all subsets of $A$, which is a vector space over $GF(2)$. Let $B$ be a fixed subset of $A$ and let $\alpha : V \rightarrow V$ be the endomorphism defined by $\alpha : Y \mapsto Y\cap B.$ Find the eigenvalues of $\alpha$ and, for each eigenvalue, find the associated eigenspace.
Could anyone help me in answering this question?
Ok, let's start with simple things.
The field $GF(2)$ is the set $\{0,1\}$ along with the addition $1+1 = 0+0= 0$ and $1+0 = 0+1 = 1$, and multiplication given by $0 \cdot 0 = 1 \cdot 0 = 0 \cdot 1 = 0$ and $1 \cdot 1 = 1$.
Now, how is $V$ a vector space over $GF(2)$? It follows from what Jean Marie mentioned: each subset of $A$ can be identified with a vector of $0$s and $1$s, recording which elements of $A$ lie in it and which do not. If $A$ has $n$ elements, we can identify $V$ with $GF(2)^n$.
For example, if $A = \{a,b,c\}$, then the element of $V$ corresponding to $\{a\}$ is $(1,0,0)$, because $a \in \{a\}$ but $b, c \notin \{a\}$. Similarly, $\{b,c\}$ corresponds to $(0,1,1)$. Now, under componentwise addition, we get $(1,0,0) + (0,1,1) = (1,1,1)$, so $\{a\} + \{b,c\} = \{a,b,c\}$.
Now, it is easy to see that, as set-theoretic operations, the addition/scalar multiplication on $V$ is given by $$ S + T = (S \cup T) \setminus (S \cap T) = (S \cap T^c) \cup (T \cap S^c) $$ (the union in the last expression is disjoint), and $cS = \emptyset$ if $c = 0$ and $cS = S$ if $c = 1$.
So, we have a vector space structure on $V$. (It remains to verify that the addition follows the vector space axioms, but that follows from the identification with $GF(2)^n$).
So to give an example, $\{a,b\} + \{b,c\} = \{a,c\}$ (because we remove the common element).
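To make this concrete, here is a small Python sketch (the helper names `to_vector` and `add` are mine, not from the question) showing that the addition on $V$ is exactly symmetric difference, and that it matches componentwise XOR on the $0/1$ coordinate vectors:

```python
A = ['a', 'b', 'c']  # fix an ordering of the ground set

def to_vector(S):
    """0/1 coordinate vector of the subset S relative to the ordering of A."""
    return tuple(1 if x in S else 0 for x in A)

def add(S, T):
    """Addition in V: symmetric difference (union minus intersection)."""
    return S ^ T  # Python sets support symmetric difference via ^

S, T = {'a', 'b'}, {'b', 'c'}
print(add(S, T))  # the common element b cancels: {'a', 'c'}
# XOR of the coordinate vectors agrees with to_vector(S + T):
print(tuple(u ^ v for u, v in zip(to_vector(S), to_vector(T))))  # (1, 0, 1)
```

Note that `S ^ T` for Python sets is built-in symmetric difference, so the set picture and the $GF(2)^n$ picture coincide line by line.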
Now, with this, it remains to examine the given operators.
Now fix $B$, and look at $\alpha(Y) = Y \cap B$. Why is this a linear transformation?
We need to check two things: one, that $\alpha(cY) = c\,\alpha(Y)$. This is immediate on separating into the cases $c=0$ and $c = 1$.
The second is that $\alpha(S + T) = \alpha(S) + \alpha(T)$. $$ \alpha(S) + \alpha(T) = ((S\cap B) \cup (T\cap B)) \setminus ((S \cap B)\cap (T \cap B)) = ((S \cup T) \cap B) \setminus (S \cap T \cap B) \\ = [(S \cup T) \setminus (S \cap T)] \cap B = \alpha(S+T) $$
where the third equality can be checked in the usual way we check set containment.
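If you want to sanity-check this additivity computationally, here is a brute-force Python verification over every pair of subsets (a sketch with a small concrete $A$ and $B$; the names are illustrative):

```python
from itertools import combinations

A = {'a', 'b', 'c'}
B = {'b', 'c'}

def powerset(X):
    """All subsets of X, i.e. the whole vector space V."""
    X = list(X)
    return [set(c) for r in range(len(X) + 1) for c in combinations(X, r)]

def add(S, T):
    return S ^ T  # GF(2) addition = symmetric difference

def alpha(Y):
    return Y & B  # the map Y -> Y ∩ B

# Exhaustively confirm alpha(S + T) == alpha(S) + alpha(T) on all of V.
assert all(alpha(add(S, T)) == add(alpha(S), alpha(T))
           for S in powerset(A) for T in powerset(A))
print("alpha is additive on all", len(powerset(A)) ** 2, "pairs")
```

Of course the set-identity proof above is the real argument; this just confirms it on one example.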
Thus, we have a linear transformation. Now, since the only scalars in $GF(2)$ are $0$ and $1$, the only possible eigenvalues are $c=0$ and $c=1$.
But eigenvalues are very easy to spot.
For example, if $\alpha(Y) = 0Y = \emptyset$, then $Y \cap B = \emptyset$, i.e. $Y \subseteq B^c$. Conversely, if $Y \cap B = \emptyset$ then of course $\alpha(Y) = \emptyset$. Thus the eigenspace of $0$ is the subspace of $V$ consisting of all subsets of $B^c$.
Similarly, suppose $\alpha(Y) = 1Y = Y$; then $Y \cap B = Y$, so $Y \subseteq B$. Conversely, if $Y \subseteq B$ then $\alpha(Y) = Y$. It follows that the eigenspace of $1$ is the subspace of $V$ consisting of all subsets of $B$.
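The two eigenspace descriptions can be checked directly by filtering the power set (again a Python sketch with a concrete choice of $A$ and $B$, names mine):

```python
from itertools import combinations

A = {'a', 'b', 'c'}
B = {'b', 'c'}

def powerset(X):
    X = list(X)
    return [set(c) for r in range(len(X) + 1) for c in combinations(X, r)]

def alpha(Y):
    return Y & B

# Eigenspace of 0: alpha(Y) = ∅, i.e. Y misses B entirely (Y ⊆ B^c).
E0 = [Y for Y in powerset(A) if alpha(Y) == set()]
# Eigenspace of 1: alpha(Y) = Y, i.e. Y ⊆ B.
E1 = [Y for Y in powerset(A) if alpha(Y) == Y]

print(E0)  # the subsets of B^c = {'a'}
print(E1)  # the subsets of B = {'b', 'c'}
```

With $|B| = 2$ and $|B^c| = 1$ here, the eigenspaces have dimensions $2$ and $1$, so they span all of $V$: in fact $\alpha$ is diagonalizable, as the matrix below makes visible.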
Take an example to clarify: let $A = \{a,b,c\}$. The sets $\{a\},\{b\},\{c\}$ form a basis of $V$ under addition (really easy to see, because the addition becomes union for disjoint sets). So, to form the matrix of $\alpha$ with, say, $B = \{b,c\}$, we need to look at $\alpha(\{a\}) = \emptyset$, $\alpha(\{b\}) = \{b\}$, $\alpha(\{c\}) = \{c\}$. Thus the matrix of $\alpha$ is: $$ \begin{pmatrix} 0 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix} $$
You can see that our $\alpha$ is a diagonal matrix, with a $1$ on the diagonal exactly when the corresponding element lies in $B$. It is easy to see that $\alpha$ will always be diagonal in this basis of singletons, for any choice of $A$ and $B$. From this the eigenvalues and eigenspaces can be read off directly, and this outlook helps in seeing $\alpha$ in a new light.
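The matrix construction above can be reproduced mechanically: column $j$ of the matrix is the coordinate vector of $\alpha$ applied to the $j$-th singleton basis element (a Python sketch, using my own illustrative names):

```python
A = ['a', 'b', 'c']  # ordered ground set; the basis is the singletons {a},{b},{c}
B = {'b', 'c'}

def alpha(Y):
    return Y & B

# Entry (i, j) is 1 iff the i-th element of A lies in alpha({A[j]}).
matrix = [[1 if A[i] in alpha({A[j]}) else 0 for j in range(len(A))]
          for i in range(len(A))]
for row in matrix:
    print(row)
# Diagonal, with a 1 exactly where the basis element lies in B:
# [0, 0, 0]
# [0, 1, 0]
# [0, 0, 1]
```

Since $\alpha(\{x\})$ is either $\{x\}$ (when $x \in B$) or $\emptyset$ (when $x \notin B$), every column is a standard basis vector or zero, which is why the matrix comes out diagonal for any $A$ and $B$.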