Eigenvalues of matrices over finite fields


I apologize in advance if this is trivial, but I am a bit confused here.

So consider the finite field $\mathbb{F}_{p^d}$ over the prime field $\mathbb{F}_p$. We can associate with every element $\alpha \in \mathbb{F}_{p^d}$ the mapping $F_{\alpha}:\mathbb{F}_{p^d} \to \mathbb{F}_{p^d}$ defined for $y \in \mathbb{F}_{p^d}$ by $$F_{\alpha}(y)=\alpha y.$$ It is easy to show that $F_{\alpha}$ is a linear transformation, and hence an endomorphism of $\mathbb{F}_{p^d}$ as an $\mathbb{F}_{p}$-vector space. In fact, this gives us a canonical embedding $\alpha \mapsto F_{\alpha}$ of the field $\mathbb{F}_{p^d}$ into the ring $\operatorname{End}(\mathbb{F}_{p^d})$.
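To make this concrete (this sketch is not part of the original question), here is a hand-rolled model of the case $p=3$, $d=2$: the field $\mathbb{F}_9 = \mathbb{F}_3[i]/(i^2+1)$, with elements stored as pairs $(a,b)$ representing $a+bi$, and a helper `mult_matrix` (a hypothetical name) that builds the matrix of $F_\alpha$ in the basis $\{1, i\}$.

```python
# Sketch: the embedding alpha -> F_alpha for F_9 = F_3[i]/(i^2 + 1).
# Elements are pairs (a, b) representing a + b*i, coefficients mod 3.
p = 3

def mul(u, v):
    """Multiply (a + b*i)(c + d*i) in F_9, using i^2 = -1."""
    a, b = u
    c, d = v
    return ((a * c - b * d) % p, (a * d + b * c) % p)

def mult_matrix(alpha):
    """Matrix of F_alpha(y) = alpha * y in the basis {1, i}:
    the images of the basis vectors become the columns."""
    cols = [mul(alpha, e) for e in [(1, 0), (0, 1)]]
    return [[cols[j][k] for j in range(2)] for k in range(2)]

print(mult_matrix((0, 1)))  # matrix of multiplication by i
```

One can check that $\alpha \mapsto$ `mult_matrix(alpha)` respects addition and multiplication, which is exactly the embedding into $\operatorname{End}(\mathbb{F}_9)$ described above.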

My question is: since $F_{\alpha}(y)=\alpha y$ for every $y \in \mathbb{F}_{p^d}$, can we conclude that $\alpha \in \overline{\mathbb{F}_p}$ is the only eigenvalue of $F_{\alpha}$, with the whole of $\mathbb{F}_{p^d}$ as its eigenspace?

I know that this is wrong, but I am unable to pinpoint the exact mistake in the reasoning. For instance, we know from the Cayley-Hamilton theorem that $F_{\alpha}$ (or any matrix representing it after a choice of basis) is a root of its own characteristic polynomial. Since $F_{\alpha}$ is the image of the field element $\alpha$ under the embedding above, the minimal polynomial of $\alpha$ over $\mathbb{F}_p$ must divide the characteristic polynomial of $F_{\alpha}$. This would introduce all the conjugates of $\alpha$ as eigenvalues of $F_{\alpha}$ too.

But why is the trivial conclusion that every vector is an eigenvector for the eigenvalue $\alpha$ wrong? After all, $F_{\alpha}(y)=\alpha y$. This whole business of treating field elements as vectors, and then treating field multiplication as multiplication of vectors by a scalar (in the closure) is very confusing!

My ultimate aim is to establish that the characteristic polynomial of $F_{\alpha}$ is a power of the minimal polynomial of $\alpha$ as a field element. I'd be grateful if someone could give a clear picture of how to see this.
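As a sanity check of that claim (my addition, not part of the question), one can verify a small instance with sympy: take $\alpha = 2 \in \mathbb{F}_3 \subset \mathbb{F}_9$, whose minimal polynomial over $\mathbb{F}_3$ is linear, and compare it with the characteristic polynomial of $F_\alpha$ computed mod 3.

```python
# Sketch: check char(F_alpha) = (min poly of alpha)^(d/deg) for alpha = 2 in F_3 ⊂ F_9.
from sympy import symbols, Matrix, Poly

x = symbols('x')
# F_2 is scalar multiplication by 2 in the basis {1, i}, so its matrix over F_3 is 2*I.
F2 = Matrix([[2, 0], [0, 2]])
char = Poly(F2.charpoly(x).as_expr(), x, modulus=3)
minpoly = Poly(x - 2, x, modulus=3)  # 2 lies in F_3, so its minimal polynomial is linear
assert char == minpoly ** 2          # exponent d/deg = 2/1 = 2
```

Here the exponent is $d/\deg(\alpha) = 2$, matching the general statement I am trying to establish.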




Let's look at a tiny example. The field $K$ of 9 elements can be viewed as all $a+bi$, with $a$ and $b$ coming from the field $F$ of 3 elements and $i^2=-1$. As an $F$-vector space, $K$ has a basis $\{\,1,i\,\}=\{\,(1,0),(0,1)\,\}$. Multiplication by $i$ is multiplication by the matrix $$\begin{pmatrix}0&2\\1&0\end{pmatrix}$$ This matrix has eigenvalue $i$ with eigenvector $\begin{pmatrix}2\\i\end{pmatrix}$, and eigenvalue $-i$ with eigenvector $\begin{pmatrix}1\\i\end{pmatrix}$.
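The eigenvector claim can be verified mechanically (a sketch I am adding, with field elements of $\mathbb{F}_9$ again encoded as pairs $(a,b)$ meaning $a+bi$ mod 3):

```python
# Sketch: verify in F_9 = F_3(i) that (2, i) is an eigenvector of [[0,2],[1,0]]
# for the eigenvalue i. Elements of F_9 are pairs (a, b) meaning a + b*i, mod 3.
p = 3

def mul(u, v):
    """(a + b*i)(c + d*i) with i^2 = -1, coefficients mod 3."""
    a, b = u
    c, d = v
    return ((a * c - b * d) % p, (a * d + b * c) % p)

def add(u, v):
    return ((u[0] + v[0]) % p, (u[1] + v[1]) % p)

A = [[(0, 0), (2, 0)], [(1, 0), (0, 0)]]  # the matrix, entries viewed in F_9
i, v = (0, 1), [(2, 0), (0, 1)]           # eigenvalue i, candidate eigenvector (2, i)
Av = [add(mul(A[r][0], v[0]), mul(A[r][1], v[1])) for r in range(2)]
assert Av == [mul(i, w) for w in v]       # A v = i v holds in F_9
```

Note that the eigenvector has a coordinate in $\mathbb{F}_9 \setminus \mathbb{F}_3$: the kernel computation is happening over the larger field.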

Maybe pondering this example will clear up your confusion.


The problem is that you implicitly change the vector space when you allow eigenvalues from an algebraic closure.

Let us go back to something that may be more familiar. Consider the complex numbers $\mathbb{C}$ as a $2$-dimensional real vector space.

Then $F_i$, multiplication by the complex unit $i$, corresponds to the endomorphism of $\mathbb{R}^2$ that sends $(x,y)$ to $(-y,x)$.

So the associated matrix is

$$\begin{pmatrix} 0 & -1 \\ 1 & 0\end{pmatrix}$$

The minimal polynomial is $X^2 +1$.

There is no real eigenvalue, but there are complex ones, namely $i$ and $-i$.
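This is easy to confirm numerically (my addition, not the answerer's):

```python
# Sketch: numerically confirm that [[0,-1],[1,0]] has eigenvalues i and -i.
import numpy as np

F_i = np.array([[0.0, -1.0], [1.0, 0.0]])
eig = sorted(np.linalg.eigvals(F_i), key=lambda z: z.imag)
assert np.allclose(eig, [-1j, 1j])  # no real eigenvalues; the complex ones are -i and i
```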

Now, if one wants to compute an eigenspace associated to $i$, what would one do?

The quick answer might be: one computes the kernel of $F_i - i \operatorname{id}$, that is, we solve

$$\begin{pmatrix} -i & -1 \\ 1 & -i\end{pmatrix}\begin{pmatrix} v_1 \\ v_2 \end{pmatrix} = 0 $$

However, what will be the domain for $v_1,v_2$? We need to allow $v_1$ and $v_2$ each to be a complex number. That is, we consider the problem in the $2$-dimensional complex vector space, not the $2$-dimensional real vector space.
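A quick symbolic check (my addition): sympy's `nullspace` computes the kernel of $F_i - i\operatorname{id}$ over $\mathbb{C}$, and the resulting vector, with complex entries, is indeed an eigenvector of $F_i$ for the eigenvalue $i$.

```python
# Sketch: find ker(F_i - i*id) over C with sympy; the entries v1, v2 come out complex.
from sympy import Matrix, I

F = Matrix([[0, -1], [1, 0]])
K = F - I * Matrix.eye(2)   # = [[-i, -1], [1, -i]]
v = K.nullspace()[0]        # the kernel is one-dimensional over C
assert F * v == I * v       # v is an eigenvector of F_i for the eigenvalue i
```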

To sum it up: for the original vector space, the endomorphism has no eigenvalues at all. Only when we pass to a 'larger' vector space and consider the extension of that endomorphism do we get eigenvalues.

Also note that the endomorphism extended from "multiplication by $\alpha$" is not the same as the one obtained by scalar multiplication by $\alpha$ on the larger vector space.
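That distinction can also be seen concretely (a sketch of my own): on $\mathbb{C}^2$, the extension of $F_i$ is still the matrix $\begin{pmatrix}0&-1\\1&0\end{pmatrix}$, while scalar multiplication by $i$ is $i$ times the identity; the two maps differ, and agree only on the $i$-eigenspace.

```python
# Sketch: on C^2, the extension of F_i is still [[0,-1],[1,0]], while scalar
# multiplication by i is i times the identity -- two different maps.
import numpy as np

F_i_ext = np.array([[0, -1], [1, 0]], dtype=complex)
scalar_i = 1j * np.eye(2)
assert not np.allclose(F_i_ext, scalar_i)
# They agree only on the eigenspace of F_i for eigenvalue i, e.g. v = (1, -i):
v = np.array([1.0, -1j])
assert np.allclose(F_i_ext @ v, scalar_i @ v)
```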