When do we have $X^p = I$, where $X$ has entries from a finite field?


Say I consider only a $2\times 2$ matrix $X$ with entries from a finite field $\mathbb{F}_p$. When do we have $X^p = I$?

I looked into $\mathbb{F}_2$ and found the following (excluding $I$ itself):

$$\begin{pmatrix} 1&1\\ 0&1 \end{pmatrix}, $$

What about $\mathbb{F}_p$ in general? I know that $x^p - 1 = (x-1)^p$, but is there a way to relate that to matrices?

Update: I forgot to mention that I would also like to count the number of such $2\times 2$ matrices.
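For concreteness, a brute-force enumeration (a quick Python sketch, not part of the original question) finds every such matrix over $\mathbb{F}_2$:

```python
from itertools import product

# Enumerate all 2x2 matrices X over F_2 with X^2 = I.
p = 2
solutions = []
for a, b, c, d in product(range(p), repeat=4):
    # entries of X^2 modulo p, for X = [[a, b], [c, d]]
    sq = ((a*a + b*c) % p, (a*b + b*d) % p,
          (c*a + d*c) % p, (c*b + d*d) % p)
    if sq == (1, 0, 0, 1):
        solutions.append(((a, b), (c, d)))
for s in solutions:
    print(s)
```

Besides $I$ and the matrix above, this also finds $\begin{pmatrix}1&0\\1&1\end{pmatrix}$ and $\begin{pmatrix}0&1\\1&0\end{pmatrix}$.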

There are 3 solutions below.

Solution 1

Consider an algebraic extension $E$ of $\mathbb{F}_p$ where the characteristic equation of $A$ has roots.

Since $A$ is $2 \times 2$, $E$ is an extension of $\mathbb{F}_p$ of degree $1$ or $2$, and therefore has $p$ or $p^2$ elements.

Now, if $A$ were diagonalisable over $E$, its eigenvalues would lie in $E^\times$ (as $A^p=I$ forces $A$ to be invertible), so $\lambda_{1,2}^{p^2-1}=1$, which means that $A^{p^2-1}=I$. Since $\gcd(p,\,p^2-1)=1$, combining this with $A^p=I$ gives $A=I$.

Therefore, the only diagonalisable matrix which has this property is $A=I$.

Next, let us look at the non-diagonalisable possibilities.

We must have $\lambda_1=\lambda_2$, which means that for $p \neq 2$ we have $$\lambda_1=\lambda_2=2^{-1}\operatorname{tr}(A) \in \mathbb{F}_p$$

Then $A^p=I \Rightarrow \lambda^p=1 \Rightarrow \lambda =1$.

This shows that the only eigenvalues of $A$ must be $1$ and $1$.

Conversely, any matrix which has $1,1$ as eigenvalues satisfies $$(A-I)^2=0$$ by Cayley–Hamilton, and therefore $$(A-I)^{p}=0$$ Since $A$ and $I$ commute, $(A-I)^p = A^p - I$ in characteristic $p$, so $A^p=I$.

So the problem boils down to finding, for $p \neq 2$, all the $2 \times 2$ matrices with eigenvalues $1$ and $1$.

This can be done either by counting how many $2 \times 2$ matrices have Jordan form $\begin{bmatrix} 1 & 1 \\ 0&1\end{bmatrix}$ or by solving the system $$a+d=2 \\ ad-bc=1$$ The system gives $d=2-a$ and $bc=-(a-1)^2$, which has $2p-1$ solutions $(b,c)$ when $a=1$ and $p-1$ solutions when $a\neq 1$, for a total of $p^2$ matrices (including $I$).
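As a sanity check, one can enumerate all $2\times 2$ matrices over $\mathbb{F}_p$ for small $p$ and compare the two counts directly (a brute-force Python sketch; the function names are mine):

```python
from itertools import product

def mat_mul(A, B, p):
    """Multiply two 2x2 matrices modulo p."""
    return tuple(
        tuple(sum(A[i][k] * B[k][j] for k in range(2)) % p for j in range(2))
        for i in range(2)
    )

def mat_pow(A, n, p):
    """Raise a 2x2 matrix to the n-th power modulo p by repeated squaring."""
    R = ((1, 0), (0, 1))  # identity
    while n:
        if n & 1:
            R = mat_mul(R, A, p)
        A = mat_mul(A, A, p)
        n >>= 1
    return R

def count_solutions(p):
    """Count 2x2 matrices X over F_p with X^p = I, and separately count
    the matrices with tr(X) = 2 and det(X) = 1."""
    I = ((1, 0), (0, 1))
    power_count = 0
    char_poly_count = 0
    for a, b, c, d in product(range(p), repeat=4):
        X = ((a, b), (c, d))
        if mat_pow(X, p, p) == I:
            power_count += 1
        if (a + d) % p == 2 % p and (a * d - b * c) % p == 1:
            char_poly_count += 1
    return power_count, char_poly_count

for p in (2, 3, 5):
    print(p, count_solutions(p))
```

Both counts agree and equal $p^2$ (including $I$) for each prime tested.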

Solution 2

If you have $X^p=I$, then all eigenvalues of $X$ are necessarily $p$-th roots of unity. But since, as you have stated, $x^p-1=(x-1)^p$ in characteristic $p$, the only $p$-th root of unity is $1$, so the only eigenvalue is $1$, and the characteristic polynomial of $X$ is $(\lambda-1)^2$.

Since the characteristic polynomial decomposes into linear terms over ${\bf F}_p$, you can put $X$ into Jordan normal form, which (for $2\times 2$ matrices) will be either the identity or $A=\begin{pmatrix} 1&1\\ 0&1 \end{pmatrix}$ (these are the only $2\times 2$ Jordan normal forms of matrices with eigenvalues equal to $1$). You can easily check that $I^p=A^p=I$, so these are the only ones up to conjugation, so all such $X$ are either $I$ or $PAP^{-1}$ for some $P\in \operatorname{GL}_2({\bf F}_p)$.

This can be generalised to larger dimensions: for the same reasons, if $X$ is $n\times n$, the characteristic polynomial will be $(\lambda-1)^n$. It is easy to check that the Jordan blocks in the normal form can have size up to $p\times p$ (for a larger block, the off-diagonal terms do not vanish when the block is raised to the $p$-th power), and any such matrix will satisfy your equation.
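To illustrate the size bound on the Jordan blocks, here is a small Python sketch (the function name is mine) that raises the $n\times n$ Jordan block with eigenvalue $1$ to the $p$-th power over $\mathbb{F}_p$:

```python
def jordan_pth_power_is_identity(n, p):
    """Return True iff the n x n Jordan block J with eigenvalue 1
    satisfies J^p = I over F_p."""
    # J has ones on the diagonal and on the superdiagonal
    J = [[1 if j in (i, i + 1) else 0 for j in range(n)] for i in range(n)]
    I = [[int(i == j) for j in range(n)] for i in range(n)]

    def mul(X, Y):
        return [[sum(X[i][k] * Y[k][j] for k in range(n)) % p
                 for j in range(n)] for i in range(n)]

    R = I
    for _ in range(p):  # compute J^p by repeated multiplication
        R = mul(R, J)
    return R == I

# Over F_3, blocks of size up to 3 work, larger ones do not.
print([jordan_pth_power_is_identity(n, 3) for n in range(1, 6)])
```

This matches the claim: $J^p=I$ exactly when the block size is at most $p$.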

Solution 3

Sure, $X^p=\mathrm{Id} \iff (X-\mathrm{Id})^p=0$, since $X^p-\mathrm{Id}=(X-\mathrm{Id})^p$ in characteristic $p$. So find solutions $Y$ to $Y^p=0$.

Now, the chain of subspaces $V,YV,Y^2V,\cdots$ must eventually reach $0$, and must decrease in dimension at every step (else it would stabilize before reaching $0$), so the dimensions go either $2,1,0$ or $2,0$. The latter case occurs exactly when $Y=0$, so suppose we're in the first case. The image $YV$ will be generated by some column vector $(a,b)^T$. Then the columns of $Y$ are multiples of $(a,b)^T$, so for some scalars $u,v$ we have

$$ Y=\begin{pmatrix} ua & va \\ ub & vb \end{pmatrix}=\begin{pmatrix}a \\ b \end{pmatrix} \begin{pmatrix} u & v \end{pmatrix}. $$

Every singular $2\times 2$ matrix has this form, since determinant zero implies the columns are linearly dependent, i.e. scalar multiples of each other. The condition $Y(a,b)^T=0$ is equivalent to $(a,b)\cdot(u,v)=0$, which is equivalent to $Y$ having trace equal to $0$.

This yields the following characterization: the $Y$ are precisely the $2\times 2$ matrices with both determinant and trace equal to zero.
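This equivalence (for $2\times 2$ matrices, $Y^2=0$ iff $\operatorname{tr}Y=\det Y=0$, which also follows from Cayley–Hamilton) can be checked exhaustively for small fields; a Python sketch, with `check` my own name:

```python
from itertools import product

def check(p):
    """Verify: a 2x2 matrix Y over F_p satisfies Y^2 = 0
    iff tr(Y) = 0 and det(Y) = 0."""
    for a, b, c, d in product(range(p), repeat=4):
        # entries of Y^2 modulo p, for Y = [[a, b], [c, d]]
        sq = ((a*a + b*c) % p, (a*b + b*d) % p,
              (c*a + d*c) % p, (c*b + d*d) % p)
        square_zero = sq == (0, 0, 0, 0)
        tr_det_zero = (a + d) % p == 0 and (a*d - b*c) % p == 0
        if square_zero != tr_det_zero:
            return False
    return True

print(all(check(p) for p in (2, 3, 5, 7)))
```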

Any two solutions $(u,v)$ to $au+bv=0$ will be related by a constant multiple (exercise), so we can say $(u,v)=c(-b,a)$ for some nonzero scalar $c$. This gives

$$Y=\begin{pmatrix} -abc & a^2c \\ -b^2c & abc\end{pmatrix}.$$

This parametrizes all nonzero $Y$, with some redundancy: one may scale $(a,b)$ by $r$ and $c$ by $r^{-2}$ without changing the value of $Y$. This identifies the set of nonzero $Y$ with

$$ \big((\mathbb{F}_q^2\setminus0)\times\mathbb{F}_q^\times\big)/\mathbb{F}_q^\times.$$

As $\mathbb{F}_q^\times$ acts freely on $\mathbb{F}_q^2\setminus 0$, it does so on $(\mathbb{F}_q^2\setminus0)\times\mathbb{F}_q^\times$ too, so the number of nonzero solutions is

$$ \frac{(q^2-1)(q-1)}{q-1}=q^2-1. $$

Adding the solution $Y=0$, there are $q^2$ solutions in total (so $p^2$ matrices $X$ with $X^p=I$ over $\mathbb{F}_p$).
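Counting the matrices with trace and determinant both zero directly (a brute-force Python sketch) confirms that there are $q^2$ solutions $Y$ in total, including $Y=0$:

```python
from itertools import product

def count_nilpotent(p):
    """Count 2x2 matrices over F_p with trace 0 and determinant 0,
    i.e. the matrices Y with Y^2 = 0."""
    return sum(1 for a, b, c, d in product(range(p), repeat=4)
               if (a + d) % p == 0 and (a * d - b * c) % p == 0)

for p in (2, 3, 5, 7):
    print(p, count_nilpotent(p))
```

Each prime $p$ yields exactly $p^2$ such matrices, in agreement with the other two solutions.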