Square matrix with rational coefficients having $k$-th root


Let $A\in M_{n}(\mathbb{Q})$, meaning that $A$ is an $n\times n$ matrix with entries in the rational numbers $\mathbb{Q}$. Suppose that $A$ satisfies two conditions:

  • $\det(A)\neq 0$
  • For every positive integer $k$, there exists a $B\in M_{n}(\mathbb{Q})$ such that $B^k = A$.

Does it follow that $A = I_n$? Here, $I_n$ is the $n\times n$ identity matrix.

Motivation. The analogous problem where $\mathbb{Q}$ everywhere is replaced by $\mathbb{Z}$ has an affirmative answer. More precisely, let's prove the following statement:

Claim. Let $A\in M_{n}(\mathbb{Z})$, meaning that $A$ is an $n\times n$ matrix with integer entries. Suppose that $\det(A)\neq 0$ and that for every positive integer $k$, there exists $B\in M_{n}(\mathbb{Z})$ such that $B^k = A$. Then $A = I_n$.

Proof. Since $\det(A)\neq 0$, there are only finitely many primes dividing $\det(A)$. Let $p$ be any prime which does not divide $\det(A)$. After reducing mod $p$, we still get a non-singular matrix $\overline{A}$, so $\overline{A}\in\operatorname{GL}_{n}(\mathbb{Z}/p\mathbb{Z})$. Let $k=\# \operatorname{GL}_{n}(\mathbb{Z}/p\mathbb{Z})$ be the cardinality of the group of invertible matrices with entries in $\mathbb{Z}/p\mathbb{Z}$. By hypothesis, there exists $B\in M_{n}(\mathbb{Z})$ such that $B^k=A$. After reducing mod $p$, we get $(\overline{B})^k = \overline{A}$. By Lagrange's theorem, $(\overline{B})^k = \overline{I_{n}}$. Therefore, $\overline{A} = \overline{I_{n}}$, so that each entry of $A-I_{n}$ is divisible by $p$. Since we have infinitely many choices for the prime $p$, it follows that each entry of $A-I_{n}$ must in fact be zero, yielding $A=I_n$.
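To see the Lagrange step in action, here is a small Python sketch (an illustration only, not part of the proof; the helper names are mine). For $p=5$ and $n=2$ it computes $k=\#\operatorname{GL}_2(\mathbb{Z}/5\mathbb{Z})=480$ and checks that a sample matrix with determinant prime to $5$ satisfies $\overline{A}^k = \overline{I_2}$:

```python
# Sanity check of the Lagrange argument for p = 5, n = 2:
# every matrix invertible mod p satisfies M^k = I (mod p),
# where k = #GL_n(Z/pZ) = prod_{i<n} (p^n - p^i).

def mat_mul(X, Y, p):
    n = len(X)
    return [[sum(X[i][t] * Y[t][j] for t in range(n)) % p for j in range(n)]
            for i in range(n)]

def mat_pow(M, e, p):
    # binary exponentiation of a matrix mod p
    n = len(M)
    R = [[int(i == j) for j in range(n)] for i in range(n)]
    while e:
        if e & 1:
            R = mat_mul(R, M, p)
        M = mat_mul(M, M, p)
        e >>= 1
    return R

p, n = 5, 2
k = 1
for i in range(n):
    k *= p**n - p**i          # order of GL_n(Z/pZ); here k = 480

A = [[1, 2], [3, 4]]          # det = -2, not divisible by 5
print(mat_pow(A, k, p))       # -> [[1, 0], [0, 1]]
```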

Remark 1. The hypothesis $\det(A)\neq 0$ is necessary. Otherwise, we can just take any diagonal matrix $A= \operatorname{diag}(1, \dots, 1, 0, \dots, 0)$ with at least one $1$ and at least one $0$ on the diagonal. Certainly $A^k = A$ for every $k\geq 1$, but $A\neq I_{n}$ since there is at least one zero on the diagonal.

Remark 2. It is tempting to modify the above argument for the case of $\mathbb{Q}$, but it doesn't seem to work immediately. Again using the fact that $\det(A)\neq 0$, there are only finitely many primes appearing in the numerator and denominator of $\det(A)$, so we can pick a prime $p$ which doesn't appear there. Reducing the original matrix $A$ mod $p$ still works fine. But the problem arises at the step where we use the hypothesis to get a matrix $B$ such that $B^k = A$ (where $k$ clearly depends on $p$ -- this is important). That matrix $B$ might have the prime $p$ show up in the denominator of some of its entries, so reduction mod $p$ doesn't make sense.

Reference. I learnt the proof above from "Mathematical Bridges" by Andreescu, Mortici and Tetiva. The problem appears as Exercise 18 in Chapter 6, and the solution is presented a few pages later. So I am also adding the (contest-math) tag, since that is the theme of the aforementioned book.


Accepted Answer

The general result is as follows:

Let $A\in M_n(\mathbb{Q})$; then

For every positive integer $k$, there exists $B\in M_n(\mathbb{Q})$ such that $B^k=A$ if and only if $A(A-I_n)^n=0$.

You can see several proofs:

  • In English: https://artofproblemsolving.com/community/c7h42444 , grobber's post.

  • In French: "La revue de mathématiques spéciales" n° 117-2, R535, where, with a colleague, we gave a solution using Mahler's measure.

Second Answer

For convenience, we call a matrix $A$ exceptional if the equation $A=B^k$ has a solution $B\in M_n(\mathbb Q)$ for every $k\geq 1$. Notice that $$ \det(A) = \det(B)^k, $$ so $\det(A)$ is a nonzero rational number with a rational $k$-th root for every $k$. Taking $k$ even shows $\det(A)>0$, and comparing exponents in the prime factorization, this is only possible if $\det(A)=1$, so we assume that from now on.

Overview

Let $T_n(b)$ denote the $n\times n$ matrix whose entries are all zero except for the top-right entry, which equals $b$. Then for each $n\geq 2$ there is an infinite family of exceptional matrices in $M_n(\mathbb Q)$, of the form $$ A = S(I_n+T_n(b))S^{-1} $$ where $S$ can be any invertible matrix in $M_n(\mathbb Q)$. The solutions $B_k$ are of the form $$ B_k = S(I_n + T_n(b/k))S^{-1} $$ This works because $T_n(b/k)^2 = 0$, so the binomial theorem gives $B_k^k = S(I_n + kT_n(b/k))S^{-1} = A$. When $b\neq 0$, we have $A\neq I_n$, hence an infinite family of counterexamples, answering the original question in the negative. We remark that this family is unipotent.
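This family can be verified with exact rational arithmetic. The sketch below (an illustration only; the helper names are mine) takes $S = I_3$ and checks $B_k^k = A$ for several values of $k$:

```python
from fractions import Fraction

# Exact check of the unipotent family for n = 3 and S = I:
# T_n(b) has b in the top-right corner and zeros elsewhere, so T_n(b)^2 = 0
# and (I + T_n(b/k))^k = I + k*T_n(b/k) = I + T_n(b).

def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][t] * Y[t][j] for t in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(M, e):
    n = len(M)
    R = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    for _ in range(e):
        R = mat_mul(R, M)
    return R

def I_plus_T(n, b):
    # identity matrix plus T_n(b): entry b in the top-right corner
    M = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    M[0][n - 1] += b
    return M

n, b = 3, Fraction(7, 3)
A = I_plus_T(n, b)
for k in (1, 2, 5, 12):
    Bk = I_plus_T(n, b / k)
    assert mat_pow(Bk, k) == A   # B_k is an exact k-th root of A
print("verified")
```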

For dimension $n=2$, we give another infinite family of exceptional matrices $$ A = S\begin{pmatrix}2 & 1\\ -1 & 0\end{pmatrix} S^{-1} $$ where $S$ can be any invertible matrix in $M_2(\mathbb Q)$. It has the solutions $$ B_k = S \begin{pmatrix} \frac{k+1}{k} & \frac{1}{k}\\ \frac{-1}{k} & \frac{k-1}{k} \end{pmatrix}S^{-1} $$ We remark that this family is also unipotent.
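As with the first family, this can be checked exactly. A sketch (with $S = I_2$; the helper names are mine) verifying $B_k^k = A$ for small $k$:

```python
from fractions import Fraction

# Exact check that B_k^k = A for the second family, with S = I:
# A = [[2,1],[-1,0]] and B_k = [[(k+1)/k, 1/k], [-1/k, (k-1)/k]].

def mul2(X, Y):
    return [[X[0][0]*Y[0][0] + X[0][1]*Y[1][0], X[0][0]*Y[0][1] + X[0][1]*Y[1][1]],
            [X[1][0]*Y[0][0] + X[1][1]*Y[1][0], X[1][0]*Y[0][1] + X[1][1]*Y[1][1]]]

A = [[Fraction(2), Fraction(1)], [Fraction(-1), Fraction(0)]]
for k in range(1, 20):
    B = [[Fraction(k + 1, k), Fraction(1, k)],
         [Fraction(-1, k), Fraction(k - 1, k)]]
    P = [[Fraction(1), Fraction(0)], [Fraction(0), Fraction(1)]]
    for _ in range(k):
        P = mul2(P, B)
    assert P == A
print("verified for k = 1..19")
```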

In addition, for $n=2$ we prove that the above two are the only types possible. By applying suitable conjugation matrices, we obtain a reduced form and show that it either matches one of the two forms above, or fails to remain in $M_2(\mathbb Q)$ under repeated square roots (equivalently, $A=B^{2^k}$ has no solution for some large enough $k$), or, in one particular case, fails to satisfy $A=B^3$.


A. Reduction of $M_2(\mathbb Q)$ via conjugation

Let $S\in M_n(\mathbb Q)$ with $\det(S)\neq 0$, so that $S$ is invertible and $S^{-1}\in M_n(\mathbb Q)$. Then for all $k\geq 1$, $$ SAS^{-1} = S(B_k)^kS^{-1} = (SB_kS^{-1})^k $$ Therefore $A' = SAS^{-1}$ is also exceptional with solutions $SB_kS^{-1}$ for each $k\geq 1$.

Using the conjugation argument, we will show that any exceptional matrix $A= \begin{pmatrix} a & b \\ c & d \end{pmatrix}\in M_2(\mathbb Q)$ can be transformed into one of two types of (reduced) exceptional matrix: $$ \begin{pmatrix} a & b \\ 0 & 1/a \end{pmatrix}, \begin{pmatrix} a & 1 \\ -1 & 0 \end{pmatrix} $$ for $c=0$ and $c\neq 0$ respectively.

For the former case, if $a=1$ then we have the construction given at the beginning of this answer. For the latter case, if $a=2$ then we have the other construction given at the start.

For all the remaining cases, we show that we cannot take square roots indefinitely, contradicting that $A$ is exceptional. (One particular case fails instead because $A=B^3$ has no solution.)

Therefore any exceptional matrix is (conjugate)-reduced to either $A_1=\begin{pmatrix} 1 & b\\ 0 & 1\end{pmatrix}$ or $A_2=\begin{pmatrix} 2 & 1\\ -1 & 0\end{pmatrix}$. This in turn means that all exceptional matrices are generated via $SA_1S^{-1}$ or $SA_2S^{-1}$.


B. Solving repeated square roots

We want to analyze repeated square roots starting from $A$, which turns out to be enough to classify the matrices for $n=2$. Consider a solution to $$ R^{2^k} = A $$ for some $k\geq 1$ (we will later let $k\rightarrow \infty$). Define $R_j=R^{2^{k-j}}$, so that $$ R_{j}= (R_{j+1})^2 $$ Then we have a series of equations $$ \begin{align*} A &= R_{0}=(R_1)^2\\ R_{1} &= (R_{2})^2\\ R_{j} &= (R_{j+1})^2 \end{align*} $$ Since $\det(A)=1$ and $\det(R_j)^{2^{k-j}} = \det(A)$, each successive square root must have $\det(R_j) = \pm 1$. However, since each of them (except the last, $R_k=R$) is itself a square, we must have $\det(R_j)=1$.

Let $tr(\cdot)$ be the matrix-trace function. The key result we want to derive is that

Proposition. Suppose the system of equations $$ \begin{align*} A &= R_{0}=(R_1)^2\\ R_{1} &= (R_{2})^2\\ R_{j} &= (R_{j+1})^2 \end{align*} $$ has a solution with $\det(R_j)=1$ and $A,R_j\in M_2(\mathbb Q)$ for every $j\geq 0$. Then $tr(R_j)$ must be integral for each $j\geq 0$. Moreover, the $tr(R_j)$'s satisfy the recurrence $$ tr(R_{j+1})^2 = tr(R_j) + 2 $$

Proof. By the Cayley-Hamilton theorem in dimension $n=2$, for any $R_i$ we have $$ (R_i)^2 - tr(R_i)R_i + \det(R_i)I = 0 $$ Therefore a necessary condition for $R_j = (R_{j+1})^2$, together with $\det(R_{j+1})=1$, is $$ \begin{align*} R_j - tr(R_{j+1})R_{j+1} + I &= 0\\ tr(R_{j+1})R_{j+1} &= R_j + I \end{align*} $$ Writing $R_j = \begin{pmatrix} a & b \\ c & d\end{pmatrix}$ and taking determinants on both sides, $$ \begin{align*} tr(R_{j+1})^2\det(R_{j+1}) &= \det\begin{pmatrix} a+1 & b \\ c & d+1\end{pmatrix} \\ tr(R_{j+1})^2 &= (a+1)(d+1) - bc \\ &= a+d+1 + (ad-bc)\\ &= a+d+2 \\ &= tr(R_j)+2 \end{align*} $$ Therefore we have the recurrence relation $$ tr(R_{j+1})^2 = tr(R_j) + 2 $$ Note that $R_j\in M_2(\mathbb Q)$, so $tr(R_j)\in \mathbb Q$.

Suppose that $tr(R_j)\not\in\mathbb Z$ for some $j$. We write $tr(R_j) = u_j/v_j$ for some $u_j,v_j\in \mathbb Z$ with $\gcd(u_j,v_j)=1$; then $v_j\geq 2$. Now $$ tr(R_{j+1})^2 = tr(R_j)+2 = \frac{u_j+2v_j}{v_j} $$ Since $\gcd(u_j+2v_j,v_j) = \gcd(u_j,v_j) = 1$, no further cancellation is possible and we must have $$ tr(R_{j+1}) = \pm\frac{\sqrt{u_j+2v_j}}{\sqrt{v_j}} = \frac{u_{j+1}}{v_{j+1}} $$ If $\sqrt{v_j} \not \in \mathbb Z$, then $tr(R_{j+1})\not\in\mathbb Q$, contradicting $R_{j+1}\in M_2(\mathbb Q)$. So $v_{j+1} = \sqrt{v_j}$ must be an integer. Likewise, we must also have $u_{j+1} = \pm \sqrt{u_j+2v_j} \in\mathbb Z$.

Since we simply took a square root of a fraction in lowest terms, it must be the case that $\gcd(u_{j+1},v_{j+1}) = 1$, so we can reapply the same argument to $tr(R_{j+2}) = u_{j+2}/v_{j+2}$, i.e. $$u_{j+2} = \pm \sqrt{u_{j+1} + 2v_{j+1}} \in \mathbb Z, \quad v_{j+2} = \sqrt{v_{j+1}} \in \mathbb Z$$

Hence by induction the denominator is square-rooted at every iteration, so $v_{j+m} = v_j^{1/2^m}$. If $tr(R_j)\not \in \mathbb Z$ for some $j$, then $v_j\geq 2$ and this iterated square-root process must eventually produce an irrational denominator, i.e. $tr(R_m) \not\in \mathbb Q$ for some large enough $m>j$. This contradicts $R_m\in M_2(\mathbb Q)$; therefore it must have been the case that $tr(R_j)\in \mathbb Z$ for all $j\geq 0$. $$\tag*{$\square$}$$
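The trace recurrence can be spot-checked with exact rational arithmetic. The following sketch (an illustration only; the helper names are mine) squares random determinant-$1$ matrices in $M_2(\mathbb Q)$ and confirms the equivalent identity $tr(R^2) = tr(R)^2 - 2$:

```python
from fractions import Fraction
import random

# For any 2x2 matrix R with det(R) = 1, Cayley-Hamilton gives
# tr(R^2) = tr(R)^2 - 2; writing R_j = R_{j+1}^2 this is exactly
# tr(R_{j+1})^2 = tr(R_j) + 2.

def trace2(M):
    return M[0][0] + M[1][1]

def square2(M):
    a, b, c, d = M[0][0], M[0][1], M[1][0], M[1][1]
    return [[a*a + b*c, b*(a + d)], [c*(a + d), b*c + d*d]]

random.seed(0)
for _ in range(100):
    a = Fraction(random.randint(-9, 9), random.randint(1, 9))
    b = Fraction(random.randint(-9, 9), random.randint(1, 9))
    c = Fraction(random.randint(-9, 9), random.randint(1, 9))
    if a == 0:
        continue
    d = (1 + b * c) / a          # force det = ad - bc = 1
    R = [[a, b], [c, d]]
    assert trace2(square2(R)) == trace2(R) ** 2 - 2
print("recurrence verified")
```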


Let $A=\begin{pmatrix} a & b \\ c & d\end{pmatrix}$ be exceptional. We want to classify $A$ using the conjugation and repeated square-root arguments established earlier.

C1. Case $n=2$ and $c=0$

First, we look at the case $c=0$. Since $\det(A)=1$, we get $d=1/a$, giving us
$$ A= \begin{pmatrix} a & b \\ 0 & 1/a \end{pmatrix} $$ Since $A$ is exceptional, we can take square roots indefinitely. By the Proposition in B., we must have $tr(A) \in \mathbb Z$, say $tr(A) = t\in\mathbb Z$.

Write $a = u/v$ with $u,v\in\mathbb Z$ and $\gcd(u,v)=1$. Then $$ \begin{align*} \frac{u^2+v^2}{uv} &= a + \frac{1}{a} = t\in \mathbb Z\\ u^2+v^2 &= tuv \end{align*} $$ This shows that $u\mid v^2$ and $v\mid u^2$; since $\gcd(u,v)=1$, this is only possible if $u,v = \pm 1$. Therefore $a=1$ or $a= -1$. If $a=-1$, then $tr(A) = -2$. Taking the first square root as $R_1$, the formula gives $$ tr(R_1)^2 = tr(A) +2 = 0 \implies tr(R_1) = 0 $$ But now for the second square root $R_2$ we have $$ tr(R_2)^2 = tr(R_1)+2 = 2 \implies tr(R_2) = \pm \sqrt{2}, $$ contradicting $tr(R_2)\in\mathbb Q$. Therefore the only possibility is $a=1$. We have seen that this results in the family of solutions given at the start of this answer.


C2. Case $n=2$ and $c\neq 0$

Choosing the conjugation matrices $$ S= \begin{pmatrix} 0 & 1 \\ c & -a \end{pmatrix}, S^{-1} = \begin{pmatrix} a/c & 1/c \\ 1 & 0 \end{pmatrix}, $$ we get a new exceptional matrix $A'$ $$ A'= SAS^{-1} = \begin{pmatrix} a+d & 1 \\ bc-ad & 0 \end{pmatrix} = \begin{pmatrix} a+d & 1 \\ -1 & 0 \end{pmatrix} $$ where the last equality is due to the condition $\det(A)=1$.
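The conjugation identity $SAS^{-1} = \begin{pmatrix} a+d & 1\\ -1 & 0\end{pmatrix}$ can be spot-checked exactly. A sketch (helper names mine) over random rational matrices with $\det(A)=1$ and $c\neq 0$:

```python
from fractions import Fraction
import random

# Check that for det(A) = 1 and c != 0, with S = [[0,1],[c,-a]] and
# S^{-1} = [[a/c, 1/c],[1, 0]], we get S A S^{-1} = [[a+d, 1],[-1, 0]].

def mul2(X, Y):
    return [[X[0][0]*Y[0][0] + X[0][1]*Y[1][0], X[0][0]*Y[0][1] + X[0][1]*Y[1][1]],
            [X[1][0]*Y[0][0] + X[1][1]*Y[1][0], X[1][0]*Y[0][1] + X[1][1]*Y[1][1]]]

random.seed(1)
for _ in range(100):
    a = Fraction(random.randint(-9, 9), random.randint(1, 9))
    b = Fraction(random.randint(-9, 9), random.randint(1, 9))
    c = Fraction(random.randint(1, 9), random.randint(1, 9))   # c != 0
    if a == 0:
        continue
    d = (1 + b * c) / a          # force det = ad - bc = 1
    A = [[a, b], [c, d]]
    S = [[Fraction(0), Fraction(1)], [c, -a]]
    Sinv = [[a / c, 1 / c], [Fraction(1), Fraction(0)]]
    assert mul2(mul2(S, A), Sinv) == [[a + d, Fraction(1)],
                                      [Fraction(-1), Fraction(0)]]
print("conjugation verified")
```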

Denote $r_0 = tr(A) = a+d$ and similarly let $r_j = tr(R_j)$, where the $R_j$'s are the matrices obtained by repeated square roots as in B. By the Proposition again, we must have $r_j\in\mathbb Z$, satisfying the recurrence $$ r_{j+1}^2 = r_j + 2 \implies r_{j+1} = \pm \sqrt{r_j + 2} $$ Since $r_{j+1}^2 = r_j + 2 \geq 0$, clearly $r_j\geq -2$ for each $j$.

For $r_0=2$, corresponding to $A=\begin{pmatrix} 2 & 1\\ -1 & 0\end{pmatrix}$, we have seen at the start of the answer that it is an exceptional matrix.

For $r_0 = -2,0,1$, we get $r_1 = 0, \pm \sqrt{2}, \pm \sqrt{3}$ respectively, and $r_1=0$ in turn gives $r_2 = \pm \sqrt{2}$; so each of these cases contradicts $r_j\in\mathbb Z$.

For $r_0 > 2$, we have $r_1^2 = r_0 + 2 > 4$, and since $r_j \geq -2$ for all $j$ we must have $r_1 > 2$ again; by induction $r_j > 2$ for all $j$. Being integers, the $r_j$ satisfy $r_j \geq 3$, and then $r_j + 2 < r_j^2$ gives $$ r_{j+1} = \sqrt{r_j + 2} < r_j $$ So the $r_j$ would form a strictly decreasing sequence of integers bounded below by $3$, which is impossible.
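The descent can be watched numerically; a small sketch (an illustration only) starting from $r_0 = 7$ shows the trace sequence leaving the integers:

```python
import math

# Trace sequence r_{j+1} = sqrt(r_j + 2) starting from r_0 = 7:
# it is strictly decreasing while r_j > 2, so some iterate must
# fail to be an integer (here sqrt(3 + 2) = sqrt(5) is irrational).

r = 7
while True:
    s = math.isqrt(r + 2)
    if s * s != r + 2:
        print(f"tr leaves the integers: r = {r}, sqrt(r + 2) is irrational")
        break
    r = s
    print("next trace:", r)
```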

Since $r_0\in\mathbb Z$, this leaves the final case of $r_0 = -1$. For this case we use a numeric solver to show directly that there are no solutions to $$ B^3 = \begin{pmatrix} -1 & 1\\ -1 & 0 \end{pmatrix} $$ Edit 1: Algebraic proof
Let $B$ be a solution to $A=B^3$ and let $t=tr(B)$. Then $$\det(B)^3 = \det(A) = 1$$ Since $\det(B)\in\mathbb Q$, we conclude $\det(B)=1$. By Cayley-Hamilton: $$ \begin{align*} B^2-tB+I &= 0\\ B^3-tB^2+B &= 0\\ A-t(tB-I)+B &= 0\\ (t^2-1)B &= A+tI = \begin{pmatrix} -1+t & 1\\ -1 & t\end{pmatrix}\\ (t^2-1)^2 \det(B) &= \det(A+tI) = (t-1)t + 1\\ t^4-2t^2+1 &= t^2-t+1\\ t(t^3-3t+ 1) &= 0 \end{align*} $$ Since the cubic $t^3-3t+1$ has no roots $\pmod 2$, it is irreducible mod $2$, hence irreducible over $\mathbb Q$; in particular it has no rational root. Since $t\in \mathbb Q$, the only possibility is $t=0$. Therefore $-B = A\implies B = -A$. However, $tr(A)=-1$ and $\det(A)=1$, so Cayley-Hamilton gives $A^2+A+I=0$, hence $A^3=I$ and $B^3 = -A^3 = -I \neq A$. Therefore there cannot be a solution $B$.
End edit 1
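The final manual check is easy to reproduce; a sketch (helper name mine) confirming $A^3 = I$ and $(-A)^3 = -I \neq A$:

```python
# A = [[-1,1],[-1,0]] has tr = -1, det = 1, so A^2 + A + I = 0 and A^3 = I;
# hence B = -A gives B^3 = -I, which is not equal to A.

def mul2(X, Y):
    return [[X[0][0]*Y[0][0] + X[0][1]*Y[1][0], X[0][0]*Y[0][1] + X[0][1]*Y[1][1]],
            [X[1][0]*Y[0][0] + X[1][1]*Y[1][0], X[1][0]*Y[0][1] + X[1][1]*Y[1][1]]]

A = [[-1, 1], [-1, 0]]
A3 = mul2(mul2(A, A), A)
print(A3)                       # -> [[1, 0], [0, 1]], i.e. A^3 = I
B = [[1, -1], [1, 0]]           # B = -A
B3 = mul2(mul2(B, B), B)
print(B3)                       # -> [[-1, 0], [0, -1]] = -I, not equal to A
```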

Hence, going through all integer possibilities of $r_0=a+d$, we see that only $r_0=2$ results in an exceptional matrix.

This completes all the cases.