When does a matrix have an "invariant quadratic form"?


Yesterday I computed that the matrix

$$ A = \begin{pmatrix} 2&1\\1&1\end{pmatrix}$$

satisfies $q(m,n) = q \left((m,n)A\right)$ for the quadratic form

$$q(m,n) = m^2 - mn - n^2.$$

E.g., $-1 = q(1,1) = q(3,2) = q(8,5) =\ \ldots\ $ which is quite satisfying.
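This invariance is easy to confirm by iterating $A$ on $(1,1)$; a small Python sketch (the helper names `q` and `apply_A` are mine):

```python
# Verify that q(m, n) = m^2 - m*n - n^2 is invariant under (m, n) -> (m, n) A
# for A = [[2, 1], [1, 1]]; iterating A from (1, 1) walks through the pairs
# (1, 1), (3, 2), (8, 5), ... of the example.

def q(m, n):
    return m * m - m * n - n * n

def apply_A(m, n):
    # (m, n) A with A = [[2, 1], [1, 1]] (row vector times matrix)
    return 2 * m + n, m + n

pair = (1, 1)
values = []
for _ in range(6):
    values.append(q(*pair))
    pair = apply_A(*pair)

print(values)  # every iterate gives q = -1
```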

On the other hand, the matrix

$$B = \begin{pmatrix} 1&1\\1&0 \end{pmatrix}$$

fixes no such quadratic form, although it does preserve $(m,n)\mapsto (q(m,n))^2$ (since $B^2 = A$ this is maybe unsurprising).
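In fact a direct computation shows $q\left((m,n)B\right) = -q(m,n)$, which is why $q^2$ survives while $q$ does not; a sketch (helper names mine):

```python
# Check that B = [[1, 1], [1, 0]] flips the sign of q(m, n) = m^2 - m*n - n^2,
# so q itself is not preserved but q^2 is (consistent with B^2 = A).

def q(m, n):
    return m * m - m * n - n * n

def apply_B(m, n):
    # (m, n) B with B = [[1, 1], [1, 0]]
    return m + n, m

for m in range(-3, 4):
    for n in range(-3, 4):
        assert q(*apply_B(m, n)) == -q(m, n)

# B^2 = A: applying B twice matches (m, n) -> (2m + n, m + n)
assert apply_B(*apply_B(5, 7)) == (2 * 5 + 7, 5 + 7)
```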

My question. Is it known when a square matrix with integer entries preserves a non-trivial quadratic form? Moreover, when does such a quadratic form have integer coefficients?

It seems easy to verify for individual examples, but is there a general theory?

There are 3 answers below.

Best answer.

Thanks for these answers. Prompted by the $2\times2$ and $3\times3$ cases, the following became apparent.

Proposition.

$A \in \mathbb R^{n,n}$ fixes a non-trivial quadratic form if and only if

$$ 1 \in \left(\sigma(A)\right)^2 := \left\{\lambda \mu\,|\,\lambda,\mu \in \sigma(A)\right\}, $$

where $\sigma(A) = \{\lambda \in \mathbb C\,|\,\lambda\text{ is an eigenvalue of }A\}$.

I.e., either $\pm1$ is an eigenvalue of $A$, or there is some $\lambda\neq 0$ such that $\lambda$ and $\lambda^{-1}$ are both eigenvalues of $A$.
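The criterion is easy to test numerically on the two matrices from the question; a sketch (assuming `numpy`; the function name is mine, and the tolerance is ad hoc):

```python
# Test the proposition's criterion 1 in sigma(A)*sigma(A) numerically.
import numpy as np

def fixes_a_quadratic_form(A, tol=1e-9):
    eig = np.linalg.eigvals(A)
    return any(abs(lam * mu - 1) < tol for lam in eig for mu in eig)

A = np.array([[2.0, 1.0], [1.0, 1.0]])   # eigenvalues (3 +/- sqrt(5))/2, product 1
B = np.array([[1.0, 1.0], [1.0, 0.0]])   # eigenvalues phi and -1/phi

print(fixes_a_quadratic_form(A))  # True: det(A) = 1, so the eigenvalues are reciprocal
print(fixes_a_quadratic_form(B))  # False: no pair of eigenvalues multiplies to 1
```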

Proof.

($1$): First, if $\lambda,\mu \in \sigma(A)$ satisfy $\lambda\mu=1$, then (letting $A^Tv = \lambda v$, $A^Tw = \mu w$) $A$ preserves the quadratic form given by the matrix $Q = vw^T \in \mathbb R^{n,n}$ (with respect to the standard basis on $\mathbb R^n$): $$ A^TQA = (A^Tv)(A^Tw)^T = \lambda\mu\, vw^T = vw^T = Q. $$ (Here we treat real $\lambda,\mu$; for a complex pair one can instead take the real or imaginary part of $vw^T$, which is likewise preserved.) Moreover, $Q$ cannot be skew-symmetric, so it defines a non-trivial quadratic form. Indeed, the $(i,j)$th entry $Q_{i,j} = v_iw_j$ is non-zero for at least one pair $i,j$. If $i=j$ we're done, so assume there exists $i\neq j$ with $v_iw_j\neq0$, and suppose that $v_jw_i + v_iw_j = 0$ (skew-symmetry) and that the diagonal entries $v_iw_i$ and $v_jw_j$ both vanish. Applying the following lemma with $(a,b,c,d) = (v_i,w_j,w_i,v_j)$ then forces $v_iw_j = 0$, a contradiction; so either $v_iw_i\neq 0$ or $v_jw_j\neq 0$, and we have a non-zero diagonal entry.
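This construction can be carried out explicitly for the question's $A$; a numerical sketch (assuming `numpy`; variable names mine):

```python
# Build Q = v w^T from eigenvectors of A^T whose eigenvalues multiply to 1,
# and check A^T Q A = Q, for the question's A = [[2, 1], [1, 1]].
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 1.0]])
eigvals, eigvecs = np.linalg.eig(A.T)

# find a pair (i, j) with eigvals[i] * eigvals[j] = 1 (here det(A) = 1, so one exists)
pairs = [(i, j) for i in range(2) for j in range(2)
         if abs(eigvals[i] * eigvals[j] - 1) < 1e-9]
i, j = pairs[0]
v = eigvecs[:, i:i + 1]   # column vector with A^T v = lambda v
w = eigvecs[:, j:j + 1]   # column vector with A^T w = mu w
Q = v @ w.T

assert np.allclose(A.T @ Q @ A, Q)     # Q is preserved
assert not np.allclose(Q + Q.T, 0)     # symmetric part is non-zero, so the form is non-trivial
```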


Lemma. If $(a,b,c,d)\in\mathbb R^4$ satisfy $$ ab + cd = 0, \quad ac = 0,\quad bd =0, $$ then $ab = cd = 0$.

Proof of lemma:

From $ac = 0$, either $a = 0$ or $c = 0$; from $bd = 0$, either $b = 0$ or $d = 0$. In every case one of the products $ab$, $cd$ vanishes, and then $ab + cd = 0$ forces the other to vanish as well. $\square$


($2$) Conversely, suppose that $Q\in \mathbb R^{n,n}$ is symmetric and non-zero, and that $A$ preserves the quadratic form associated to $Q$. Then $$A^TQA = Q$$ (this must hold since the LHS is also symmetric, and a symmetric matrix is determined by its quadratic form).

Enumerating the entries on and above the diagonal of $Q$ as $(q_1,\ldots,q_{N})$, where $N = n(n+1)/2$, the previous equation is equivalent to the following: $$ B\begin{pmatrix}q_1\\\vdots\\q_N\end{pmatrix} = \begin{pmatrix}0\\\vdots\\0\end{pmatrix}, $$ where $B$ is an $N\times N$ matrix whose entries are quadratic expressions in the entries $(A_{i,j})_{i,j}$ of $A$. In this light, a non-trivial $Q$ exists if and only if there is a non-zero solution vector $(q_1,\ldots,q_N)$, i.e. $B$ is singular: $$\det(B) = 0.$$ This is a polynomial of degree $2N$ in the $A_{i,j}$.

Now consider the following equation. Let $\lambda_1,\ldots, \lambda_n$ denote the eigenvalues of $A$. The existence of a pair of eigenvalues which multiply to one is equivalent to $$ \prod_{1\leq i\leq j\leq n }\left(\lambda_i\lambda_j - 1\right) = 0. $$ The left hand side is a symmetric polynomial of degree $2N$ in $(\lambda_1,\ldots, \lambda_n)$; therefore, by the fundamental theorem of symmetric polynomials, it can be expressed as a polynomial in the elementary symmetric functions $$ \left(1,\ \sum_i \lambda_i, \ \sum_{i<j}\lambda_i\lambda_j,\ \sum_{i<j<k}\lambda_i\lambda_j\lambda_k,\ \ldots\ , \ \lambda_1\cdots\lambda_n\right). $$ Magically, these all appear (up to sign) as coefficients of the characteristic polynomial of $A$. Consequently, each polynomial in this list can be replaced by a homogeneous polynomial of the same degree in the $A_{i,j}$.

Therefore the above degree-$2N$ polynomial in the $\lambda_i$ can be written as a (monstrous) degree-$2N$ polynomial in the entries $A_{i,j}$, and this polynomial must be a factor of $\det(B)$, since by the first part it is sufficient to have some $\lambda_i\lambda_j=1$ for $\det(B)=0$. But then the two polynomials have the same degree ($2N$), so $\det(B)$ cannot have any other non-constant factors. Thus the eigenvalue condition is in fact sufficient. $\square$
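The factorisation can be probed numerically: viewing $Q \mapsto A^TQA - Q$ as a linear map on symmetric matrices, its eigenvalues are the $\lambda_i\lambda_j - 1$, which suggests $\det(B)$ equals $\prod_{i\le j}(\lambda_i\lambda_j - 1)$ on the nose. A sketch (assuming `numpy`; the helper `system_matrix` and the random test matrix are mine):

```python
# Build the N x N matrix B of the linear system A^T Q A - Q = 0 in the
# n(n+1)/2 on-and-above-diagonal entries of symmetric Q, and compare det(B)
# with prod_{i <= j} (lambda_i lambda_j - 1) for a random integer matrix.
import numpy as np

def system_matrix(A):
    n = A.shape[0]
    idx = [(i, j) for i in range(n) for j in range(i, n)]
    N = len(idx)
    B = np.zeros((N, N))
    for col, (i, j) in enumerate(idx):
        # symmetric basis matrix with 1 at (i, j) and (j, i)
        E = np.zeros((n, n))
        E[i, j] = E[j, i] = 1.0
        img = A.T @ E @ A - E            # image under the linear map
        for row, (k, l) in enumerate(idx):
            B[row, col] = img[k, l]      # coordinates in the same basis
    return B

rng = np.random.default_rng(0)
A = rng.integers(-3, 4, size=(3, 3)).astype(float)
lam = np.linalg.eigvals(A)
prod = np.prod([lam[i] * lam[j] - 1 for i in range(3) for j in range(i, 3)])
detB = np.linalg.det(system_matrix(A))
assert np.isclose(detB, prod.real, rtol=1e-6, atol=1e-6)
```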

Remark.

Note that the quadratic forms which are fixed by $A$ in fact form a vector space.

Extension.

From the above, my guess is that $A$ fixes a non-degenerate quadratic form if and only if $\{\lambda \in \sigma(A)\,|\,1 \in \lambda\cdot\sigma(A)\} = \sigma(A)$, i.e. all eigenvalues are involved. Equivalently, whenever $\lambda$ is an eigenvalue of $A$, $\lambda \neq 0$ and $1/\lambda$ is also an eigenvalue of $A$. Moreover, its Jordan normal form should be a diagonal matrix. Is this sufficient? I'm not sure.

Answer.

I'll try to give a partial answer: when $A$ is symmetric, a nondegenerate form $q$ with integer coefficients can be constructed if and only if $A$ has determinant $1$, or $A$ is an involution with eigenvalues $1$ and $-1$.

Let $Q$ be the matrix defining your quadratic form, in your example

$$Q = \begin{pmatrix} 1&-1/2\\-1/2&-1\end{pmatrix},$$

meaning that

$$q(v) = v^T Q v$$

if we write $v = \begin{pmatrix} m\\n\end{pmatrix}$.

This matrix will always be symmetric, and the quadratic form has integer coefficients precisely when the diagonal entries are integers and the off-diagonal entries are integer multiples of $1/2$.

In this notation, your question comes down to: given a matrix $A$ over the integers, can I find a matrix $Q$ such that the associated binary form $q$ has integer coefficients, and $q(Av) = q(v)$, which is the case when $A^TQA = Q$.

Note that this is a linear system in the coefficients of $Q$; hence, if it has solutions at all, it has rational solutions, which can be scaled to give a form with integer coefficients.
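For the $2\times 2$ example this linear system is small enough to do exactly; a sketch with exact rational arithmetic (the reduction to $p = -2r$, $s = 2r$ is computed by hand, and the helper names are mine):

```python
# For A = [[2, 1], [1, 1]], writing Q = [[p, r], [r, s]], the system
# A^T Q A = Q reduces to p = -2r, s = 2r: a one-parameter family of
# rational solutions.  Scaling by r = -1/2 gives integer coefficients.
from fractions import Fraction

r = Fraction(-1, 2)
Q = [[-2 * r, r], [r, 2 * r]]        # = [[1, -1/2], [-1/2, -1]]

def qform(Q, m, n):
    return Q[0][0] * m * m + 2 * Q[0][1] * m * n + Q[1][1] * n * n

# invariance under v -> A v, i.e. (m, n) -> (2m + n, m + n)
for m in range(-3, 4):
    for n in range(-3, 4):
        assert qform(Q, 2 * m + n, m + n) == qform(Q, m, n)

coeffs = (int(Q[0][0]), int(2 * Q[0][1]), int(Q[1][1]))
print(coeffs)  # (1, -1, -1): the form q(m, n) = m^2 - mn - n^2
```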

By taking determinants, you immediately see that $A$ must have determinant $\pm 1$, except possibly when $q$ is degenerate. Let's assume it is non-degenerate, so $Q$ is invertible, and we have to find out when it can hold that

$$QA^{-1}Q^{-1} = A^T$$

This says that $A^{-1}$ and $A^T$ are similar, which is a strong condition.

Let's first consider the case that $A$ is symmetric, like in your examples. Then $A$ is diagonalizable by an (orthogonal) matrix $S$: $A = S^{-1}DS = A^T$ for a diagonal matrix $D$, and we have

$$A^{-1} = S^{-1}D^{-1}S, \qquad A = A^T = S^{-1}DS.$$

The eigenvalues are of the form $\lambda$ and $\mu = \pm\lambda^{-1}$ (because the determinant is $\pm 1$). $A^T$ has the same eigenvalues, and $A^{-1}$ has eigenvalues $\lambda^{-1}$ and $\mu^{-1} = \pm\lambda$, again the same eigenvalues, possibly up to a sign.

Consider the case that $\det(A) = 1$. In this case either $D^{-1} = D$, or $D^{-1} = P^{-1}DP$, where $P$ is the permutation matrix

$$P = \begin{pmatrix} 0&1\\1&0\end{pmatrix}.$$

In the first case $A = \pm I$ and preserves every form; in the second we can take $Q$ to be a suitable multiple of $S^{-1}PS$.
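For the question's $A$ this recipe can be carried out numerically; a sketch (assuming `numpy`, with `eigh` supplying the orthogonal diagonalisation, so that $S = V^T$ in the notation above):

```python
# Construct Q = S^{-1} P S for a symmetric A with det(A) = 1, using the
# question's A = [[2, 1], [1, 1]], and check that A^T Q A = Q.
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 1.0]])
eigvals, V = np.linalg.eigh(A)       # A = V @ diag(eigvals) @ V.T with V orthogonal
P = np.array([[0.0, 1.0], [1.0, 0.0]])
Q = V @ P @ V.T                      # = S^{-1} P S with S = V.T

assert np.isclose(np.linalg.det(A), 1.0)
assert np.allclose(A.T @ Q @ A, Q)
```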

When the determinant is $-1$, the eigenvalues of $A$ and $A^{-1}$ being equal means that $\{\lambda, -\lambda^{-1}\}$ has to be equal to $\{\lambda^{-1}, -\lambda\}$. This is only possible when $\lambda = \pm 1$; in either case the eigenvalues of $A$ are $1$ and $-1$, so $D^{-1} = D$, $A^{-1} = A$, and the characteristic polynomial of $A$ is $X^2 - 1$ (so $A^2 = I$ by Cayley-Hamilton).

That means that $A$ must be traceless, hence of the form

$$A = \begin{pmatrix} a&b\\b&-a\end{pmatrix}$$

with $a^2 + b^2 = 1$. Over the integers this leaves, up to sign,

$$A = \begin{pmatrix} 1&0\\0&-1\end{pmatrix}$$

and

$$A = \begin{pmatrix} 0&1\\1&0\end{pmatrix}$$

as the only possibilities, and they should preserve the binary form given by the identity matrix, $q(m,n) = m^2 + n^2$, which indeed they do.
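A quick check of both candidates against the identity Gram matrix (a sketch, assuming `numpy`):

```python
# Check that the two determinant -1 candidates preserve q(m, n) = m^2 + n^2.
import numpy as np

Q = np.eye(2)
for A in (np.array([[1, 0], [0, -1]]), np.array([[0, 1], [1, 0]])):
    assert round(np.linalg.det(A)) == -1
    assert np.array_equal(A.T @ Q @ A, Q)
```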

I'll edit later if I can think of a way to approach the general case.

Answer.

There is something we can work out for determinant $-1$. Namely, a $2\times 2$ integer matrix with determinant $-1$ is an automorphism of some binary quadratic form if and only if it has trace zero.

Given integers $r,s,t$ with $r^2 + st = 1$ but $rst \neq 0$, we get an automorphism of $$ f(x,y) = x^2 + rt\,xy + t^2 y^2 $$ via the matrix identity $$ \left( \begin{array}{cc} r & s \\ t & -r \\ \end{array} \right) \left( \begin{array}{cc} 2 & rt \\ rt & 2 t^2 \\ \end{array} \right) \left( \begin{array}{cc} r & t \\ s & -r \\ \end{array} \right) = \left( \begin{array}{cc} 2 & rt \\ rt & 2 t^2 \\ \end{array} \right). $$

There are other choices: once $r^2 + st = 1$, the remaining condition on $Ax^2 + Bxy + Cy^2$ is just $$ At = Br + Cs. $$
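The identity and the final condition can be swept over small integer triples; a Python sketch (the helper `mat_mul` and the search bounds are mine):

```python
# For integers r, s, t with r^2 + s*t = 1 and r*s*t != 0, check that
# M = [[r, s], [t, -r]] satisfies M G M^T = G for G = [[2, r*t], [r*t, 2*t*t]],
# and that (A, B, C) = (1, r*t, t^2) satisfies A*t = B*r + C*s.

def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

checked = 0
for r in range(-4, 5):
    for s in range(-4, 5):
        for t in range(-4, 5):
            if r * s * t == 0 or r * r + s * t != 1:
                continue
            M = [[r, s], [t, -r]]
            Mt = [[r, t], [s, -r]]
            G = [[2, r * t], [r * t, 2 * t * t]]
            assert mat_mul(mat_mul(M, G), Mt) == G
            # remaining condition A*t = B*r + C*s for f = A x^2 + B xy + C y^2
            A, B, C = 1, r * t, t * t
            assert A * t == B * r + C * s
            checked += 1

print(checked)  # several triples (r, s, t) with r*s*t != 0 were verified
```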