Yesterday I computed that the matrix
$$ A = \begin{pmatrix} 2&1\\1&1\end{pmatrix}$$
satisfies $q(m,n) = q \left((m,n)A\right)$ for the quadratic form
$$q(m,n) = m^2 - mn - n^2.$$
E.g., $-1 = q(1,1) = q(3,2) = q(8,5) =\ \ldots\ $ which is quite satisfying.
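This invariance is easy to check in a few lines of plain Python (exact integer arithmetic; the function names `q` and `apply_A` are mine):

```python
def q(m, n):
    """The quadratic form q(m, n) = m^2 - mn - n^2."""
    return m * m - m * n - n * n

def apply_A(m, n):
    """Right-multiply the row vector (m, n) by A = [[2, 1], [1, 1]]."""
    return 2 * m + n, m + n

# Iterating A on (1, 1) runs through (3, 2), (8, 5), (21, 13), ...
# with q constant at -1.
v = (1, 1)
for _ in range(10):
    assert q(*v) == -1
    v = apply_A(*v)
```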
On the other hand, the matrix
$$B = \begin{pmatrix} 1&1\\1&0 \end{pmatrix}$$
fixes no such quadratic form, although it does preserve $(m,n)\mapsto (q(m,n))^2$ (perhaps unsurprising, since $B^2 = A$).
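A quick check, which also reveals why the square survives: $B$ sends $q$ to $-q$, so it preserves $q^2$ but fixes no non-zero multiple of $q$ (helper names are mine):

```python
def q(m, n):
    return m * m - m * n - n * n

def apply_A(m, n):
    return 2 * m + n, m + n

def apply_B(m, n):
    """Right-multiply the row vector (m, n) by B = [[1, 1], [1, 0]]."""
    return m + n, m

for m, n in [(1, 0), (0, 1), (2, 3), (5, -4)]:
    # B negates q, hence preserves q^2.
    assert q(*apply_B(m, n)) == -q(m, n)
    # Applying B twice agrees with applying A once, i.e. B^2 = A.
    assert apply_B(*apply_B(m, n)) == apply_A(m, n)
```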
My question. Is it known when a square matrix with integer entries preserves a non-trivial quadratic form? Moreover, when does such a quadratic form have integer coefficients?
It seems easy to verify for individual examples, but is there a general theory?
Thanks for these answers. Prompted by the $2\times2$ and $3\times3$ cases, the following seems to hold.
Proposition.
$A \in \mathbb R^{n,n}$ fixes a non-trivial quadratic form if and only if
$$ 1 \in \left(\sigma(A)\right)^2 := \left\{\lambda \mu\,|\,\lambda,\mu \in \sigma(A)\right\}, $$
where $\sigma(A) = \{\lambda \in \mathbb C\,|\,\lambda\text{ is an eigenvalue of }A\}$.
I.e., either $\pm1$ is an eigenvalue of $A$, or there is some $\lambda\neq 0$ such that $\lambda$ and $\lambda^{-1}$ are both eigenvalues of $A$.
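Before the proof, the criterion is easy to test numerically, e.g. with NumPy; the following sketch (function name and tolerance are mine) checks whether some pair of eigenvalues, repeats allowed, multiplies to $1$:

```python
import numpy as np

def fixes_nontrivial_form(A, tol=1e-9):
    """Check whether 1 lies in (sigma(A))^2, i.e. whether some pair of
    eigenvalues of A (not necessarily distinct) multiplies to 1."""
    eig = np.linalg.eigvals(np.asarray(A, dtype=float))
    return any(abs(l * m - 1) < tol for l in eig for m in eig)

# [[2, 1], [1, 1]] has determinant 1, so its two eigenvalues multiply to 1;
# [[1, 1], [1, 0]] has eigenvalues phi and -1/phi, and no pair multiplies to 1.
assert fixes_nontrivial_form([[2, 1], [1, 1]])
assert not fixes_nontrivial_form([[1, 1], [1, 0]])
```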
Proof.
($1$): First, if $\lambda,\mu \in \sigma(A)$ satisfy $\lambda\mu=1$, then (letting $A^Tv = \lambda v$, $A^Tw = \mu w$) $A$ preserves the quadratic form given by the matrix $Q = vw^T \in \mathbb R^{n,n}$ (with respect to the standard basis on $\mathbb R^n$): $$ A^TQA = (A^Tv)(A^Tw)^T = \lambda\mu\, vw^T = vw^T = Q. $$ Moreover, $Q$ cannot be skew-symmetric, so it defines a non-trivial quadratic form. Indeed, the $(i,j)$th entry $Q_{i,j} = v_iw_j$ is non-zero for at least one pair $i,j$. If $i=j$ then we're done, so assume that there exists $i\neq j$ with $v_iw_j\neq0$, and suppose for contradiction that $Q$ is skew-symmetric, so that $v_jw_i + v_iw_j = 0$. Then $v_jw_i = -v_iw_j \neq 0$, hence all four of $v_i, v_j, w_i, w_j$ are non-zero; in particular the diagonal entry $v_iw_i$ is non-zero, contradicting skew-symmetry.
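For the matrix $A$ from the question, this construction can be carried out numerically; the sketch below (NumPy, variable names mine) builds $Q = vw^T$ from eigenvectors of $A^T$ and checks that the symmetric part of $Q$ recovers a multiple of $q$:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 1.0]])

# Eigenvectors of A^T for eigenvalues lambda, mu with lambda * mu = det(A) = 1.
eigvals, eigvecs = np.linalg.eig(A.T)
v, w = eigvecs[:, 0], eigvecs[:, 1]
assert abs(eigvals[0] * eigvals[1] - 1) < 1e-9

# Q = v w^T satisfies A^T Q A = lambda * mu * v w^T = Q.
Q = np.outer(v, w)
assert np.allclose(A.T @ Q @ A, Q)

# The symmetric part S of Q is again invariant, and is proportional to
# [[1, -1/2], [-1/2, -1]], the matrix of q(m, n) = m^2 - mn - n^2.
S = (Q + Q.T) / 2
assert np.allclose(A.T @ S @ A, S)
assert np.allclose(S / S[0, 0], [[1.0, -0.5], [-0.5, -1.0]])
```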
($2$) Conversely, suppose that $Q\in \mathbb R^{n,n}$ is symmetric and non-zero, and that $A$ preserves the quadratic form associated to $Q$. Then $$A^TQA = Q$$ as matrices (this must hold since the left-hand side is also symmetric, and two symmetric matrices inducing the same quadratic form are equal).
Enumerating the entries on and above the diagonal of $Q$ by $(q_1,\ldots,q_{N})$, where $N = n(n+1)/2$, the previous equation is equivalent to the linear system $$ B\begin{pmatrix}q_1\\\vdots\\q_N\end{pmatrix} = \begin{pmatrix}0\\\vdots\\0\end{pmatrix}, $$ where $B$ is an $N\times N$ matrix whose entries are quadratic expressions in the entries $(A_{i,j})_{i,j}$ of $A$. In this light, a non-trivial $Q$ exists precisely when the vector $(q_1,\ldots,q_N)$ can be chosen non-zero, i.e. when $B$ is singular: $$\det(B) = 0.$$ This is a degree-$2N$ polynomial in the $A_{i,j}$. Now consider the eigenvalue condition. Let $\lambda_1,\ldots, \lambda_n$ denote the eigenvalues of $A$. The existence of a pair of eigenvalues multiplying to one is equivalent to the equation $$ \prod_{1\leq i\leq j\leq n }\left(\lambda_i\lambda_j - 1\right) = 0. $$ Now, the left-hand side is a symmetric polynomial of degree $2N$ in $(\lambda_1,\ldots, \lambda_n)$; therefore, by the fundamental theorem of symmetric polynomials, it can be expressed as a polynomial in the elementary symmetric polynomials $$ \left(1,\ \sum_i \lambda_i, \ \sum_{i<j}\lambda_i\lambda_j,\ \sum_{i<j<k}\lambda_i\lambda_j\lambda_k,\ \ldots\ , \ \lambda_1\cdots\lambda_n\right). $$ Magically, these all appear (up to sign) as coefficients of the characteristic polynomial of $A$. Consequently, each polynomial in this list can be replaced by a homogeneous polynomial in the $A_{i,j}$ of the same degree.
Therefore the above degree-$2N$ polynomial in the $\lambda_i$ can be written as a (monstrous) degree-$2N$ polynomial in the entries $A_{i,j}$, and this polynomial must be a factor of $\det(B)$, since by the first part it is sufficient to have some $\lambda_i\lambda_j=1$ in order to have $\det(B)=0$. But the two polynomials have the same degree ($2N$), so $\det(B)$ cannot have any other non-scalar factors. Thus the eigenvalue condition is in fact also necessary: $\det(B)=0$ forces $\lambda_i\lambda_j=1$ for some pair. $\square$
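For $n=2$ (so $N=3$) the whole argument can be verified symbolically, e.g. with SymPy; the sketch below (variable names mine) builds the $N\times N$ matrix $B$ for a general $2\times2$ matrix and checks that $\det(B)$ equals $(\lambda_1\lambda_2-1)(\lambda_1^2-1)(\lambda_2^2-1)$, rewritten in terms of $d = \det A$ and $t = \operatorname{tr} A$:

```python
import sympy as sp

# Symbolic A = [[p, q], [r, s]] and symmetric Q = [[x, y], [y, z]].
p, q, r, s, x, y, z = sp.symbols('p q r s x y z')
A = sp.Matrix([[p, q], [r, s]])
Q = sp.Matrix([[x, y], [y, z]])

# The condition A^T Q A - Q = 0 is linear in (x, y, z); extract the
# N = 3 independent entries as a 3x3 system B * (x, y, z)^T = 0.
E = A.T * Q * A - Q
B, _ = sp.linear_eq_to_matrix([E[0, 0], E[0, 1], E[1, 1]], [x, y, z])

d, t = A.det(), A.trace()
# (lam1*lam2 - 1) = d - 1 and (lam1^2 - 1)(lam2^2 - 1) = (d + 1)^2 - t^2,
# so the product over 1 <= i <= j <= 2 becomes the expression below:
# a degree 2N = 6 polynomial in p, q, r, s.
factored = (d - 1) * (d + 1 - t) * (d + 1 + t)
assert sp.expand(B.det() - factored) == 0
```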
Remark.
Note that the quadratic forms which are fixed by $A$ in fact form a vector space.
Extension.
From the above, my guess is that $A$ fixes a non-degenerate quadratic form if and only if $\{\lambda \in \sigma(A)\,|\,1 \in \lambda\cdot\sigma(A)\} = \sigma(A)$, i.e. all eigenvalues are involved. Equivalently, whenever $\lambda$ is an eigenvalue of $A$, $\lambda \neq 0$ and $1/\lambda$ is also an eigenvalue of $A$. Moreover, its Jordan normal form should be a diagonal matrix. Is this sufficient? I'm not sure.
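Two small sanity checks of this guess (NumPy; the example matrices are mine): the question's $A$, whose spectrum $\{\varphi^2,\varphi^{-2}\}$ is closed under inversion, fixes the non-degenerate form $q$, while $\operatorname{diag}(2,1)$, whose eigenvalue $2$ has no reciprocal partner, fixes only degenerate forms (multiples of $\operatorname{diag}(0,1)$, as a short computation shows):

```python
import numpy as np

# A1 = [[2, 1], [1, 1]] fixes q(m, n) = m^2 - mn - n^2, whose matrix Q1
# has non-zero determinant, i.e. the form is non-degenerate.
A1 = np.array([[2.0, 1.0], [1.0, 1.0]])
Q1 = np.array([[1.0, -0.5], [-0.5, -1.0]])
assert np.allclose(A1.T @ Q1 @ A1, Q1)
assert abs(np.linalg.det(Q1)) > 1e-9   # non-degenerate

# A2 = diag(2, 1): solving A2^T Q A2 = Q forces the (1,1) and off-diagonal
# entries of Q to vanish, leaving only degenerate forms.
A2 = np.diag([2.0, 1.0])
Q2 = np.diag([0.0, 1.0])
assert np.allclose(A2.T @ Q2 @ A2, Q2)
assert abs(np.linalg.det(Q2)) < 1e-9   # degenerate
```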