Identities about matrices that are useful in contests / olympiads


I would love for students (such as myself) who want to participate in mathematics competitions or olympiads to have a place where they can gather theorems, identities, and instructive problems that come in handy in such a setting. That is why I decided to start this thread, in which people can (hopefully) add to the collective knowledge gathered here. I intend to make this thread about matrices / linear algebra. Here is my contribution:

The Determinant of a matrix:

Let $A$ be an $n \times n$ matrix with entries $a_{i,j}$. The determinant of $A$ is defined to be:

  1. $\operatorname{det}(A) = \sum\limits_{\sigma\in S_n}\operatorname{sgn}(\sigma)\prod\limits_{i=1}^n a_{i,\sigma(i)}.$

This is the Leibniz formula. It can also be defined by the Laplace formula:

  1. $\operatorname{det}(A) = \sum\limits_{i=1}^n a_{i,j} M_{i,j}(-1)^{i+j}$ for every $j \in \{1,2,\dots,n\}$, where $M_{i,j}$ is the $(i,j)$ minor of $A$, i.e. the determinant of the matrix obtained by deleting row $i$ and column $j$.
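As a sanity check, the two formulas can be compared directly on small matrices. This is a minimal Python sketch; the helper names (`sign`, `det_leibniz`, `det_laplace`) are ad hoc, and the Laplace version expands along the first column.

```python
from itertools import permutations
from math import prod

def sign(sigma):
    # sgn(sigma): +1 for an even permutation, -1 for an odd one,
    # computed by counting inversions
    inv = sum(1 for i in range(len(sigma)) for j in range(i + 1, len(sigma))
              if sigma[i] > sigma[j])
    return -1 if inv % 2 else 1

def det_leibniz(A):
    # Leibniz formula: sum over all sigma in S_n of sgn(sigma) * prod_i a_{i,sigma(i)}
    n = len(A)
    return sum(sign(s) * prod(A[i][s[i]] for i in range(n))
               for s in permutations(range(n)))

def det_laplace(A):
    # Laplace expansion along the first column:
    # det(A) = sum_i a_{i,0} * (-1)^i * M_{i,0}
    if len(A) == 1:
        return A[0][0]
    return sum(A[i][0] * (-1) ** i *
               det_laplace([row[1:] for k, row in enumerate(A) if k != i])
               for i in range(len(A)))

A = [[1, 2, 3], [4, 5, 6], [7, 8, 10]]
assert det_leibniz(A) == det_laplace(A) == -3
```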

Given these definitions one can deduce that:

  1. $\operatorname{det}(AB) = \operatorname{det}(A)\operatorname{det}(B)$

  2. $\operatorname{det}(xA) = x^n \operatorname{det}(A)$, where $x\in\mathbb{C}$

  3. $\operatorname{det}(A^T)=\operatorname{det}(A)$, where $A^T$ is the transpose of $A$.

  4. $\operatorname{det}(A^{-1}) = \frac{1}{\operatorname{det}(A)}$, provided $A$ is invertible, i.e. $\operatorname{det}(A)\neq 0$

  5. If $A$ is a triangular matrix then $\operatorname{det}(A) = \prod\limits_{i=1}^n a_{i,i} = a_{1,1}a_{2,2}\dots a_{n,n}$
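Properties 1–3 and 5 can be checked numerically on concrete integer matrices. A small sketch (the `det` and `matmul` helpers are ad hoc; `det` uses cofactor expansion along the first row):

```python
def det(A):
    # cofactor expansion along the first row
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] *
               det([row[:j] + row[j + 1:] for row in A[1:]])
               for j in range(len(A)))

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[2, 1, 0], [1, 3, 1], [0, 1, 2]]
B = [[1, 0, 2], [0, 1, 1], [1, 1, 0]]

# 1. det(AB) = det(A) det(B)
assert det(matmul(A, B)) == det(A) * det(B)

# 2. det(xA) = x^n det(A), here with n = 3 and x = 5
assert det([[5 * a for a in row] for row in A]) == 5 ** 3 * det(A)

# 3. det(A^T) = det(A)
assert det([list(col) for col in zip(*A)]) == det(A)

# 5. triangular matrix: the product of the diagonal entries
T = [[2, 7, 1], [0, 3, 5], [0, 0, 4]]
assert det(T) == 2 * 3 * 4
```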

The adjugate matrix:

The adjugate matrix is defined by $\operatorname{adj}(A) = (M_{j,i}(-1)^{i+j})_{1\leq i,j \leq n}$, i.e. the transpose of the cofactor matrix. Given this definition we can show that:

  1. $\operatorname{adj}(A)A = A\operatorname{adj}(A) = \operatorname{det}(A)I_n$

Proof: The entry of $A\operatorname{adj}(A)$ in position $(i',j')$ is $\sum\limits_{k=1}^n a_{i',k} M_{j',k}(-1)^{k+j'}$, since $\operatorname{adj}(A)_{k,j'} = (-1)^{k+j'}M_{j',k}$. This is a Laplace expansion along the $j'$-th row: if $i'=j'$ the sum is the determinant of $A$; otherwise it is the determinant of the matrix obtained from $A$ by replacing the $j'$-th row with the $i'$-th row, which has two equal rows and is therefore $0$. The argument for $\operatorname{adj}(A)A$ is analogous. This concludes the proof.

From this we can conclude that:

  1. If $\operatorname{det}(A) \neq 0$ then $A^{-1} = \frac{1}{\operatorname{det}(A)}\operatorname{adj}(A)$

  2. $\operatorname{det}(\operatorname{adj}(A)) = \operatorname{det}(A)^{n-1}$

  3. If $\operatorname{rank}(A) \leq n-2$ then $\operatorname{adj}(A) = O_n$ (because every minor $M_{i,j}$, being the determinant of an $(n-1)\times(n-1)$ submatrix, is $0$)
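All three facts can be verified on concrete matrices. A sketch with ad hoc helpers (`minor_matrix`, `det`, `adj`, `matmul`), using exact rational arithmetic from the standard library for the inverse:

```python
from fractions import Fraction

def minor_matrix(A, r, c):
    # delete row r and column c
    return [row[:c] + row[c + 1:] for k, row in enumerate(A) if k != r]

def det(A):
    # cofactor expansion along the first row
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det(minor_matrix(A, 0, j))
               for j in range(len(A)))

def adj(A):
    # adj(A)_{i,j} = (-1)^{i+j} M_{j,i}: the transpose of the cofactor matrix
    n = len(A)
    return [[(-1) ** (i + j) * det(minor_matrix(A, j, i)) for j in range(n)]
            for i in range(n)]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[2, 1, 1], [1, 3, 1], [0, 1, 2]]
n, d = len(A), det(A)
dI = [[d if i == j else 0 for j in range(n)] for i in range(n)]

# 1. adj(A) A = A adj(A) = det(A) I_n
assert matmul(adj(A), A) == dI and matmul(A, adj(A)) == dI

# A^{-1} = adj(A) / det(A)
A_inv = [[Fraction(x, d) for x in row] for row in adj(A)]
assert matmul(A, A_inv) == [[1 if i == j else 0 for j in range(n)] for i in range(n)]

# 2. det(adj(A)) = det(A)^{n-1}
assert det(adj(A)) == d ** (n - 1)

# 3. rank(A) <= n-2 forces adj(A) = O_n (here a rank-1 matrix)
R = [[1, 2, 3], [2, 4, 6], [3, 6, 9]]
assert adj(R) == [[0, 0, 0], [0, 0, 0], [0, 0, 0]]
```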

The characteristic polynomial of a matrix, Eigenvalues and Eigenvectors:

Given an $n \times n$ matrix $A$, consider the following equality:

$$Av = \lambda v,$$ where $\lambda$ is a scalar and $v$ is a non-zero column vector. If this equality holds we call $\lambda$ an eigenvalue and $v$ an eigenvector of $A$. The equality can be written equivalently as $$(\lambda I-A)v = 0.$$ A non-zero solution $v$ exists if and only if $\operatorname{det}(\lambda I-A) = 0$. The eigenvalues of $A$ are therefore the roots of the polynomial $p_A(\lambda) = \operatorname{det}(\lambda I-A)$, which is called the characteristic polynomial of $A$. This polynomial has degree $n$.

If we denote the eigenvalues of $A$ (with multiplicity) by $\lambda_1, \lambda_2, \dots, \lambda_n$, then the coefficients of this polynomial are, up to sign, the elementary symmetric polynomials in $\lambda_1, \lambda_2, \dots, \lambda_n$: the coefficient of $\lambda^{n-k}$ is $(-1)^k e_k(\lambda_1,\dots,\lambda_n)$.

Using the eigenvalues we can express the determinant and trace as:

$$\operatorname{tr}A = \lambda_1+ \lambda_2+ \dots+ \lambda_n$$ $$\operatorname{det}A = \lambda_1 \lambda_2 \dots \lambda_n$$

The Cayley–Hamilton theorem states that $ p_A(A) = 0 $. For a $2\times 2$ matrix this gives: $A^2 = \operatorname{tr}(A) A - \operatorname{det}(A) I_2$
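For a concrete $2\times 2$ example, the eigenvalues are the roots of $p_A(x) = x^2 - \operatorname{tr}(A)x + \operatorname{det}(A)$, and both the trace/determinant identities and the Cayley–Hamilton identity can be checked directly. A sketch (the names `tr`, `d`, `l1`, `l2` are ad hoc):

```python
import math

A = [[2, 1], [1, 2]]
tr = A[0][0] + A[1][1]
d = A[0][0] * A[1][1] - A[0][1] * A[1][0]

# eigenvalues: roots of p_A(x) = x^2 - tr(A) x + det(A)
disc = math.sqrt(tr ** 2 - 4 * d)
l1, l2 = (tr + disc) / 2, (tr - disc) / 2
assert (l1, l2) == (3.0, 1.0)

# trace and determinant in terms of the eigenvalues
assert l1 + l2 == tr and l1 * l2 == d

# Cayley-Hamilton for 2x2: A^2 = tr(A) A - det(A) I_2
A2 = [[sum(A[i][k] * A[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]
assert A2 == [[tr * A[i][j] - d * (i == j) for j in range(2)]
              for i in range(2)]
```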

Nilpotent Matrices:

An $n \times n$ matrix $A$ is called nilpotent if there exists a $k \in \mathbb{N}$ such that $A^k = O_n$.

  1. $A$ is nilpotent if and only if all its eigenvalues are zero.

Proof: If $A$ is nilpotent then there exists $k$ such that $A^k = O_n$. The eigenvalues of $A^k$ are $\lambda_1^k, \dots, \lambda_n^k$, and all eigenvalues of the zero matrix are zero, so $\lambda_i^k = 0$, which forces $\lambda_i = 0$.

Conversely, if every $\lambda_i = 0$ then the characteristic polynomial is $x^n$, so by Cayley–Hamilton $A^n = O_n$.

From this the following can easily be proven:

  1. $A$ is nilpotent if and only if $A^n = O_n$

  2. $A$ is nilpotent if and only if the characteristic polynomial of $A$ is $x^n$.

A more interesting result is:

  3. $A$ is nilpotent if and only if $\operatorname{tr}(A^k) = 0$ for all $k \in \{ 1,2, \dots, n \}$

Proof: Suppose $\operatorname{tr}(A^k)=0$ for all $k \in \{1,\dots,n\}$. Writing $\operatorname{tr}(A^k)$ in terms of the eigenvalues of $A$ gives the power sums $p_k = \operatorname{tr}(A^k) = \sum\limits_{i=1}^n \lambda_i^k$. By Newton's identities, $k e_k = \sum\limits_{i=1}^{k}(-1)^{i-1} e_{k-i}\, p_i$; since every $p_i = 0$, it follows by induction that every elementary symmetric polynomial $e_k$ is $0$. But up to sign these are the coefficients of $p_A$ (except the leading coefficient), hence $p_A = x^n$ and $A$ is nilpotent.

Conversely, if $A$ is nilpotent then all its eigenvalues are $0$. The traces of $A, A^2, \dots, A^n$ are then also $0$, since the eigenvalues of $A^k$ are $\lambda_1^k, \dots, \lambda_n^k$ and the trace is their sum.
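The trace criterion can be illustrated numerically. Note that all $n$ powers really are needed, as the second matrix below shows. A sketch with ad hoc helpers (`matmul`, `trace`, `power_traces`):

```python
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def trace(A):
    return sum(A[i][i] for i in range(len(A)))

def power_traces(A):
    # [tr(A), tr(A^2), ..., tr(A^n)]
    P, out = A, []
    for _ in range(len(A)):
        out.append(trace(P))
        P = matmul(P, A)
    return out

# a strictly upper triangular matrix is nilpotent: all n power traces vanish
N = [[0, 1, 0], [0, 0, 1], [0, 0, 0]]
assert power_traces(N) == [0, 0, 0]

# tr(A) = 0 alone is not enough: here tr(S) = 0 but tr(S^2) = 2,
# and indeed S is not nilpotent (S^2 = I)
S = [[1, 0], [0, -1]]
assert power_traces(S) == [0, 2]
```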