Multiplicity of factors of a determinant by observation


Consider the determinant

$$ \Delta = \begin{vmatrix} x+1 & 3 & 5 \\ 2 & x+2 & 5 \\ 2 & 3 & x+4 \end{vmatrix}$$

It can be explicitly shown to be $$\Delta = (x-1)^2(x+9)$$
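(As a quick sanity check, the factorization can be verified symbolically, e.g. with sympy:)

```python
# Symbolic check of the stated factorization (sympy assumed available).
from sympy import symbols, Matrix, factor

x = symbols('x')
Delta = Matrix([[x + 1, 3, 5],
                [2, x + 2, 5],
                [2, 3, x + 4]]).det()
print(factor(Delta))  # (x - 1)**2*(x + 9)
```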

Suppose we want to write down this value without direct calculation.

On putting $x=1$, the first column $C_1$ becomes a scalar multiple of each of the other columns $C_2$ and $C_3$ (and likewise $C_2$ and $C_3$ become scalar multiples of each other), making the determinant $\Delta = 0$.

What can be concluded from this? Can we say that $(x-1)^2$ is a factor of $\Delta$, i.e., with multiplicity $2$?

But clearly $(x-1)^3$ is not a factor of $\Delta$.

Can the multiplicity of $(x-x_0)$ in $\Delta$ be found by such an observation? If so, what is the highest power of $(x-x_0)$ that can be concluded correctly for an $n \times n$ determinant when $k$ out of $n$ columns are scalar multiples of each other on substituting $x=x_0$?

Please explain the above $3 \times 3$ case before the $n \times n$ case. Your help is appreciated.

There are 2 answers below.

BEST ANSWER

The rank of the matrix you get is what gives you hints about how many times a root appears. In general, the rule is:

If setting $x=x_0$ gives us an $n \times n$ matrix with rank $n-k$, then $(x-x_0)^k$ is a factor of the determinant, i.e. the multiplicity of the root $x_0$ is at least $k$.

This applies to any matrix where the entries are linear expressions in $x$ (such as $x+1$ or $3$).

In your example, setting $x=1$ gives us a $3 \times 3$ matrix with rank $1$: all rows are multiples of the same row vector. Since the rank $1$ equals $n-k = 3-2$, the rule gives a factor of $(x-1)^2$ in the determinant.
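The rank computation is easy to confirm mechanically (a sketch with sympy):

```python
from sympy import Matrix

# The matrix from the question at x = 1: every row is (2, 3, 5).
M = Matrix([[2, 3, 5],
            [2, 3, 5],
            [2, 3, 5]])
print(M.rank())  # 1, so k = n - rank = 3 - 1 = 2
```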


Let me try to explain the logic behind the rule in the $3 \times 3$ example. The argument generalizes, but the generalization is harder to read.

Suppose we use a different variable in each row: for example, $$ \Delta(x,y,z) = \begin{vmatrix} x+1 & 3 & 5 \\ 2 & y+2 & 5 \\ 2 & 3 & z+4 \end{vmatrix} $$ The polynomial $\Delta(x,y,z)$ cannot have $x^2$, $y^2$, or $z^2$ appearing anywhere. (This is true here because each variable appears in only one entry; it is true in general whenever each variable appears in only one row.) This means that we can write $$ \Delta(x,y,z)=A + B(x-1) + C(y-1) + D(z-1) + E(x-1)(y-1) + F(x-1)(z-1) + G(y-1)(z-1) + H(x-1)(y-1)(z-1) $$ where $A,B,C,D,E,F,G,H$ are constants. Because $\Delta(1,1,1)=0$, we know $A=0$.

Because the matrix with $x=y=z=1$ has rank $1$, we know that actually $\Delta(1,1,z)=0$ for any $z$: if we set $x=y=1$, the first two rows are already multiples of each other, so the determinant is $0$. Similarly, $\Delta(1,y,1)=0$ for any $y$ and $\Delta(x,1,1)=0$ for any $x$. This tells us that in the expansion above, $A=B=C=D=0$.

Therefore $$ \Delta(x,y,z) = E(x-1)(y-1) + F(x-1)(z-1) + G(y-1)(z-1) + H(x-1)(y-1)(z-1) $$ which means that $$ \Delta(x,x,x) = (E+F+G)(x-1)^2 + H(x-1)^3 $$ which is divisible by $(x-1)^2$.
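The key vanishing facts used above can be checked symbolically (a sympy sketch):

```python
from sympy import symbols, Matrix

x, y, z = symbols('x y z')
D = Matrix([[x + 1, 3, 5],
            [2, y + 2, 5],
            [2, 3, z + 4]]).det()

# Setting any two of the three variables to 1 leaves two proportional rows,
# so the determinant vanishes identically in the remaining variable.
print(D.subs({x: 1, y: 1}).expand())  # 0
print(D.subs({x: 1, z: 1}).expand())  # 0
print(D.subs({y: 1, z: 1}).expand())  # 0
```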

ANSWER

Consider the matrix $$ A= -\begin{bmatrix} 1 & 3 & 5\\ 2 & 2 & 5\\ 2 & 3 & 4 \end{bmatrix}\ . $$ The determinant in the question is then the characteristic polynomial of this matrix, $P_A(x)=\det(xI-A)$.

The factor $(x-1)$ in $P_A$ was seen because $P_A(1)=\det(I-A)=0$: we have a column relation for $I-A$, which is explicitly $$ I-A= \begin{bmatrix} 2 & 3 & 5\\ 2 & 3 & 5\\ 2 & 3 & 5 \end{bmatrix} = \begin{bmatrix} 1\\ 1\\ 1 \end{bmatrix} \begin{bmatrix} 2 & 3 & 5 \end{bmatrix}\ . $$ But we can exhibit with bare eyes not just one linear relation between the column vectors $C_1$, $C_2$, $C_3$ of $I-A$, but two linearly independent ones, e.g. $3C_1-2C_2=0$ and $5C_1-2C_3=0$. Equivalently, the two linearly independent vectors $v=[3\ -2\ 0]^T$ and $w=[5\ 0\ -2]^T$ satisfy $(I-A)v=(I-A)w=0$, i.e. they are eigenvectors of $A$ for the eigenvalue $1$. So $1$ has an eigenspace of dimension at least $2$, hence is an eigenvalue of multiplicity at least two, and $(x-1)^2$ divides $P_A(x)$.
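The two relations can be verified directly (a quick sympy check):

```python
from sympy import Matrix, eye

A = -Matrix([[1, 3, 5],
             [2, 2, 5],
             [2, 3, 4]])
v = Matrix([3, -2, 0])
w = Matrix([5, 0, -2])

# v and w are two independent vectors killed by I - A,
# i.e. eigenvectors of A for the eigenvalue 1.
print((eye(3) - A) * v)  # Matrix([[0], [0], [0]])
print((eye(3) - A) * w)  # Matrix([[0], [0], [0]])
```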

To see with bare eyes that the multiplicity is not three, there are several possibilities:

  • Look for another eigenvalue; we will find one shortly, namely $-9$.
  • The Jordan form of the quasi-projector $(I-A)$ is diagonal, and its kernel, the $1$-eigenspace, does not have full dimension three, else we would have $I-A=0$.
  • $P_A(x)$ is a polynomial in $\Bbb Z[x]$, so we may compute it modulo some small prime like two or three. It turns out that modulo three $P_A(0)$ is $$ \pm\begin{vmatrix} 1 & 3 & 5\\ 2 & 2 & 5\\ 2 & 3 & 4 \end{vmatrix} \equiv \pm\begin{vmatrix} 1 & 0 & -1\\ * & 2 & *\\ -1 & 0 & 1 \end{vmatrix} \equiv 0\pmod3\ , $$ the first and third rows being negatives of each other. Since $(0-1)^3=-1\not\equiv 0\pmod 3$, the possibility $P_A(x)=(x-1)^3$ is excluded by passing to $\Bbb Z/3$. This, too, is easy for the bare eyes.
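The mod-3 exclusion in the last bullet can be confirmed numerically (sympy sketch):

```python
from sympy import Matrix

A = -Matrix([[1, 3, 5],
             [2, 2, 5],
             [2, 3, 4]])
P0 = (-A).det()  # P_A(0) = det(0*I - A) = det(-A)

# P_A(0) = 9 is divisible by 3, while (0 - 1)**3 = -1 is not,
# so P_A(x) = (x - 1)**3 is excluded mod 3.
print(P0, P0 % 3)  # 9 0
```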

One sees more or less easily that $-9$ is an eigenvalue of $A$: consider $$ (-9)I-A = \begin{bmatrix} -8 & 3 & 5\\ 2 & -7 & 5\\ 2 & 3 & -5 \end{bmatrix}\ , $$ and observe that the sum of the entries in each row is zero. (So the sum of the columns is the zero vector.) This means that $x-(-9)=x+9$ is also a factor of $P_A(x)$. The characteristic polynomial is monic, so $P_A(x)=(x-1)^2(x+9)$.

A comment complains that the eigenvalue $-9$ is "not easy". Well, it "is easy" - if we manage to observe that $A$ has the same sum in each row. But suppose this is too complicated. So far we know that $P_A(x)=(x-1)^2(x-a)$ with some eigenvalue $a$ still to be determined. Then just compute $P_A(0)=\det(-A)=9$, or some other value of $P_A$; since $P_A(0)=(0-1)^2(0-a)=-a$, we get $a=-9$.
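The evaluation trick can be spelled out in code (a sketch with sympy):

```python
from sympy import symbols, Matrix, eye, solve

x, a = symbols('x a')
A = -Matrix([[1, 3, 5],
             [2, 2, 5],
             [2, 3, 4]])
P = (x * eye(3) - A).det()

# Knowing P(x) = (x - 1)**2 * (x - a), evaluate at x = 0:
# P(0) = (-1)**2 * (0 - a) = -a, so solve -a = P(0) for a.
print(solve((0 - 1)**2 * (0 - a) - P.subs(x, 0), a))  # [-9]
```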


For the "general" case of an $n\times n$ matrix the situation is the same: compute the (generalized) eigenspaces for a given eigenvalue, either with bare eyes, or by solving linear systems of equations explicitly.
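For the $n\times n$ case, the rank-based lower bound on the multiplicity can be packaged as a small helper (a sketch; the function name is my own, not from the answer):

```python
from sympy import Matrix, eye

def multiplicity_lower_bound(A, x0):
    """Return n - rank(x0*I - A), the geometric multiplicity of x0:
    a lower bound for the multiplicity of (x - x0) in det(x*I - A)."""
    n = A.rows
    return n - (x0 * eye(n) - A).rank()

A = -Matrix([[1, 3, 5],
             [2, 2, 5],
             [2, 3, 4]])
print(multiplicity_lower_bound(A, 1))   # 2
print(multiplicity_lower_bound(A, -9))  # 1
```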