Calculate the determinant of a special matrix


I am having some difficulty calculating a determinant. The matrix is $n\times n$, but I will show the particular $5\times 5$ case to simplify the notation.

$\begin{pmatrix} 0 & 1 & 1 & 1 & 1\\ 1 & 0 & x & x & x\\ 1 & x & 0 & x & x\\ 1 & x & x & 0 & x\\ 1 & x & x & x & 0 \end{pmatrix}$

It seems easy but I am missing a step. If for every $i\geq 2$ I add $-x$ times the first row to the $i$th row then I obtain the following matrix:

$\begin{pmatrix} 0 & 1 & 1 & 1 & 1\\ 1 & -x & 0 & 0 & 0\\ 1 & 0 & -x & 0 & 0\\ 1 & 0 & 0 & -x & 0\\ 1 & 0 & 0 & 0 & -x \end{pmatrix}$

and now I can set up a recurrence: if $a_n$ denotes the $n\times n$ determinant, then expanding the $(n+1)\times(n+1)$ determinant along its last row, I come up with the formula $a_{n+1}=\left (-1\right )^{n+2}\left (-x\right )^{n-1}-xa_n=-x^{n-1}-xa_n$.

Now, if I consider some small cases I get $a_1=0$, $a_2=-1$, $a_3=2x$, $a_4=-3x^2$, $a_5=4x^3$, $a_6=-5x^4$. It seems to be $a_n=\left (-1\right )^{n+1}\left (n-1\right )x^{n-2}$.
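The conjecture can be sanity-checked numerically. The sketch below (my own helpers, not part of the original post) builds the matrix, computes its determinant by exact rational Gaussian elimination, and compares against $(-1)^{n+1}(n-1)x^{n-2}$ at a sample value of $x$:

```python
# Numeric check of the conjecture a_n = (-1)^{n+1} (n-1) x^{n-2}.
# `det` and `special_matrix` are illustrative helpers, not from the post.
from fractions import Fraction

def det(m):
    """Determinant via fraction-exact Gaussian elimination with row pivoting."""
    m = [row[:] for row in m]          # work on a copy
    n = len(m)
    sign = 1
    for c in range(n):
        pivot = next((r for r in range(c, n) if m[r][c] != 0), None)
        if pivot is None:
            return Fraction(0)         # singular matrix
        if pivot != c:
            m[c], m[pivot] = m[pivot], m[c]
            sign = -sign               # each row swap flips the sign
        for r in range(c + 1, n):
            f = Fraction(m[r][c]) / m[c][c]
            for k in range(c, n):
                m[r][k] -= f * m[c][k]
    result = Fraction(sign)
    for i in range(n):
        result *= m[i][i]              # product of the pivots
    return result

def special_matrix(n, x):
    """0 on the diagonal, 1 in the first row and column, x everywhere else."""
    a = [[x] * n for _ in range(n)]
    for i in range(n):
        a[i][i] = 0
        a[0][i] = a[i][0] = 1 if i != 0 else 0
    return a

x = Fraction(3)
for n in range(2, 8):
    expected = (-1) ** (n + 1) * (n - 1) * x ** (n - 2)
    assert det(special_matrix(n, x)) == expected
print("conjecture holds for n = 2..7 at x = 3")
```

The check passes for $n = 2,\dots,7$, which supports the conjectured closed form and suggests the error is indeed in the recurrence.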

But my problem is that the recursive formula does not let me prove this by induction. Since I don't think the conjectured general term is wrong, I must be committing a mistake in the recursive formula, but I can't see where exactly. Any help?

There are 2 solutions below.

Best answer:

Hint: Suppose the formula holds for the $n \times n$ matrix $A_n$ generalizing the matrix in your second display equation.

Then, we can calculate the determinant of $A_{n + 1}$ by cofactor expansion along the last column. Only the first and the last entries of the last column are nonzero, and the minor corresponding to the last entry is exactly $A_n$, so the cofactor expansion simplifies to $$\det A_{n + 1} = (-1)^n \det \pmatrix{1 & -x & \cdots & 0 \\ \vdots & \vdots & \ddots & 0 \\ 1 & 0 & \cdots & -x \\ 1 & 0 & \cdots & 0} - x \det A_n .$$ Can you compute the determinant in the first term on the r.h.s. and then complete the induction?
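For completeness, here is one way to finish the hint (my own completion, not part of the original answer). Expanding the first determinant along its last row, the only nonzero entry is the $1$ in position $(n,1)$, and the corresponding minor is the diagonal matrix $\operatorname{diag}(-x,\dots,-x)$ of size $n-1$, so

$$\det \pmatrix{1 & -x & \cdots & 0 \\ \vdots & \vdots & \ddots & 0 \\ 1 & 0 & \cdots & -x \\ 1 & 0 & \cdots & 0} = (-1)^{n+1}(-x)^{n-1} = x^{n-1}.$$

Hence $\det A_{n+1} = (-1)^n x^{n-1} - x\det A_n$, and inserting the inductive hypothesis $\det A_n = (-1)^{n+1}(n-1)x^{n-2}$ gives

$$\det A_{n+1} = (-1)^n x^{n-1} + (-1)^n (n-1)x^{n-1} = (-1)^{(n+1)+1}\, n\, x^{n-1},$$

which is exactly the conjectured formula for $a_{n+1}$. This also pinpoints the slip in the original recurrence: the minor contributes $x^{n-1}$, not $(-x)^{n-1}$, because of the extra sign $(-1)^{n+1}$ from the inner expansion.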


An idea: subtract row 2 from rows 3, 4 and 5, and expand along column 1:

$$\begin{vmatrix} 0 & 1 & 1 & 1 & 1\\ 1 & 0 & x & x & x\\ 1 & x & 0 & x & x\\ 1 & x & x & 0 & x\\ 1 & x & x & x & 0\end{vmatrix}=\begin{vmatrix} 0 & 1 & 1 & 1 & 1\\ 1 & 0 & x & x & x\\ 0 & x &-x & 0 & 0\\ 0 & x & 0 &-x & 0\\ 0 & x & 0 & 0 & -x\end{vmatrix}=-\begin{vmatrix} 1 & 1 & 1 & 1\\ x &-x & 0 & 0\\ x & 0 &-x & 0\\ x & 0 & 0 & -x\end{vmatrix}\stackrel{C_1\to C_1+(C_3+C_4)}=$$

$$=-\begin{vmatrix} 3 & 1 & 1 & 1\\ x &-x & 0 & 0\\ 0 & 0 &-x & 0\\ 0 & 0 & 0 & -x\end{vmatrix}=x\begin{vmatrix} 3 & 1 & 1 \\ x &-x & 0 \\ 0 & 0 &-x \end{vmatrix}=-x^2\begin{vmatrix}3&1\\x&-x\end{vmatrix}=-x^2(-3x-x)=4x^3$$

which agrees with the conjectured $a_5=4x^3$.