Calculating a determinant by induction


I need to prove the following by induction, for every $n \geq 1$: $$D(a_1,...,a_n) = \left| \begin{array}{ccccc} a_1+x& a_2 & a_3 & \cdots & a_n \\ a_1& a_2+x & a_3 & \cdots & a_n \\ a_1& a_2 & a_3+x & \cdots & a_n \\ \vdots & \vdots & \vdots & & \vdots \\ a_1& a_2 & a_3 & \cdots & a_n + x \end{array} \right| = x^n + (a_1 + \cdots + a_n)x^{n-1}$$

I played with it a bit and couldn't find a way to prove it.

This is what I did: I assumed the formula holds for $n$ and tried to prove it for $n+1$:

$$D(a_1, \ldots , a_n, a_{n+1}) = \left| \begin{array}{ccccc} a_1+x& a_2 & a_3 & \cdots & a_{n+1} \\ a_1& a_2+x & a_3 & \cdots & a_{n+1} \\ a_1& a_2 & a_3+x & \cdots & a_{n+1} \\ \vdots & \vdots & \vdots & & \vdots \\ a_1& a_2 & a_3 & \cdots & a_{n+1} + x \end{array} \right| $$

Then I applied the row operation $R_{n+1} \to R_{n+1} - R_1$, which does not change the determinant, and got:

$$ \left| \begin{array}{ccccc} a_1+x& a_2 & a_3 & \cdots & a_{n+1} \\ a_1& a_2+x & a_3 & \cdots & a_{n+1} \\ a_1& a_2 & a_3+x & \cdots & a_{n+1} \\ \vdots & \vdots & \vdots & & \vdots \\ -x& 0 & \cdots & 0 & x \end{array} \right| $$

I wasn't sure how to proceed from here, or even whether I was on the right track.

2 Answers

Accepted answer

Expanding along the last row $(-x, 0, \dots, 0, x)$ obtained from your row operation (which doesn't change the determinant), you get
$$
D(a_1,\dots,a_n,a_{n+1})=
xD(a_1,\dots,a_n)+(-1)^{(n+1)+1}(-x)\det\begin{bmatrix} a_2 & a_3 & \dots & a_n & a_{n+1} \\ a_2+x & a_3 & \dots & a_n & a_{n+1} \\ a_2 & a_3+x & \dots & a_n & a_{n+1} \\ \vdots & \vdots & \ddots & \vdots & \vdots\\ a_2 & a_3 & \dots & a_n+x & a_{n+1} \end{bmatrix}
$$
Here the entry $x$ in position $(n+1,n+1)$ has cofactor $D(a_1,\dots,a_n)$, and the minor of the entry $-x$ in position $(n+1,1)$ is the determinant displayed above.

Moving the first row of that minor to the bottom takes $n-1$ adjacent row swaps, each of which flips the sign, so the minor equals $(-1)^{n-1}$ times
$$
C=\det\begin{bmatrix} a_2+x & a_3 & \dots & a_n & a_{n+1} \\ a_2 & a_3+x & \dots & a_n & a_{n+1} \\ \vdots & \vdots & \ddots & \vdots & \vdots\\ a_2 & a_3 & \dots & a_n+x & a_{n+1} \\ a_2 & a_3 & \dots & a_n & a_{n+1} \end{bmatrix}
$$
Since $(-1)^{(n+1)+1}(-x)\cdot(-1)^{n-1}=x$, the expansion reads $D(a_1,\dots,a_n,a_{n+1})=xD(a_1,\dots,a_n)+xC$.

By linearity in the last column (write its bottom entry as $a_{n+1}=(a_{n+1}+x)-x$),
$$
C=
\det\begin{bmatrix} a_2+x & a_3 & \dots & a_n & a_{n+1} \\ a_2 & a_3+x & \dots & a_n & a_{n+1} \\ \vdots & \vdots & \ddots & \vdots & \vdots\\ a_2 & a_3 & \dots & a_n+x & a_{n+1} \\ a_2 & a_3 & \dots & a_n & a_{n+1}+x \end{bmatrix}-
\det\begin{bmatrix} a_2+x & a_3 & \dots & a_n & 0 \\ a_2 & a_3+x & \dots & a_n & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots\\ a_2 & a_3 & \dots & a_n+x & 0 \\ a_2 & a_3 & \dots & a_n & x \end{bmatrix}
=D(a_2,\dots,a_{n+1})-xD(a_2,\dots,a_n)
$$
Therefore
$$
D(a_1,\dots,a_n,a_{n+1})=
xD(a_1,\dots,a_n)+
x\bigl(D(a_2,\dots,a_{n+1})-xD(a_2,\dots,a_n)\bigr)
$$
By the induction hypothesis (applied at sizes $n$ and $n-1$),
$$
\begin{aligned}
&xD(a_1,\dots,a_n)+x\bigl(D(a_2,\dots,a_{n+1})-xD(a_2,\dots,a_n)\bigr)\\
&\qquad=x\bigl(x^n+(a_1+\dots+a_n)x^{n-1}\bigr)+x\bigl(x^n+(a_2+\dots+a_{n+1})x^{n-1}-x^n-(a_2+\dots+a_n)x^{n-1}\bigr)\\
&\qquad=x^{n+1}+(a_1+\dots+a_{n+1})x^n
\end{aligned}
$$
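As a quick sanity check of the recurrence (not part of the original argument), take $n=2$:
$$
D(a_1,a_2,a_3)=xD(a_1,a_2)+x\bigl(D(a_2,a_3)-xD(a_2)\bigr)=x\bigl(x^2+(a_1+a_2)x\bigr)+x\bigl(x^2+(a_2+a_3)x-x(x+a_2)\bigr)=x^3+(a_1+a_2+a_3)x^2
$$
which matches the claimed formula for three variables.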

Note that you have to check the induction basis for both $n=1$ and $n=2$, which is easy: the recurrence expresses a determinant of size $n+1$ through the sizes $n$ and $n-1$ (the term $D(a_2,\dots,a_n)$ has only $n-1$ arguments), so two consecutive base cases are needed.
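Written out, both base cases agree with $x^n + (a_1 + \cdots + a_n)x^{n-1}$:
$$
D(a_1)=\begin{vmatrix} a_1+x \end{vmatrix}=x+a_1,
\qquad
D(a_1,a_2)=\begin{vmatrix} a_1+x & a_2 \\ a_1 & a_2+x \end{vmatrix}=(a_1+x)(a_2+x)-a_1a_2=x^2+(a_1+a_2)x
$$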

Another answer

First, expanding with respect to the last column, your $D_{n+1}$ is of the form $Aa_{n+1}+B$, i.e. it is linear in $a_{n+1}$. If you put $a_{n+1}=0$, the last column becomes $(0,\dots,0,x)^T$ and expanding along it gives $B=xD_n$. To find $A$, divide the entries of the last column by $a_{n+1}$, which divides the whole determinant by $a_{n+1}$, and let $a_{n+1} \to +\infty$: the quotient tends to $A$, and the last column of the limiting determinant $T$ has only $1$s as entries. Now in $T$ subtract $a_1$ times the last column from the first column, and do the same for the other columns; this leaves an upper triangular determinant with diagonal $(x,\dots,x,1)$, so $T=x^n$. Hence $A=x^n$ and $D_{n+1}=x^n a_{n+1}+xD_n$, and we are done.
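Spelling out the last step, which the answer leaves implicit: with the base case $D_1 = x + a_1$ and the induction hypothesis $D_n = x^n + (a_1 + \cdots + a_n)x^{n-1}$, the recurrence gives
$$
D_{n+1}=x^n a_{n+1}+x\bigl(x^n+(a_1+\cdots+a_n)x^{n-1}\bigr)=x^{n+1}+(a_1+\cdots+a_{n+1})x^n
$$
which is exactly the claimed formula.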