Showing a Matrix is Nilpotent of a certain degree using only properties of matrix multiplication/summations


Suppose I had an $n \times n$ matrix $$N = \begin{pmatrix} 0 & 1 & 0 & 0 & \cdots & 0 & 0\\0 & 0 & 1 & 0 & \cdots & 0 & 0 \\ 0 & 0 & 0 & 1 & \cdots & 0 & 0 \\\vdots & \vdots & \vdots & \vdots & & \vdots& \vdots \\ 0 & 0 & 0 & 0 & \cdots & 1 & 0\\ 0 & 0 & 0& 0& \cdots & 0 & 0 \end{pmatrix}$$

i.e., a matrix where every entry is zero, except for along the superdiagonal, where all entries are 1, and I wanted to show that $N^{k}=0$ for $k\geq n$ but $N^{k}\neq 0$ for $k<n$.

What is a slick way to prove this without invoking nilpotency as an abstract concept, linear operators, or the Cayley–Hamilton theorem? That is, a way to prove it purely using summations and the properties of matrix multiplication? I've been trying to have a go at it, but it keeps getting very cumbersome.

I know that if you start with $N^{2}$, the diagonal of $1$'s shifts up to the second superdiagonal (the entries $n_{1,3}, n_{2,4}, \dots, n_{n-2,n}$), and so on, until eventually, for $N^{n-1}$, all we're left with is a single $1$ in the $n_{1,n}$ entry; but I don't know how to prove this in the way that I am asking.

Could someone please help?
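To see the shifting pattern concretely before proving it, here is a minimal NumPy sketch (the size $n=5$ is an arbitrary choice for illustration): $N^k$ has its ones on the $k$-th superdiagonal, so $N^{n-1}$ has a single $1$ in the $(1,n)$ entry and $N^n$ is the zero matrix.

```python
import numpy as np

n = 5  # small example size; the pattern is the same for any n
# N: ones on the first superdiagonal, zeros elsewhere
N = np.diag(np.ones(n - 1), k=1)

# N^k has its ones on the k-th superdiagonal (entries (i, i+k)),
# so N^{n-1} has a single 1 in position (1, n) and N^n is zero
powers = {k: np.linalg.matrix_power(N, k) for k in range(1, n + 1)}
```

This only verifies the claim for one $n$, of course; the answers below prove it in general.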

4 Answers

Accepted answer:

We have $Ne_1=0$ and $Ne_j=e_{j-1}$ for $j\ge 2$. Then

$N^2e_2=Ne_1=0$

$N^3e_3=N^2e_2=0$

$\cdots$

$N^ne_n=N^{n-1}e_{n-1}=0$

and so $N^ne_j=0$ for all $j=1,\dots,n$.

Another answer:

Start with a random vector in floating point and apply matrix–vector multiplication repeatedly until you reach the zero vector, keeping track of how many multiplications were required. Unless you were incredibly unlucky and your starting vector happened to lie in a proper invariant subspace (one spanned by only some of the Jordan-basis vectors), this will take exactly $n$ multiplications.
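A sketch of this experiment in NumPy (seed and size are arbitrary choices). Note that for this particular $N$ the arithmetic is exact even in floating point, since multiplying by $N$ merely shifts the coordinates:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
N = np.diag(np.ones(n - 1), k=1)

v = rng.standard_normal(n)
count = 0
while np.any(v != 0):
    v = N @ v  # shifts coordinates: (Nv)_i = v_{i+1}, last entry becomes 0
    count += 1
# A random vector has all coordinates nonzero with probability 1,
# so the loop runs exactly n times
```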

Another answer:

If $\{E_{k,j}\}$ denote the matrix units, you have $$ E_{k,j}E_{s,t}=\delta_{j,s}E_{k,t}. $$ We can write $$ N=\sum_{k=1}^{n-1}E_{k,k+1}. $$ Now you can compute $$ N^2=\sum_{j,k=1}^{n-1}E_{k,k+1}E_{j,j+1}=\sum_{k=1}^{n-2}E_{k,k+2}. $$ By induction, for every $m<n$, $$ N^m=\sum_{k=1}^{n-m}E_{k,k+m}. $$ In particular, $$ N^{n-1}=E_{1,n}, $$ and $$ N^n=N\,N^{n-1}=\sum_{k=1}^{n-1}E_{k,k+1}E_{1,n}=0, $$ since $E_{k,k+1}E_{1,n}=\delta_{k+1,1}E_{k,n}=0$ for every $k\geq 1$.
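The matrix-unit calculus above translates directly into code. A sketch (the helper `E` and the formula `Nm` are just names for the objects in the answer; $n=5$ is arbitrary):

```python
import numpy as np

n = 5

def E(k, j):
    """Matrix unit E_{k,j}: 1 in entry (k, j) (1-indexed), 0 elsewhere."""
    M = np.zeros((n, n))
    M[k - 1, j - 1] = 1
    return M

# Product rule E_{k,j} E_{s,t} = delta_{j,s} E_{k,t}:
assert np.array_equal(E(1, 2) @ E(2, 4), E(1, 4))  # indices match
assert np.all(E(1, 2) @ E(3, 4) == 0)              # indices don't match

N = sum(E(k, k + 1) for k in range(1, n))

def Nm(m):
    """The claimed closed form: N^m = sum_{k=1}^{n-m} E_{k,k+m}."""
    return sum(E(k, k + m) for k in range(1, n - m + 1))
```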

Another answer:

From the usual definition $c_{ij}=\sum_k a_{ik}b_{kj}$ of the product of two matrices, we can deduce that each column of the matrix product $AB$ is a linear combination of the columns of $A$ with coefficients given by the corresponding column of $B$. There’s a similar conclusion about the rows of the product. From this point of view you can see that $N$ is a shift operator, from which nilpotency shouldn’t be too hard to prove.
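The column view can be demonstrated in a few lines of NumPy: since column $j$ of $N$ is $e_{j-1}$ (and column $1$ is zero), right-multiplying any $A$ by $N$ shifts the columns of $A$ one place to the right. A sketch with a random $A$ (size and seed arbitrary):

```python
import numpy as np

n = 4
N = np.diag(np.ones(n - 1), k=1)
rng = np.random.default_rng(1)
A = rng.standard_normal((n, n))

# Column j of A @ N is A times column j of N; column j of N is e_{j-1}
# (column 1 is zero), so A @ N is A with its columns shifted right
AN = A @ N
shifted = np.array_equal(AN[:, 1:], A[:, :-1]) and np.all(AN[:, 0] == 0)
```

Iterating this shift $n$ times pushes every column of $A$ off the right edge, which is exactly the statement $A N^n = 0$ for all $A$, and in particular $N^n = 0$.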