Prove that the set $\left\{ x, Ax, \dots, A^{k-1} x \right\}$ is linearly independent


Problem: Let $A\in M_{n\times n}(\mathbb R)\,$ be a matrix and suppose that a positive number $k\,$ exists such that $A^k = 0\,$ and $A^{k-1} \neq 0$. Suppose that $x=\left[ \begin{matrix} x_1 \\ \vdots \\ x_n \end{matrix} \right]$ is a vector in $\mathbb{R^n}$ such that $A^{k-1} x \neq 0$.
Prove that the $k\,$ vectors $\,x,Ax,\dots,A^{k-1}x\,$ are linearly independent.

My attempt: Suppose $x + Ax + \dots + A^{k-1}x = 0$. Multiply both sides by $A^{k-1}$. Then we have $A^{k-1}x + A^k (x + Ax + \dots + A^{k-2}x) = 0 \Leftrightarrow A^{k-1}x = 0 \Leftrightarrow x = 0$,
which implies that $\{x, Ax, \dots, A^{k-1}x\}$ is linearly independent.

This problem looks quite easy but I want my proof to be checked. Is it correct?

2 Answers

BEST ANSWER

Take $\alpha_0,\ldots,\alpha_{k-1}\in\mathbb R$ and suppose that$$\alpha_0x+\alpha_1Ax+\cdots+\alpha_{k-1}A^{k-1}x=0.\tag1$$Then $A^{k-1}(\alpha_0x+\alpha_1Ax+\cdots+\alpha_{k-1}A^{k-1}x)=0$; since $A^k=0$, every term except the first vanishes, so $\alpha_0A^{k-1}x=0$ and, since $A^{k-1}x\neq0$, $\alpha_0=0$. So, $(1)$ becomes$$\alpha_1Ax+\alpha_2A^2x+\cdots+\alpha_{k-1}A^{k-1}x=0.\tag2$$Now, start all over again, multiplying $(2)$ by $A^{k-2}$ to get $\alpha_1=0$, and so on.
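As a quick numerical sanity check (not part of the proof), one can take a concrete nilpotent matrix and verify that the $k$ vectors $x, Ax, \dots, A^{k-1}x$ span a $k$-dimensional space. The choice of a $4\times4$ Jordan block and $x = e_4$ here is just an illustrative example:

```python
import numpy as np

k = 4
# Nilpotent Jordan block: ones on the superdiagonal, so A**k == 0 but A**(k-1) != 0
A = np.diag(np.ones(k - 1), 1)
x = np.zeros(k)
x[-1] = 1.0  # x = e_k, chosen so that A^(k-1) x != 0

# Stack x, Ax, ..., A^(k-1) x as columns and check the rank
vectors = [np.linalg.matrix_power(A, j) @ x for j in range(k)]
M = np.column_stack(vectors)

print(np.allclose(np.linalg.matrix_power(A, k), 0))  # True: A^k = 0
print(np.linalg.matrix_rank(M))                      # k: the vectors are independent
```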

ANOTHER ANSWER

Assume that $\{x, Ax, \ldots, A^{k-1}x\}$ is linearly dependent. Then there is a nonzero polynomial of degree at most $k-1$ annihilating $x$; let $p \in \mathbb{R}[t]$ be a nonzero polynomial of minimal degree such that $p(A)x = 0$, so $\deg p \le k-1$.

From $A^k = 0$ and $A^{k-1} \ne 0$ we see that the minimal polynomial of $A$ is $m_A(t) = t^k$. We have $m_A(A)x = 0$, and since $p$ has minimal degree among the polynomials annihilating $x$, $p$ divides $m_A$.

Hence $p(t) = ct^j$ for some $0 \le j \le k-1$ and $c \ne 0$, so $A^jx = 0$. But then $A^{k-1}x = A^{k-1-j}\left(A^jx\right) = 0$, which contradicts $A^{k-1}x \ne 0$.
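The claim that $m_A(t) = t^k$ can also be checked numerically: for a nilpotent matrix, the minimal polynomial is $t^m$ where $m$ is the smallest power with $A^m = 0$, and for the example Jordan block below that power is exactly $k$ (the matrix and size are illustrative assumptions, as above):

```python
import numpy as np

k = 4
A = np.diag(np.ones(k - 1), 1)  # nilpotent: A^k = 0, A^(k-1) != 0

# Smallest m with A^m == 0; the minimal polynomial is then t^m
m = next(j for j in range(1, k + 1)
         if np.allclose(np.linalg.matrix_power(A, j), 0))
print(m)  # k, so m_A(t) = t^k, not t^(k-1)
```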