Problem: Let $A\in M_{n\times n}(\mathbb R)\,$ be a matrix and suppose there exists a positive integer $k\,$ such that $A^k = 0\,$ and $A^{k-1} \neq 0$.
Suppose that $x=\left[ \begin{matrix} x_1 \\ \vdots \\ x_n \end{matrix} \right]$ is a vector in $\mathbb{R}^n$ such that $A^{k-1} x \neq 0$.
Prove that the $k\,$ vectors $\,x,Ax,\dots,A^{k-1}x\,$ are linearly independent.
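To make the hypotheses concrete, here is a minimal numerical sketch (my own illustration, not part of the problem): a $3\times3$ nilpotent Jordan block satisfies $A^3=0$ and $A^2\neq0$ (so $k=3$), and the last standard basis vector serves as an $x$ with $A^{k-1}x\neq0$.

```python
import numpy as np

# A 3x3 nilpotent Jordan block: here k = 3, since A^3 = 0 but A^2 != 0.
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [0, 0, 0]])

A2 = A @ A
A3 = A2 @ A
assert np.all(A3 == 0)   # A^k = 0
assert np.any(A2 != 0)   # A^{k-1} != 0

# A vector x with A^{k-1} x != 0: the last standard basis vector works.
x = np.array([0, 0, 1])
print(A2 @ x)  # -> [1 0 0], which is nonzero
```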
My attempt: Suppose $x + Ax + \dots + A^{k-1}x = 0$. Multiply both sides by $A^{k-1}$. Then we have $A^{k-1}x + A^k (x + Ax + \dots + A^{k-2}x) = 0 \Leftrightarrow A^{k-1}x = 0 \Leftrightarrow x = 0$,
which implies $x,Ax,\dots,A^{k-1}x\,$ are linearly independent.
This problem looks quite easy but I want my proof to be checked. Is it correct?
Take $\alpha_0,\ldots,\alpha_{k-1}\in\mathbb R$ and suppose that$$\alpha_0x+\alpha_1Ax+\cdots+\alpha_{k-1}A^{k-1}x=0.\tag1$$Then $A^{k-1}(\alpha_0x+\alpha_1Ax+\cdots+\alpha_{k-1}A^{k-1}x)=0$. Since $A^k=0$, every term but the first vanishes, so $\alpha_0A^{k-1}x=0$ and, since $A^{k-1}x\neq0$, $\alpha_0=0$. So, $(1)$ becomes$$\alpha_1Ax+\alpha_2A^2x+\cdots+\alpha_{k-1}A^{k-1}x=0.\tag2$$Now, start all over again, multiplying $(2)$ by $A^{k-2}$, and so on.
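As a numerical sanity check of the conclusion (an illustration, not a proof): for the $3\times3$ Jordan block above with $x=e_3$, the vectors $x, Ax, A^2x$ should be linearly independent, i.e. the matrix with these columns should have full rank.

```python
import numpy as np

# The same 3x3 nilpotent Jordan block (k = 3) and x = e_3 as above.
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [0, 0, 0]])
x = np.array([0, 0, 1])

# Columns x, Ax, A^2 x; linear independence <=> full rank.
vectors = np.column_stack([x, A @ x, A @ A @ x])
print(np.linalg.matrix_rank(vectors))  # -> 3, so the k vectors are independent
```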