Question: Let $V$ be an $n$-dimensional vector space, and let $T : V \rightarrow V$ be a linear map. Suppose that there is a vector $v \in V$ such that $T^{n-1}(v) \ne 0$ but $T^n(v) = 0$. Show that the set $\{v, T(v), T^2(v), \cdots, T^{n-1}(v)\}$ is linearly independent, and hence is a basis of $V$. Find the matrix of $T$ with respect to this basis (in the given order).
I can see the intuition. Let $T_1$, $T_2$ be the shift transformations defined by $T_1(x_1,x_2,\cdots,x_n) = (0,x_1,\cdots,x_{n-1})$ and, similarly, $T_2(x_1,x_2,\cdots,x_n) = (x_2,\cdots,x_n, 0)$. Then the set (for $T_1$) is $\{v, T_1(v),\cdots, T_1^{n-1}(v)\} = \{(x_1,x_2,\cdots,x_n), (0,x_1,\cdots,x_{n-1}),\cdots, (0,0,\cdots,x_1)\}$ and $T_1^n(x)=(0,0,\cdots,0)$ (similarly for $T_2$).
I can see that this set is linearly independent for $T_1$ (when $x_1 \ne 0$). But how can we prove this in general? I think it's similar to the proof of independence for the columns of an upper/lower triangular matrix here, but I don't understand that proof.
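To make the intuition concrete, here is a small numerical sketch (a hypothetical example with $n = 4$ and $v = e_1$, using NumPy) of the shift map $T_1$, checking that the iterates $v, T_1(v), \dots, T_1^{n-1}(v)$ are linearly independent while $T_1^n(v) = 0$:

```python
import numpy as np

n = 4
# T1 shifts coordinates down: (x1,...,xn) -> (0, x1, ..., x_{n-1});
# as a matrix it has ones on the subdiagonal.
T1 = np.diag(np.ones(n - 1), k=-1)

v = np.array([1.0, 0.0, 0.0, 0.0])  # hypothetical choice of cyclic vector
iterates = [np.linalg.matrix_power(T1, k) @ v for k in range(n)]

assert np.any(np.linalg.matrix_power(T1, n - 1) @ v)      # T^{n-1}(v) != 0
assert not np.any(np.linalg.matrix_power(T1, n) @ v)      # T^n(v) == 0
# the n iterates form a basis: stacking them as columns gives full rank
assert np.linalg.matrix_rank(np.column_stack(iterates)) == n
```

Of course this only checks one concrete operator; the proof below works for any $T$ satisfying the hypotheses.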
The matrix is the shift matrix: with the basis ordered as $v, T(v), \cdots, T^{n-1}(v)$, it has ones only on the subdiagonal and zeroes elsewhere (reversing the order of the basis moves the ones to the superdiagonal).
First part: Suppose the set is linearly dependent. Since $v \neq 0$ (otherwise $T^{n-1}(v) = 0$), there is a smallest $j \geq 1$ such that $\{v, T(v), \dots, T^{j-1}(v)\}$ is a linearly independent set but adjoining $T^j(v)$ makes the set linearly dependent.
Then $$T^j(v) = \sum_{m=0}^{j-1} a_m T^m(v)$$
Observe that if $T^n(v) = 0$ then $T^{n+m}(v) = 0$ for all $m \geq 0$, so we can apply $T$ to both sides just enough times that the left-hand side is $0$ and all but one of the terms on the right-hand side vanish. Applying $T$ to both sides $n-1$ times (the left-hand side vanishes because $j + n - 1 \geq n$) gives $$0 = T^{j+n-1}(v) = \sum_{m=0}^{j-1} a_m T^{m+n-1}(v) = a_0 T^{n-1}(v),$$ and since $T^{n-1}(v) \neq 0$ this forces $a_0 = 0$. Applying $T$ only $n-2$ times then forces $a_1 = 0$, and so on, until every $a_m = 0$. But then $T^j(v) = 0$, and since $j \leq n-1$ this gives $T^{n-1}(v) = T^{n-1-j}(T^j(v)) = 0$, a contradiction. Hence the set is linearly independent, and since it consists of $n$ vectors in an $n$-dimensional space, it is a basis of $V$.
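The coefficient-killing step can be checked symbolically (a sketch with SymPy, using the subdiagonal shift matrix as a stand-in for $T$, with $n = 4$ and a hypothetical dependence at $j = 3$):

```python
import sympy as sp

n = 4
# subdiagonal shift matrix, standing in for a nilpotent T with T^n = 0
T = sp.Matrix(n, n, lambda i, j: 1 if i == j + 1 else 0)
v = sp.Matrix([1, 0, 0, 0])
a0, a1, a2 = sp.symbols('a0 a1 a2')

# hypothetical dependence: T^3(v) = a0*v + a1*T(v) + a2*T^2(v)
rhs = a0 * v + a1 * (T * v) + a2 * (T**2 * v)

# applying T another n-1 = 3 times kills every term except the a0 one...
assert T**3 * rhs == a0 * (T**3 * v)
# ...while the left-hand side T^{j+n-1}(v) is already zero
assert T**3 * (T**3 * v) == sp.zeros(n, 1)
```

Since $T^3(v) = T^{n-1}(v) \neq 0$ here, the two assertions together force $a_0 = 0$, exactly as in the argument above.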
Second part: Since $T(T^i(v)) = T^{i+1}(v)$ for $0 \leq i \leq n-2$ and $T(T^{n-1}(v)) = T^n(v) = 0$, the map $T$ sends each basis vector to the next one and the last to $0$. Its matrix with respect to the ordered basis $(v, T(v), \dots, T^{n-1}(v))$ is therefore the shift matrix with $1$'s on the subdiagonal and $0$'s elsewhere: $$\begin{pmatrix} 0 & 0 & \cdots & 0 & 0 \\ 1 & 0 & \cdots & 0 & 0 \\ 0 & 1 & \cdots & 0 & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & \cdots & 1 & 0 \end{pmatrix}.$$
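As a sanity check, here is a numerical sketch (a hypothetical $n = 4$ example): we build a nilpotent $T$ by conjugating the shift block by a random invertible matrix, pick a suitable $v$, and verify that the change of basis recovers the subdiagonal shift matrix.

```python
import numpy as np

n = 4
rng = np.random.default_rng(0)
S = np.diag(np.ones(n - 1), k=-1)      # subdiagonal shift block
Q = rng.standard_normal((n, n))        # random invertible change of basis
T = Q @ S @ np.linalg.inv(Q)           # nilpotent: T^n = 0, T^{n-1} != 0

v = Q[:, 0]                            # then T^{n-1}(v) != 0 and T^n(v) = 0
basis = np.column_stack([np.linalg.matrix_power(T, k) @ v for k in range(n)])

# matrix of T in the basis (v, Tv, ..., T^{n-1}v): solve basis @ M = T @ basis
M = np.linalg.solve(basis, T @ basis)
assert np.allclose(M, S, atol=1e-8)    # ones on the subdiagonal only
```

The same computation works for any nilpotent $T$ of index $n$, as long as $v$ is chosen with $T^{n-1}(v) \neq 0$.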