Show linear independence for a set of increasing powers of linear transformations


Question: Let $V$ be an $n$-dimensional vector space, and let $T : V \rightarrow V$ be a linear map. Suppose that there is a vector $v \in V$ such that $T^{n-1}(v) \ne 0$ but $T^n(v) = 0$. Show that the set $\{v, T(v), T^2(v),\cdots, T^{n-1}(v)\}$ is linearly independent, and hence is a basis of $V$. Find the matrix of $T$ with respect to this basis (in the given order).

I can see the intuition. Let $T_1$, $T_2$ be the shift transformations defined by $T_1(x_1,x_2,\cdots,x_n) = (0,x_1,\cdots,x_{n-1})$ and $T_2(x_1,x_2,\cdots,x_n) = (x_2,\cdots,x_n, 0)$. Then the set (for $T_1$) is $\{v, T_1(v),\cdots, T_1^{n-1}(v)\} = \{(x_1,x_2,\cdots,x_n), (0,x_1,\cdots,x_{n-1}),\cdots, (0,0,\cdots,x_1)\}$, and $T_1^n(v)=(0,0,\cdots,0)$ (similarly for $T_2$).

I can see that this set is linearly independent in the case of $T_1$. But how can we prove it in general? I think it's similar to the proof of independence for the columns of an upper/lower triangular matrix here, but I don't understand that proof.

The matrix is the shift matrix with ones only on the superdiagonal or subdiagonal, and zeroes elsewhere.
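The shift-matrix intuition can be checked numerically. Below is a small NumPy sketch (the values of $n$ and $v$ are arbitrary choices for illustration; any $v$ with $x_1 \ne 0$ works, since $T_1^{n-1}(v) = (0,\dots,0,x_1)$):

```python
import numpy as np

n = 4

# T_1 from above: ones on the subdiagonal, zeros elsewhere,
# so T @ (x_1, ..., x_n) = (0, x_1, ..., x_{n-1}).
T = np.diag(np.ones(n - 1), k=-1)

# Any v with first coordinate nonzero satisfies T^{n-1} v != 0.
v = np.array([1.0, 2.0, 3.0, 4.0])

# Columns of M are v, T v, T^2 v, ..., T^{n-1} v.
M = np.column_stack([np.linalg.matrix_power(T, k) @ v for k in range(n)])

print(np.linalg.matrix_power(T, n - 1) @ v)  # [0. 0. 0. 1.] -- nonzero
print(np.linalg.matrix_power(T, n) @ v)      # [0. 0. 0. 0.]
print(np.linalg.matrix_rank(M))              # 4 -- the vectors are independent
```

Full rank of $M$ is exactly linear independence of its columns, which is what the question asks to prove in general.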


There are 2 best solutions below


First Part: Suppose the set is not linearly independent. Since $\{v\}$ by itself is independent ($v \ne 0$ because $T^{n-1}(v) \ne 0$), there is a least $j$ with $1 \le j \le n-1$ such that $\{v, T(v), \dots, T^{j-1}(v)\}$ is linearly independent but adjoining $T^j(v)$ makes the set linearly dependent.

Then $$T^j(v) = \sum_{m=0}^{j-1} a_m T^m(v)$$

Observe that $T^n(v) = 0$ implies $T^{n+m}(v) = 0$ for every $m \ge 0$, so we can apply $T$ to both sides just enough times that the left-hand side is $0$ and all but one term on the right-hand side is $0$. Applying $T$ to both sides $n-1$ times (the left-hand exponent $j+n-1 \ge n$ since $j \ge 1$, and on the right every term with $m \ge 1$ has exponent $m+n-1 \ge n$) gives $$0 = T^{j+n-1}(v) = a_0T^{n-1}(v).$$

Since $T^{n-1}(v) \neq 0$, this forces $a_0 = 0$. Now apply $T$ only $n-2$ times: with the $a_0$ term gone, the only surviving term on the right is $a_1 T^{n-1}(v)$, so $a_1 = 0$, and so on. In general, applying $T$ exactly $n-1-k$ times (valid because $k \le j-1$, so the left-hand exponent $j+n-1-k \ge n$) shows $a_k = 0$ for every $k$. But then $T^j(v) = 0$, and since $j \le n-1$ this yields $T^{n-1}(v) = T^{n-1-j}(T^j(v)) = 0$, contradicting the hypothesis.

Second part: The matrix is quite clear: it is a matrix of $1$'s and $0$'s. I think you can take care of it.
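The argument is not limited to the shift matrix. As a sanity check (not part of the proof), one can manufacture an arbitrary $T$ satisfying the hypothesis by conjugating the shift matrix by a random invertible $P$ (a construction chosen here purely to build a test case) and verify that the iterates of $v$ still have full rank:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Manufacture a T satisfying the hypothesis: conjugate the shift matrix S
# by a random invertible P. The proof applies to any such T.
S = np.diag(np.ones(n - 1), k=-1)
P = rng.standard_normal((n, n))
T = P @ S @ np.linalg.inv(P)

# With v = P e_1 we get T^k v = P S^k e_1, so T^{n-1} v != 0 and T^n v = 0.
v = P[:, 0]

M = np.column_stack([np.linalg.matrix_power(T, k) @ v for k in range(n)])
print(np.linalg.matrix_rank(M))  # 5: the iterates form a basis of R^5
```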


Suppose the set $\{T^j(v), \; 0 \le j \le n - 1\}$ were linearly dependent. Then there exist $a_i \in \Bbb F$, the base field of $V$, not all zero, with

$\displaystyle \sum_{i=0}^{n-1} a_i T^i(v) = 0; \tag 1$

let $j$ be the least index such that $a_j \ne 0$; if we operate on (1) with $T^{n - j - 1}$, we find

$\displaystyle a_j T^{n - 1}(v) = \sum_{i=0}^{n-1} a_i T^{n + i -j - 1}(v) = 0; \tag 2$

since, for $i > j$,

$T^{n + i - j - 1}(v) = T^{i - j - 1} T^n(v) = 0, \tag 3$

whereas $a_i = 0$ for $i < j$. Together these facts imply (2). Now $a_j \ne 0$ yields

$T^{n - 1}(v) = 0, \tag 4$

in contradiction to our hypothesis; it follows that $\{ T^i(v) \}$ is a linearly independent set and thus a basis of $V$.

As for the matrix of $T$ in the basis $\{T^i(v) \}$, which we also denote by $T$, if we set

$v = e_1 = (1, 0, 0, \ldots, 0)^T, \tag 5$

$T(v) = e_2 = (0, 1, 0, \ldots, 0)^T, \tag 6$

$T^2(v) = e_3 = (0, 0, 1, \ldots, 0)^T, \tag 7$

etc., then it is easy to see that the entries $T_{ij}$ of $T$ are

$T_{1j} = 0, \; 1 \le j \le n; \tag 8$

$T_{i, i - 1} = 1, \; 2 \le i \le n; \tag 9$

$T_{ij} = 0, \; j \ne i - 1; \tag{10}$

the matrix described by (8)-(10) has $1$ along the principal subdiagonal and $0$ elsewhere. It is easy to see by direct matrix-vector multiplication that

$T(e_i) = T^i(e_1) = e_{i + 1}, \; 1 \le i \le n - 1, \tag{11}$

and

$T(e_n) = T^n(e_1) = 0, \tag{12}$

in accord with the pattern set by the action of $T$ on $v \in V$.
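The change of basis can be made concrete: if $B$ is the matrix whose columns are $v, T(v), \dots, T^{n-1}(v)$, then $B^{-1}TB$ is exactly the subdiagonal matrix described by (8)-(10). A small NumPy check, where the conjugated-shift $T$ is a manufactured example (any $T$ satisfying the hypothesis would do):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

S = np.diag(np.ones(n - 1), k=-1)  # ones on the principal subdiagonal
P = rng.standard_normal((n, n))
T = P @ S @ np.linalg.inv(P)       # manufactured T with T^{n-1} v != 0, T^n v = 0
v = P[:, 0]                        # v = P e_1, so T^k v = P e_{k+1}

# Change-of-basis matrix: columns are v, T v, ..., T^{n-1} v.
B = np.column_stack([np.linalg.matrix_power(T, k) @ v for k in range(n)])

# The matrix of T in this basis is B^{-1} T B, which recovers S.
print(np.allclose(np.linalg.inv(B) @ T @ B, S))  # True
```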