$V$ is $T$-cyclic iff each eigenspace of $T$ is one-dimensional.


I am given a diagonalizable linear operator $T$ on a finite-dimensional vector space $V$, and I need to prove that $V$ is $T$-cyclic if and only if each eigenspace of $T$ is one-dimensional.

Could anyone help me?


BEST ANSWER

Since $T$ is diagonalizable, there is a basis of eigenvectors $v_1,\dots,v_n$ with corresponding eigenvalues $\lambda_1,\dots,\lambda_n$.

Let $v=\sum_{i=1}^n a_iv_i$ be a given vector. Then the span of $\{v,Tv,\dots,T^{n-1}v\}$ consists of all vectors $w$ for which there exist coefficients $b_1,\dots,b_n$ such that $$ w = \sum_{j=1}^n b_j T^{j-1}v = \sum_{j=1}^n\sum_{i=1}^na_i b_j T^{j-1}v_i =\sum_{j=1}^n\sum_{i=1}^na_i b_j \lambda_i^{j-1}v_i. $$ The mapping $b\mapsto w$ is a linear mapping between $n$-dimensional spaces, so it is invertible if and only if $w=0$ implies $b=0$. Since the $v_i$ are linearly independent, $$ 0=\sum_{j=1}^n\sum_{i=1}^na_i b_j \lambda_i^{j-1}v_i=\sum_{i=1}^n a_i\Bigl(\sum_{j=1}^n b_j \lambda_i^{j-1}\Bigr)v_i $$ implies $$ a_i\sum_{j=1}^n b_j \lambda_i^{j-1} =0 $$ for all $i$.

Now assume that $V$ is $T$-cyclic with cyclic vector $v$ as above. This implies $a_i\ne0$ for all $i$; otherwise $v_i\not\in \operatorname{span}\{v,Tv,\dots,T^{n-1}v\}$. Cyclicity means the mapping $b\mapsto w$ is surjective, hence invertible, so $b=0$ is the only solution of $\sum_{j=1}^n b_j \lambda_i^{j-1} =0$ for all $i$. This shows that the matrix $$ \pmatrix{1& \lambda_1 & \dots & \lambda_1^{n-1}\\ \vdots & & & \vdots\\ 1& \lambda_n & \dots & \lambda_n^{n-1}} $$ is invertible, which is equivalent (Vandermonde matrix) to the values $\lambda_i$ being distinct. Hence all eigenspaces are one-dimensional.
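The Vandermonde criterion can be checked numerically. Below is a small NumPy sketch (not part of the proof); the diagonal $T$, the specific eigenvalues, and the choice $a_i = 1$ are my own assumptions for illustration:

```python
import numpy as np

# Assumed setup: diagonal T with distinct eigenvalues; the standard basis
# is the eigenbasis, and v is the sum of the eigenvectors (all a_i = 1).
lams = np.array([1.0, 2.0, 3.0])
T = np.diag(lams)
n = len(lams)
v = np.ones(n)

# Krylov matrix [v | Tv | ... | T^{n-1} v]; its (i, j) entry is lam_i^j,
# i.e. exactly the Vandermonde matrix from the argument above.
K = np.column_stack([np.linalg.matrix_power(T, j) @ v for j in range(n)])
print(np.linalg.matrix_rank(K))  # 3: full rank, so v is cyclic
```

Repeating an eigenvalue (e.g. `lams = np.array([1.0, 2.0, 2.0])`) collapses two rows of the Vandermonde matrix and drops the rank below $n$.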

If the eigenspaces are all one-dimensional, i.e. the $\lambda_i$ are distinct, then choose $v=\sum_{i=1}^n v_i$ (so $a_i=1$ for all $i$) and reverse the above argument: the Vandermonde matrix is invertible, so $b\mapsto w$ is invertible and the vectors $v,Tv,\dots,T^{n-1}v$ span $V$.


In the following discussion, we take $V$ to be a vector space over some arbitrary field $\Bbb F$, which need not be the rational, real, or complex numbers.

Let $\dim V = n$. Since $T$ is diagonalizable, it is easy to see, and quite well-known, that $T$ has $n$ eigenvalues (counted with multiplicity), which we denote by $\lambda_1$, $\lambda_2$, $\ldots$, $\lambda_n$, and that corresponding to these eigenvalues are $n$ linearly independent eigenvectors $\vec u_i$, $1 \le i \le n$, with

$T\vec u_i = \lambda_i \vec u_i. \tag{1}$

Suppose to begin that each eigenspace of $T$ is one-dimensional. I claim that this implies the $n$ eigenvalues of $T$ are distinct; for if we had $\lambda_i = \lambda_j$, $i \ne j$, then the span $\langle \vec u_i, \vec u_j \rangle$ of $\vec u_i$ and $\vec u_j$, that is, the subspace generated by $\vec u_i$ and $\vec u_j$, is of dimension $2$ since these two vectors are linearly independent. But $\langle \vec u_i, \vec u_j \rangle$ is in fact an eigenspace of $T$, for

$\vec v = v_i \vec u_i + v_j \vec u_j, \tag{2}$

$v_i, v_j \in \Bbb F$, is an eigenvector of $T$:

$T\vec v = T(v_i \vec u_i + v_j \vec u_j) = v_i T\vec u_i + v_j T\vec u_j = v_i \lambda_i \vec u_i + v_j \lambda_j \vec u_j = \lambda_i (v_i \vec u_i + v_j \vec u_j) = \lambda_i \vec v, \tag{3}$

since $\lambda_i = \lambda_j$. The two-dimensionality of $\langle \vec u_i, \vec u_j \rangle$ contradicts the hypothesis that every eigenspace of $T$ is of dimension one, and this contradiction establishes that the $\lambda_i$ are distinct.
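As a minimal concrete instance of this phenomenon (my own illustration, not part of the argument): take $n = 2$ and $\lambda_1 = \lambda_2 = \lambda$.

```latex
% With T = lambda * I on F^2, every nonzero vector is an eigenvector:
T = \begin{pmatrix} \lambda & 0 \\ 0 & \lambda \end{pmatrix},
\qquad
T \begin{pmatrix} a \\ b \end{pmatrix}
  = \lambda \begin{pmatrix} a \\ b \end{pmatrix}
  \quad \text{for all } a, b \in \Bbb F,
% so the eigenspace of lambda is all of F^2 and has dimension 2 > 1.
```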

Now pick any vector $\vec v \in V$ such that

$\vec v = \sum_1^n v_k \vec u_k \tag{4}$

with $v_k \ne 0$ for $1 \le k \le n$, and let

$p_j(T) = \prod_{i = 1, i \ne j}^n (T - \lambda_i); \tag{5}$

it is relatively easy to see that $p_j(T)$ is a linear combination of the linear operators $I$, $T$, $T^2$, $\ldots$ , $T^{n - 1}$, and that

$p_j(T)\vec v = \prod_{i = 1, i \ne j}^n (T - \lambda_i) \sum_1^n v_k \vec u_k = \sum_1^n v_k \prod_{i = 1, i \ne j}^n (T - \lambda_i) \vec u_k, \tag{6}$

and also that

$p_j(T)\vec u_k = \prod_{i = 1, i \ne j}^n (T - \lambda_i) \vec u_k = 0, k \ne j, \tag{7}$

and

$p_j(T)\vec u_k = \prod_{i = 1, i \ne j}^n (T - \lambda_i) \vec u_k = \prod_{i = 1, i \ne j}^n (\lambda_j - \lambda_i)\vec u_k, k = j; \tag{8}$

from (6), (7), and (8) we conclude

$p_j(T)\vec v = v_j \prod_{i = 1, i \ne j}^n (\lambda_j - \lambda_i)\vec u_j, \tag{9}$

or

$\vec u_j = \dfrac{1}{ v_j \prod_{i = 1, i \ne j}^n (\lambda_j - \lambda_i)}p_j(T) \vec v. \tag{10}$

Formula (10) shows that every vector $\vec u_j$ in the eigenbasis of $T$ may be expressed as a linear combination of $\vec v, T\vec v, T^2\vec v, \ldots, T^{n - 1}\vec v$; since the $\vec u_j$ span $V$, so do the $T^i \vec v$, $0 \le i \le n - 1$, i.e. $\vec v$ is a cyclic vector for $T$ in $V$.
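Formula (10) can be tried out numerically. Here is a NumPy sketch (the diagonal $T$, its eigenvalues, and the particular $\vec v$ are my own assumptions; the eigenbasis is the standard basis):

```python
import numpy as np

# Assumed setup: diagonal T with distinct eigenvalues, eigenbasis = standard
# basis, and a vector v with every coordinate v_k nonzero, as in (4).
lams = np.array([1.0, 2.0, 4.0])
T = np.diag(lams)
n = len(lams)
v = np.array([2.0, -1.0, 3.0])

j = 1  # recover the eigenvector u_j (here e_1, 0-indexed) via formula (10)
p = np.eye(n)
for i in range(n):
    if i != j:
        p = p @ (T - lams[i] * np.eye(n))  # p_j(T) = prod_{i != j} (T - lam_i I)

scale = v[j] * np.prod([lams[j] - lams[i] for i in range(n) if i != j])
u = p @ v / scale
print(u)  # [0. 1. 0.]: the j-th basis eigenvector is recovered
```

Since $p_j(T)$ is built from $I, T, \ldots, T^{n-1}$, the recovered `u` is by construction a linear combination of the Krylov vectors $\vec v, T\vec v, \ldots, T^{n-1}\vec v$.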

Going the other way, suppose that $T$ has an eigenspace of dimension greater than one; to be specific, suppose that there is an eigenvalue $\lambda_1$ whose eigenspace $W_1$ is of dimension $m_1$, where $2 \le m_1 \le n$. We observe that $T - \lambda_1$ annihilates this entire eigenspace; that is,

$(T - \lambda_1) \vec w = 0 \tag{11}$

for $\vec w \in W_1$. Likewise, if $\lambda_2, \lambda_3, \ldots, \lambda_l$ are the remaining eigenvalues of $T$ with eigenspaces $W_2, W_3, \ldots, W_l$, where now $l < n$ since $\dim W_1 = m_1 \ge 2$, we have

$(T - \lambda_k) \vec w = 0 \tag{12}$

for $\vec w \in W_k$. Note that these assertions easily and immediately follow from the diagonal representation of $T$, which we know exists by hypothesis. Proceeding further in this direction, we conclude that, setting

$q(T) = \prod_1^l (T - \lambda_i), \tag{13}$

we have

$q(T) \vec v = \prod_1^l (T - \lambda_i) \vec v = 0 \tag{14}$

for every $\vec v \in V$, since every such $\vec v$ may be written

$\vec v = \sum_1^l \vec w_i \tag{15}$

with $\vec w_i \in W_i$. However, the degree in $T$ of the polynomial $q(T)$ is at most $n - 1$, since $\dim W_1 = m_1 \ge 2$ forces $l \le n - 1$. This in turn implies that for any $\vec v \in V$ at most $n - 1$ of the vectors $T^i \vec v$, $i \ge 0$, may be linearly independent, since $T^{n - 1} \vec v$ (and hence $T^j \vec v$ for $j \ge n$) is expressible as a linear combination of the $T^i \vec v$ with $0 \le i \le n - 2$. Thus for any $\vec v \in V$ the span of the set $\{T^i \vec v : i \ge 0\}$ is of dimension at most $n - 1$, and hence cannot be all of $V$; no $\vec v$ can be cyclic for $T$. This proves that the existence of a cyclic vector implies every eigenspace of $T$ is one-dimensional.
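This direction can also be probed numerically. A NumPy sketch (the diagonal $T$ with a repeated eigenvalue is my own assumed example):

```python
import numpy as np

# Assumed setup: diagonal T with a repeated eigenvalue, so the eigenspace
# of 2 is two-dimensional and, per the argument above, no v can be cyclic.
T = np.diag([2.0, 2.0, 5.0])
n = 3
rng = np.random.default_rng(0)

# Here q(T) = (T - 2I)(T - 5I) = 0, so T^2 v already depends on v and Tv;
# the Krylov matrix [v | Tv | T^2 v] can never reach full rank 3.
for _ in range(100):
    v = rng.standard_normal(n)
    K = np.column_stack([np.linalg.matrix_power(T, j) @ v for j in range(n)])
    assert np.linalg.matrix_rank(K) <= 2
print("no cyclic vector found in 100 random trials")
```

A random search of course proves nothing by itself, but it matches the degree argument: the minimal polynomial has degree $l \le n - 1$, so the Krylov rank is capped at $l$ for every starting vector.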

We note in closing that equation (4) in fact provides a recipe for the construction of a cyclic vector in the event the eigenspaces of $T$ are one-dimensional.