Let $T:V\rightarrow V$ be a linear transformation such that $T^2=2T$.
Prove the following statements:
- If $B=(b_1,...,b_k)$ is a basis of $\ker T$ and $C=(c_1,...,c_{n-k})$ is a basis of $\operatorname{Im} T$, then $(b_1,...,b_k,c_1,...,c_{n-k})$ is a basis of $V$. (I assumed $\dim V=n$.)
- Show that there exists a basis $A$ of $V$ such that the representing matrix $[T]_A$ of $T$ is diagonal, with all diagonal entries in $\{0,2\}$.
Hi all. I've only just started my Linear Algebra course and we have yet to learn about diagonalizing matrices (so I shouldn't use that method for this problem). For 1, I assumed $(b_1,...,b_k,c_1,...,c_{n-k})$ is a linearly dependent set, and therefore (WLOG) there exists $\beta_j\neq 0$ such that $-\sum_{i=1}^{k}\frac{\alpha_i}{\beta_j}\vec b_i-\sum_{i\neq j}^{n-k}\frac {\beta_i}{\beta_j}\vec c_i = \vec c_j$. Applying $T$ (and using $T\vec b_i = \vec 0$), I've reached the point where I show that $T(\vec c_j) = -\sum_{i\neq j}^{n-k}\frac {\beta_i}{\beta_j}T(\vec c_i)$
How do I continue from here? I'd appreciate some help (and also with question 2).
thanks in advance :)
Indeed, we need to show that the set $B \cup C$ is linearly independent. It seems that you wanted to prove this by contradiction (i.e. starting by assuming that $B \cup C$ is linearly dependent), but I don't like this approach.
A nicer approach here is to simply show that if $x \in \ker T$ and $y \in \operatorname{Im}(T)$ are such that $x + y = 0$, then it must be that $x = y = 0$ (to put it another way: we'd like to show that $\ker T \cap \operatorname{Im}(T) = \{0\}$). With this shown, we can conclude that if $$ \sum \beta_i b_i + \sum \gamma_i c_i = 0 $$ then it must be the case that $\sum \beta_i b_i = 0$ and $\sum \gamma_i c_i = 0$, which (by the linear independence of the individual bases) implies that all $\beta_i$ and $\gamma_i$ are $0$. So,
Proof of claim: Let $v$ be such that $y = Tv$. Note that $x + y = 0$. Thus, $T(x + y) = T(0) = 0$. So, we have $$ 0 = T(x + y) = Tx + Ty = 0 + T(Tv) = T^2 v = 2Tv = 2y $$ So, $2y = 0$, which means $y = 0$. Since $x + y = 0$, conclude that $x = 0$ as well. $\square$
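As a quick sanity check (not part of the proof), here is a numerical sketch in Python. The matrix $T=\begin{pmatrix}1&1\\1&1\end{pmatrix}$ is just an illustrative choice satisfying $T^2=2T$, not something given in the problem; the script confirms the key identity used above, namely that $T$ acts as multiplication by $2$ on its image:

```python
import numpy as np

# Illustrative example with T^2 = 2T (my own choice, not from the problem).
T = np.array([[1.0, 1.0],
              [1.0, 1.0]])

# Verify the hypothesis T^2 = 2T.
assert np.allclose(T @ T, 2 * T)

# For any v, y = Tv lies in Im(T), and Ty = T^2 v = 2Tv = 2y:
# T acts as multiplication by 2 on its image.
rng = np.random.default_rng(0)
v = rng.standard_normal(2)
y = T @ v
assert np.allclose(T @ y, 2 * y)
```

So if such a $y$ also lay in $\ker T$, the identity $Ty = 2y$ would force $2y = 0$, exactly as in the proof.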
Next, we need to show that every vector $v \in V$ can be written in the form $v = x + y$ where $x \in \ker(T)$ and $y \in \operatorname{Im}(T)$. There are several approaches that work here, but two that I like: one is to note that $\dim\ker(T) + \dim\operatorname{Im}(T) = n$ by the rank–nullity theorem, so since $\ker(T) \cap \operatorname{Im}(T) = \{0\}$, the sum $\ker(T) + \operatorname{Im}(T)$ is necessarily an $n$-dimensional subspace of $V$, i.e. all of $V$. The second approach is to note that every $v \in V$ can be decomposed as $$ v = \underbrace{\left(v - \frac 12 Tv\right)}_x + \underbrace{\frac 12 Tv}_{y} $$ where the first term is indeed in $\ker(T)$ because $T\left(v - \frac 12 Tv\right) = Tv - \frac 12 T^2 v = Tv - Tv = 0$, and the second is in $\operatorname{Im}(T)$ by construction.
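The second decomposition can be illustrated concretely. Again, the matrix below is just an example I picked with $T^2 = 2T$, not data from the problem:

```python
import numpy as np

# Illustrative example with T^2 = 2T (not from the problem statement).
T = np.array([[1.0, 1.0],
              [1.0, 1.0]])
assert np.allclose(T @ T, 2 * T)

rng = np.random.default_rng(1)
v = rng.standard_normal(2)

x = v - 0.5 * (T @ v)   # claimed kernel component
y = 0.5 * (T @ v)       # y = T(v/2), so y is in Im(T) by construction

assert np.allclose(T @ x, 0)   # x really lies in ker(T)
assert np.allclose(x + y, v)   # the two pieces sum back to v
```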
Part 2 is easy if you understand what it means to write the matrix of a transformation with respect to a particular basis. In particular, you should argue that the matrix of $T$ with respect to $A = (b_1,...,b_k,c_1,...,c_{n-k})$ is of the desired diagonal form: $Tb_i = 0$ for each basis vector of the kernel, and $Tc_i = 2c_i$ for each basis vector of the image (as the computation in the proof above shows, $T$ acts as multiplication by $2$ on $\operatorname{Im}(T)$).
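For a numerical sketch of part 2, take the same illustrative matrix $T=\begin{pmatrix}1&1\\1&1\end{pmatrix}$ (my own example, not from the problem). Here $\ker T$ is spanned by $(1,-1)$ and $\operatorname{Im} T$ by $(1,1)$, and changing to the basis $A=\big((1,-1),(1,1)\big)$ produces $\operatorname{diag}(0,2)$:

```python
import numpy as np

T = np.array([[1.0, 1.0],
              [1.0, 1.0]])        # satisfies T^2 = 2T (illustrative example)

b = np.array([1.0, -1.0])         # basis of ker(T):  T b = 0
c = np.array([1.0,  1.0])         # basis of Im(T):   T c = 2c

A = np.column_stack([b, c])       # change-of-basis matrix (kernel vectors first)
T_A = np.linalg.inv(A) @ T @ A    # matrix of T in the basis A

assert np.allclose(T_A, np.diag([0.0, 2.0]))
```

The kernel basis vectors contribute the $0$ entries and the image basis vectors contribute the $2$ entries, exactly as the argument above predicts.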