How to show linear independence of three elements connected by a linear transformation.


Let $V$ be a three-dimensional vector space over the field $\mathbb{Q}$, and let $T: V \to V$ be a linear transformation with $T(x)=y,$ $T(y)=z,$ $T(z)=x+y$ for some $x,y,z \in V$ with $x \neq 0.$ Prove that $x,y$ and $z$ are linearly independent.

What I tried: consider a homogeneous linear relation $ax+by+cz=0$ with $a,b,c \in \mathbb{Q}.$ Since $x\neq0,$ we also have $z-x \neq 0.$ Substituting $y = T(x)$ and $z = T^2(x)$ into the relation gives $(aI + bT +cT^2)(x)=0.$ After that I cannot conclude anything. I need some help.
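Not part of a proof, but as a sanity check, such a $T$ does exist and the claim holds for it. Here is a quick sympy experiment; the companion-style matrix below is one hypothetical realization with $x = e_1$, not something dictated by the problem:

```python
from sympy import Matrix

# One concrete realization of T on Q^3 (an assumed example, not forced by
# the problem): columns are the images of the standard basis vectors, chosen
# so that with x = e1 we get T(x) = y = e2, T(y) = z = e3, T(z) = x + y.
T = Matrix([[0, 0, 1],
            [1, 0, 1],
            [0, 1, 0]])

x = Matrix([1, 0, 0])
y = T * x
z = T * y

assert T * z == x + y                 # the defining relation T(z) = x + y
M = Matrix.hstack(x, y, z)            # matrix with columns x, y, z
assert M.det() != 0                   # so x, y, z are linearly independent here
```

This only confirms the statement in one example, of course; the proof has to rule out every possible dependence.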

Best answer:

Note: I'm not sure I ever use that $V$ is 3-dimensional below. This makes me concerned; be warned. But I'm pretty convinced the argument is correct.

First let's show that $x$ and $y$ are linearly independent. Proceed by contradiction. Suppose instead that $x$ and $y$ are linearly dependent; since $x$ is not zero, this means $y=ax$ for some $a \in \mathbb{Q}$.

Now $z = T(y) = T(ax) = aT(x) = a^2x$, and $x+y = T(z) = T(a^2x) = a^3x$. So $a^3x = x+y = x+ax \Rightarrow (a^3-a-1)x = \mathbf{0}$. Since $x$ is not $\mathbf{0}$, we must have $a^3 - a - 1 = 0$; but this is impossible for $a \in \mathbb{Q}$: by the rational root theorem the only candidates are $\pm 1$, and neither is a root.
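The rational-root check at the end can be made explicit; a short verification sketch in Python:

```python
from fractions import Fraction

# Rational root theorem: a^3 - a - 1 is monic with integer coefficients and
# constant term -1, so its only possible rational roots are +1 and -1.
p = lambda t: t**3 - t - 1
assert all(p(t) != 0 for t in (Fraction(1), Fraction(-1)))  # neither is a root
```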

Ok so we've shown that $x$ and $y$ must be linearly independent.

Let's now show that $x$, $y$, and $z$ are linearly independent. Proceed by contradiction again. Suppose instead that $x$, $y$, and $z$ are linearly dependent. Since we already know $x$ and $y$ are linearly independent, any dependence relation must involve $z$ with a nonzero coefficient, so we can solve for $z$: that is, $z = ax + by$ for some $a,b \in \mathbb{Q}$.

Now $$x+y = T(z) = T(ax + by) = aT(x) + bT(y) = ay + bz = ay + b(ax+by) = (ab)x + (a+b^2)y.$$ Rearranging gives $(ab-1)x + (a+b^2-1)y = \mathbf{0}$. Since $x$ and $y$ are linearly independent, we have then that $ab-1 = 0$ and $a+b^2-1 = 0$.
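This rearrangement can be double-checked symbolically, treating $x$ and $y$ as formal symbols (a verification sketch, not part of the argument):

```python
from sympy import symbols, expand

a, b, x, y = symbols('a b x y')

# T(z) computed two ways: directly it is x + y; via linearity it is
# a*y + b*z, with z = a*x + b*y substituted in.
rhs = expand(a*y + b*(a*x + b*y))
diff = expand(rhs - (x + y))   # should equal (a*b - 1)*x + (a + b**2 - 1)*y

assert diff.coeff(x, 1) == a*b - 1
assert diff.coeff(y, 1) == a + b**2 - 1
```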

Now $ab-1 = 0 \Rightarrow ab = 1 \Rightarrow b = 1/a$ and $a \neq 0$ and $b \neq 0$.

Plugging this into the second equation, we have $a + (1/a)^2 - 1 = 0 \Rightarrow a^3 - a^2 + 1 = 0$. And again, this is impossible for $a \in \mathbb{Q}$ (by the rational root theorem).
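As before, the rational root theorem leaves only $\pm 1$ as candidates, and a quick check rules both out:

```python
from fractions import Fraction

# a^3 - a^2 + 1 is monic with integer coefficients and constant term +1,
# so its only possible rational roots are +1 and -1; neither works, hence
# a^3 - a^2 + 1 = 0 has no rational solution.
q = lambda t: t**3 - t**2 + 1
assert q(Fraction(1)) == 1 and q(Fraction(-1)) == -1
```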