If $\{v_1, \ldots , v_k\}$ is independent, then $\{Av_1, \ldots , Av_k\}$ is independent


So I have to prove the following:

Given an invertible $n\times n$ matrix $A$ and vectors $v_1, \ldots ,v_k$. If $\{v_1, \ldots , v_k\}$ is independent, then $\{Av_1, \ldots , Av_k\}$ is independent too.

Can anyone help me? I wanted to use the definition of linear independence: $x_1v_1 + \cdots + x_kv_k = 0$ if and only if every $x_i=0$. But I'm not sure how to proceed from there.



$x_1Av_1 + \cdots + x_kAv_k = 0$ implies $A(x_1v_1+\cdots+x_kv_k) = 0$ which implies $A^{-1}A(x_1v_1+\cdots+x_kv_k) = 0$ which implies $x_1v_1+\cdots+x_kv_k=0$ which implies $x_1=\cdots=x_k=0$.
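This chain of implications can be sanity-checked numerically. A minimal sketch with NumPy, using an arbitrary invertible matrix and two independent vectors chosen purely for illustration (not taken from the question):

```python
import numpy as np

rng = np.random.default_rng(0)

# An invertible 3x3 matrix A: a random Gaussian matrix is invertible with
# probability 1, but we verify via the determinant to be safe.
A = rng.standard_normal((3, 3))
assert abs(np.linalg.det(A)) > 1e-9

# Two linearly independent vectors v1, v2 in R^3.
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])

# If x1*Av1 + x2*Av2 = 0 then A(x1 v1 + x2 v2) = 0, and applying A^{-1}
# gives x1 v1 + x2 v2 = 0, forcing x1 = x2 = 0.  Equivalently, the matrix
# with columns Av1, Av2 must have full column rank:
AV = np.column_stack([A @ v1, A @ v2])
assert np.linalg.matrix_rank(AV) == 2  # rank equals the number of vectors
```

Recovering $x_1v_1 + x_2v_2$ from the combination of the $Av_i$ is exactly the $A^{-1}$ step of the argument: `np.linalg.inv(A) @ AV` reproduces the original columns `v1, v2`.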


Another way to put it, using rank properties of matrices:

Form the $n\times k$ matrix $V$ with the vectors $v_i$ as columns. Now $\{v_1,\ldots,v_k\}$ being linearly independent translates to $\operatorname{rank}(V)=k$, while $\{Av_1,\ldots,Av_k\}$ being linearly dependent would translate to $\operatorname{rank}(AV)<k$. But as $V=(A^{-1}A)V=A^{-1}(AV)$, $$k=\operatorname{rank}(V)\leq\min(\operatorname{rank}(A^{-1}),\operatorname{rank}(AV))\leq\operatorname{rank}(AV)<k,$$ a contradiction.
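The rank identity underlying this argument, $\operatorname{rank}(AV)=\operatorname{rank}(V)$ for invertible $A$, is easy to confirm numerically. A small sketch with NumPy (the dimensions and matrices here are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 4, 2

# V: n x k matrix whose columns are linearly independent vectors v_i.
# Here we take the first k standard basis vectors of R^n as columns.
V = np.eye(n)[:, :k]
assert np.linalg.matrix_rank(V) == k

# A random Gaussian A is invertible with probability 1; check the determinant.
A = rng.standard_normal((n, n))
assert abs(np.linalg.det(A)) > 1e-9

# rank(AV) = rank(V) = k, so the columns A v_i remain independent.
assert np.linalg.matrix_rank(A @ V) == np.linalg.matrix_rank(V) == k
```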


If $A$ is invertible, then multiplication by $A$ is a linear map $K^n\to K^n$, and multiplication by $A^{-1}$ is its inverse, so it is an isomorphism of vector spaces (in other words an automorphism of $K^n$). Properties like being linearly independent are preserved under isomorphisms.

In this case concretely, you might show that a family of scalars $\lambda_1,\ldots,\lambda_k$ defines a nontrivial relation between $v_1,\ldots,v_k$ (in other words, the $\lambda_i$ are not all zero and $\lambda_1 v_1+\cdots+\lambda_kv_k=0$) if and only if it defines a nontrivial relation between $Av_1,\ldots,Av_k$. This is immediate from linearity.


Related, but ultimately all the same.

An $n \times n$ matrix between $n$-dimensional spaces $V$ and $W$ can be realized as a linear map $T:\mathbb R^n \to \mathbb R^n$, and such a map sends each basis element to some linear combination of the basis elements:

$e_i \mapsto \alpha_{i_1} e_1+\cdots+\alpha_{i_n}e_n$, but then take

$\sum_{i} a_i(\alpha_{i_1} e_1+\cdots+\alpha_{i_n}e_n)$ and note that we still have some linear combination of the original basis vectors.


A related idea to Kenny Lau's answer: recall that a linear map is injective if and only if its kernel is trivial.

Then for any linear combination, $0=a_1Ae_1 +\cdots+ a_nAe_n=A(a_1e_1+\cdots+ a_ne_n) \implies a_1 e_1+\cdots+a_ne_n=0$, since $a_1e_1+\cdots+a_ne_n$ is an element of the kernel of $A$, which is trivial because $A$ is invertible; independence of the $e_i$ then forces every $a_i=0$.