Show that if $\{Av_1,\ldots,Av_n\}$ is linearly independent then $A$ is non-singular


I want to prove the following.

Let $A$ be $n \times n$ and $\mathbf{v}_1,\ldots,\mathbf{v}_n \in \Bbb R^n$. Show that, if $\{A\mathbf{v}_1,\ldots,A\mathbf{v}_n\}$ is linearly independent, then $A$ is non-singular.


I tried a direct proof, but I am struggling with one step:

Let $\{A\mathbf{v}_1,\ldots,A\mathbf{v}_n\}$ be linearly independent. Then $c_1A\mathbf{v}_1 + \ldots + c_nA\mathbf{v}_n = \mathbf{0}$ only if all $c_i=0$. By linearity, $A(c_1\mathbf{v}_1 + \ldots + c_n\mathbf{v}_n) = \mathbf{0}$ only if all $c_i=0$. [How do I conclude that $A\mathbf{x}=\mathbf{0}$ only for $\mathbf{x}=\mathbf{0}$?] Since $A\mathbf{x}=\mathbf{0}$ only for $\mathbf{x}=\mathbf{0}$, we have $\mathrm{rank}(A)=n$, and therefore $A$ is non-singular.

I also tried to prove the contrapositive, but I get stuck there too:

Let $A$ be singular. Then $\mathrm{rank}(A)<n$, so $A\mathbf{x}=\mathbf{0}$ has a non-trivial solution $\mathbf{x} \neq \mathbf{0}$. Suppose $c_1A\mathbf{v}_1 + \ldots + c_nA\mathbf{v}_n = \mathbf{0}$. [How do I know that one of the $\mathbf{v}_i$ is a non-trivial solution?] Since $\mathbf{v}_i$ is a non-trivial solution of $A\mathbf{x}=\mathbf{0}$, we can choose $c_i \neq 0$ and all other $c_j=0$, so we have found a non-trivial linear combination with $c_1A\mathbf{v}_1 + \ldots + c_nA\mathbf{v}_n = \mathbf{0}$. Therefore $\{A\mathbf{v}_1,\ldots,A\mathbf{v}_n\}$ is linearly dependent.

Can somebody give me a hint?

4 Answers

BEST ANSWER

Since $\{Av_1,\ldots,Av_n\}$ is linearly independent, $\{v_1,\ldots,v_n\}$ is linearly independent. Suppose otherwise: let $a_1,\ldots,a_n\in\mathbb{R}$, not all zero, be such that $a_1v_1+\ldots+a_nv_n=0$. Then by linearity of $A$ one gets $$0=A(0)=A(a_1v_1+\ldots+a_nv_n)=a_1Av_1+\ldots+a_nAv_n,$$ a contradiction.

Having established the linear independence of $\{v_1,\ldots,v_n\}$, we claim that this set spans $\mathbb{R}^n$. This is immediate, since there are $n$ such vectors: one can easily construct an isomorphism $\varphi:\mathbb{R}^n\to\mathbb{R}^n$ with $\varphi(e_k):=v_k$ for $k=1,\ldots,n$, where $\{e_k\}$ is the usual basis of $\mathbb{R}^n$.

Now suppose $A$ is singular. Then there is an $x\in\mathbb{R}^n$ with $x\neq0$ such that $Ax=0$. Since the $v_k$ form a basis, there are constants $\{b_k\}$, not all zero, with $x=b_1v_1+\ldots+b_nv_n$. This implies $$0=Ax=A(b_1v_1+\ldots+b_nv_n)=b_1Av_1+\ldots+b_nAv_n,$$ contradicting the linear independence of $\{Av_1,\ldots,Av_n\}$.
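As a quick numerical sanity check of this argument (not a substitute for the proof), the sketch below uses NumPy to verify the implication on a concrete example: when the columns $Av_1,\ldots,Av_n$ of $AV$ are independent, $A$ must have full rank, and a singular $A$ can never produce $n$ independent images. The particular matrices chosen here are illustrative assumptions, not from the original post.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# An invertible A and a basis V of R^n (columns v_1, ..., v_n).
A = rng.standard_normal((n, n)) + n * np.eye(n)  # large diagonal, invertible here
V = np.eye(n)                                    # v_i = e_i, the standard basis

# The columns of A @ V are exactly A v_1, ..., A v_n.
assert np.linalg.matrix_rank(A @ V) == n   # {A v_i} linearly independent ...
assert np.linalg.matrix_rank(A) == n       # ... forces A to be non-singular

# Contrapositive: a singular matrix S never yields n independent vectors S v_i.
S = A.copy()
S[:, -1] = S[:, 0]                         # duplicate a column -> S is singular
assert np.linalg.matrix_rank(S @ V) < n
```

The check exploits the fact that $\mathrm{rank}(AV) \le \min(\mathrm{rank}(A), \mathrm{rank}(V))$, which is the quantitative core of the answer above.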

ANSWER

I assume that the $ \mathbf{v}_i $'s form a basis of $ \mathbb{R}^n $. If so, let $ \mathbf{x} \in \mathbb{R}^n $ be such that $ A\mathbf{x} = 0 $. Then, since the $ \mathbf{v}_i $'s form a basis, there exist $ \lambda_i \in \mathbb{R} $ such that $ \mathbf{x} = \sum_{i=1}^{n}\lambda_i \mathbf{v}_i $. Then by linearity: $$ 0 = A\mathbf{x} = \lambda_1A\mathbf{v}_1 + \ldots + \lambda_nA\mathbf{v}_n. $$ Now use the linear independence of the $ A\mathbf{v}_i $'s to conclude that all $ \lambda_i = 0 $, and hence $ \mathbf{x} = 0 $.

Hope this helps.

Edit: As pointed out, you can in fact show that the $ \mathbf{v}_i $'s form a basis of $ \mathbb{R}^n $: since the $ A\mathbf{v}_i $'s are a list of $ n $ linearly independent vectors, the $ \mathbf{v}_i $'s are also a list of $ n $ linearly independent vectors, and hence form a basis of $ \mathbb{R}^n $ (since $ \dim\mathbb{R}^n = n $).
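The key step in the edit, that dependence among the $\mathbf{v}_i$'s would transfer to the $A\mathbf{v}_i$'s, can be illustrated numerically. In this hypothetical example (matrices chosen only for demonstration), the third column of $V$ is the sum of the first two, and the images inherit exactly the same relation, so they cannot be independent.

```python
import numpy as np

n = 3
A = np.arange(1.0, n * n + 1).reshape(n, n)  # any n x n matrix works here

# Columns of V are linearly DEPENDENT: v_3 = v_1 + v_2.
V = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0]])

# The images inherit the same dependency: A v_3 = A v_1 + A v_2,
# so {A v_1, A v_2, A v_3} can never be linearly independent.
AV = A @ V
assert np.allclose(AV[:, 2], AV[:, 0] + AV[:, 1])
assert np.linalg.matrix_rank(AV) < n
```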

ANSWER

Depending on how much you know, there are some quick proofs. If you know the invertible matrix theorem and basis theorem, then you can get this directly.

Since $\{A\mathbf{v}_1,\dots,A\mathbf{v}_n\}$ is a set of $n$ linearly independent vectors in $\mathbb{R}^n$, it forms a basis. Therefore the image of $A$ contains a basis of $\mathbb{R}^n$, so the columns of $A$ span $\mathbb{R}^n$. This is one of the equivalent statements in the invertible matrix theorem, so $A$ is invertible. Finally, being invertible is the same as being non-singular.

ANSWER

By the rank-nullity theorem: the image of $A$ contains the $n$ linearly independent vectors $A\mathbf{v}_1,\ldots,A\mathbf{v}_n$, so $\mathrm{rank}(A) = \dim(\operatorname{im} A) = n$, and hence $\dim\ker A = n - \mathrm{rank}(A) = 0$.
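The rank-nullity bookkeeping can be checked numerically as well. In this sketch (with generic random matrices chosen only for illustration, not from the answer), the independence of the $A\mathbf{v}_i$'s forces $\mathrm{rank}(A) = n$, so the kernel dimension $n - \mathrm{rank}(A)$ is zero and $A\mathbf{x} = \mathbf{0}$ has only the trivial solution.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = rng.standard_normal((n, n))   # generic matrix: full rank for this seed
V = rng.standard_normal((n, n))   # generic v_i: a basis for this seed

# {A v_i} independent -> the image of A has dimension n.
assert np.linalg.matrix_rank(A @ V) == n
# Rank-nullity: dim ker A = n - rank(A) = 0.
assert n - np.linalg.matrix_rank(A) == 0
# Hence A x = 0 only for x = 0.
x = np.linalg.solve(A, np.zeros(n))
assert np.allclose(x, np.zeros(n))
```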