Proof Verification: If $w_1,...,w_k\in V$ are linearly independent, then so are $[w_1]_B,...,[w_k]_B$


Question:

Let $B=\{b_1,...,b_n\}$ be a basis of the vector space $V$. Prove that if $w_1,...,w_k\in V$ are linearly independent, then so are $[w_1]_B,...,[w_k]_B$.

Preliminaries:

  1. According to the definition of linear dependence, a group of $n$ vectors $v_1,...,v_n$ is linearly dependent if $\exists \alpha_1,...,\alpha_n$ such that $\alpha_1v_1+...+\alpha_nv_n=0$.

  2. According to the definition of coordinate spaces,
    $[w]_B=\begin{pmatrix} c_1\\ \vdots \\ c_n \end{pmatrix}$ such that $w=c_1b_1+...+c_nb_n$.

My Proof:

Proof by contradiction:

  1. If $w_1,...,w_k\in V$ are linearly independent: $\forall\alpha_1,...,\alpha_k:\alpha_1(c_{1,1}b_1+...+c_{1,n}b_n)+...+\alpha_k(c_{k,1}b_1+...+c_{k,n}b_n)\ne0$
    Moved around a bit to form: $\forall\alpha_1,...,\alpha_k:b_1(\alpha_1c_{1,1}+...+\alpha_kc_{k,1})+...+b_n(\alpha_1c_{1,n}+...+\alpha_kc_{k,n})\ne0$

  2. Let $[w_1]_B,...,[w_k]_B$ be linearly dependent: $\exists\beta_1,...,\beta_k:\beta_1\begin{pmatrix} c_{1,1}\\ \vdots \\ c_{1,n} \end{pmatrix}+...+\beta_k\begin{pmatrix} c_{k,1}\\ \vdots \\ c_{k,n} \end{pmatrix}=\begin{pmatrix} 0\\ \vdots \\ 0 \end{pmatrix}$
    Which tells us:
    $\begin{cases}\beta_1c_{1,1}+...+\beta_kc_{k,1}=0\\\quad\vdots\\\beta_1c_{1,n}+...+\beta_kc_{k,n}=0\end{cases}$

  3. Let $\forall1\le i\le k:\alpha_i=\beta_i$, causing (1) to become $b_1(0)+...+b_n(0)=0\ne0$ (the '$...$' terms can be ignored in this case), which is a contradiction. $\blacksquare$



On BEST ANSWER

There are a few issues in your proof. One is that you are very cavalier about when things are zero or not. For instance, in $(1)$ you should say that at least one of the $\alpha_i$ is not zero. Similarly, at least one of the $\beta_i$ must be non-zero.

One glaring problem is your final line when you say "If $\alpha_i = \frac{\beta_i}{b_i}$". This is a huge problem because $\alpha_i$ and $\beta_i$ are scalars and $b_i$ is a vector!


I will present a very straightforward proof that avoids all those pesky subscripts. Let $\tilde{B}$ denote the matrix whose columns are the $b_i$. You can reinterpret the second statement in your preliminaries as $$ w = \tilde{B}[w]_B.$$
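To make the identity $w = \tilde{B}[w]_B$ concrete, here is a small numerical sketch (the basis and vector are invented for illustration, assuming $V=\mathbb{R}^2$): the coordinate vector $[w]_B$ is found by solving the linear system $\tilde{B}c = w$.

```python
import numpy as np

# Illustrative sketch: V = R^2 with an assumed basis B = {b1, b2}.
# B_tilde has the basis vectors as its columns, so the coordinate
# vector [w]_B is the solution c of B_tilde @ c = w.
b1 = np.array([1.0, 1.0])
b2 = np.array([1.0, -1.0])
B_tilde = np.column_stack([b1, b2])

w = np.array([3.0, 1.0])
coords = np.linalg.solve(B_tilde, w)  # coords == [w]_B

# Check the defining property w = c1*b1 + c2*b2.
assert np.allclose(coords[0] * b1 + coords[1] * b2, w)
print(coords)  # -> [2. 1.]
```

Solving the system works because the columns of $\tilde{B}$ form a basis, so $\tilde{B}$ is invertible and the coordinates are unique.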

Suppose we have scalars $\gamma_1,\dots,\gamma_k$ such that $\gamma_1[w_1]_B + \dots + \gamma_k[w_k]_B = 0$. Then:

\begin{align*} 0 &= \tilde{B}0 \\ &=\tilde{B}\left(\gamma_1[w_1]_B + \dots + \gamma_k[w_k]_B\right)\\ &=\gamma_1(\tilde{B}[w_1]_B) + \dots + \gamma_k(\tilde{B}[w_k]_B) \\ &=\gamma_1 w_1 + \dots + \gamma_k w_k. \end{align*}

Because $w_1,\dots,w_k$ are linearly independent, we must have $\gamma_1 = \dots = \gamma_k = 0$. Therefore $[w_1]_B,\dots,[w_k]_B$ are linearly independent.


As a side note: we didn't even need to use the fact that $\tilde{B}$ is invertible. Any linear transformation maps linear dependencies to linear dependencies, so if there were a linear dependence among the $[w_i]_B$'s, it would map to a linear dependence among the $w_i$'s, which we know does not exist.
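The argument can also be sanity-checked numerically (the basis and vectors below are made up for the check, assuming $V=\mathbb{R}^3$): since $\tilde{B}$ is invertible, independent vectors have independent coordinate vectors, which shows up as preserved matrix rank.

```python
import numpy as np

# Columns of B_tilde form an assumed basis of R^3 (det = 3, invertible).
B_tilde = np.array([[2.0, 1.0, 0.0],
                    [0.0, 1.0, 1.0],
                    [1.0, 0.0, 1.0]])

# Columns of W are two linearly independent vectors w_1, w_2.
W = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])

# Columns of coords are [w_1]_B and [w_2]_B, since W = B_tilde @ coords.
coords = np.linalg.solve(B_tilde, W)

# Independence is preserved in both directions: both matrices have rank 2.
assert np.linalg.matrix_rank(W) == 2
assert np.linalg.matrix_rank(coords) == 2
```

If the $w_i$ were dependent, both ranks would drop together, mirroring the "dependencies map to dependencies" remark above.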