Let $V$ be a vector space; prove the following statements:


$(a)$ Let $V$ be a vector space, and let $M \subseteq V$, $v \in V\setminus M$, with $M$ linearly independent. Then $M \cup \{v\}$ is linearly independent if and only if $v \notin [M]$.

$(b)$ Let $V$ be a vector space, and let $E \subseteq V$, $v \in V\setminus E$, with $[E \cup \{v\}] = V$. Then $[E] = V$ if and only if $v \in [E]$.

For part $(a)$ I understand that if $v$ were an element of $[M]$, then $v$ could be written as a linear combination of elements of $M$; subtracting that combination from $v$ would give a nontrivial representation of $0$ by elements of $M \cup \{v\}$, so the condition for linear independence wouldn't be met.

Best answer:

You are correct about part $(a)$. For the first direction, if $M\cup \{v\}$ is linearly independent, then $v\notin \text{span}(M)$ (it is more common to write $\text{span}(A)$ to denote the set of all linear combinations of the elements of $A$, rather than square brackets): if $v$ were in $\text{span}(M)$, we could write $v$ as a linear combination of the elements of $M$, and then $M\cup \{v\}$ would not be linearly independent.

For the other direction, suppose that $v\notin \text{span}(M)$ (this also implies that $v\ne 0$). Consider distinct elements $v_1,\dots,v_k$ of $M$, and let $a_1,\dots,a_{k+1}$ be scalars such that $$a_1v_1+\dots+a_kv_k+a_{k+1}v=0.$$ If $a_{k+1}\neq 0$, we could solve for $v$ and write $v=-a_{k+1}^{-1}(a_1v_1+\dots+a_kv_k)\in \text{span}(M)$, a contradiction; hence $a_{k+1}=0$. Since $v_1,\dots,v_k$ are linearly independent, it then follows that $a_1=\dots=a_k=0$ as well, and thus $M\cup \{v\}$ is linearly independent.
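The equivalence in part $(a)$ can be sanity-checked numerically in $\mathbb{R}^n$, using matrix rank as a stand-in for the abstract argument: a finite set of vectors is linearly independent iff the matrix with those vectors as columns has rank equal to the set's size, and $v \in \text{span}(M)$ iff appending $v$ as a column leaves the rank unchanged. The concrete vectors below are my own illustrative choices, not from the original text; this is a sketch, not part of the proof.

```python
import numpy as np

def is_independent(vectors):
    """True iff the given list of vectors is linearly independent."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

def in_span(v, vectors):
    """True iff v lies in the span of `vectors` (appending v does not raise the rank)."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(np.column_stack(vectors + [v])) == np.linalg.matrix_rank(A)

M = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]  # a linearly independent set
v_out = np.array([0.0, 0.0, 1.0])   # not in span(M)
v_in  = np.array([2.0, -3.0, 0.0])  # in span(M): 2*M[0] - 3*M[1]

# Part (a): M ∪ {v} is independent exactly when v ∉ span(M).
assert is_independent(M + [v_out]) and not in_span(v_out, M)
assert not is_independent(M + [v_in]) and in_span(v_in, M)
```

Of course this only tests finitely many examples in coordinates; the proof above is what handles an arbitrary vector space.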


You should try to prove part $(b)$ by yourself. First suppose that $\text{span}(E)=V$, then conclude that $v\in \text{span}(E)$.

Then suppose that $v\in \text{span}(E)$. Use the given hypothesis that $\text{span}(E\cup \{v\})=V$ to conclude that $\text{span}(E)=V$.
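Both cases of the hint can be illustrated with a small rank computation in $\mathbb{R}^3$; the sets $E$ and vectors $v$ below are hypothetical examples of my own, chosen so that $\text{span}(E\cup\{v\})=V$ holds in each case.

```python
import numpy as np

def spans_R3(vectors):
    """True iff the vectors span all of R^3 (rank 3)."""
    return np.linalg.matrix_rank(np.column_stack(vectors)) == 3

def in_span(v, vectors):
    """True iff v lies in the span of `vectors`."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(np.column_stack(vectors + [v])) == np.linalg.matrix_rank(A)

e1, e2, e3 = np.eye(3)  # standard basis of R^3

# Case 1: v ∈ [E] and [E ∪ {v}] = V, so [E] = V already.
E1, v1 = [e1, e2, e3], e1 + e2
assert spans_R3(E1 + [v1]) and in_span(v1, E1) and spans_R3(E1)

# Case 2: v ∉ [E] and [E ∪ {v}] = V, so [E] ≠ V.
E2, v2 = [e1, e2], e3
assert spans_R3(E2 + [v2]) and not in_span(v2, E2) and not spans_R3(E2)
```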

Another answer:

(a) Assume $v\in [M]$. Then we can write $v=\sum_{i=1}^k a_iv_i$ for some scalars $a_1, \ldots, a_k$ and (distinct) $v_1, \ldots, v_k\in M$. Note that none of $v_1, \ldots, v_k$ equals $v$, since $v\notin M$. Then $v,v_1, \ldots, v_k$ are distinct members of $M\cup \{v\}$ and $$(-1)v+\sum_{i=1}^k a_iv_i=0.$$ Thus $0$ is a linear combination of distinct members of $M\cup \{v\}$ in which not all coefficients are $0$. This means $M\cup \{v\}$ is not linearly independent.

Conversely, assume $M\cup \{v\}$ is not linearly independent. Then there exist distinct $v_1, \ldots, v_k\in M\cup \{v\}$ and coefficients $a_1, \ldots, a_k$, at least one of them non-zero, such that $$0=\sum_{i=1}^k a_iv_i.$$ Without loss of generality, we may assume that $a_i\neq 0$ for each $i$, because we can simply omit $v_i$ whenever $a_i=0$.

We claim that $v\in \{v_1, \ldots, v_k\}$. Indeed, if $v\notin \{v_1, \ldots, v_k\}$, then $\{v_1, \ldots, v_k\}\subseteq M$, each $a_i$ is non-zero, and $0=\sum_{i=1}^k a_iv_i$, contradicting the linear independence of $M$. Therefore $v=v_m$ for some $m$.

Since $a_m\neq 0$ (recall that we assumed each $a_i\neq 0$), and since $0=\sum_{i=1}^k a_iv_i$, we have that $$v=v_m=-a_m^{-1}\sum_{\substack{i=1\\ i\neq m}}^k a_iv_i\in [M].$$ Here we use the fact that $v_1, \ldots, v_k$ are distinct to deduce that since $v=v_m$, we have $v_i\neq v$ for each $i\neq m$, so $v_1, \ldots, v_{m-1}, v_{m+1}, \ldots, v_k\in M$.
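The extraction step at the end, solving the dependency relation for $v$, can be checked on concrete numbers. Below, a nontrivial dependency $a_1v_1+a_2v_2+a_mv=0$ with $a_m\neq 0$ is rearranged to $v=-a_m^{-1}(a_1v_1+a_2v_2)$; the vectors and coefficients are illustrative choices, not from the answer.

```python
import numpy as np

v1 = np.array([1.0, 0.0])
v2 = np.array([0.0, 1.0])   # M = {v1, v2}, linearly independent
v  = np.array([2.0, 3.0])   # v = 2*v1 + 3*v2, so v ∈ [M]

# A nontrivial dependency among v1, v2, v:  2*v1 + 3*v2 + (-1)*v = 0
a1, a2, am = 2.0, 3.0, -1.0
assert np.allclose(a1 * v1 + a2 * v2 + am * v, 0)

# Solving for v, as in the proof:  v = -am^{-1} * (a1*v1 + a2*v2) ∈ [M]
v_recovered = -(1 / am) * (a1 * v1 + a2 * v2)
assert np.allclose(v_recovered, v)
```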

(b) If $v\notin [E]$, then $[E]\neq V$, since $v\in V\setminus [E]$.

On the other hand, suppose $v\in [E]$. For any $u\in V=[E\cup \{v\}]$, we can write $u=\sum_{i=1}^k a_iv_i$ for some scalars $a_1, \ldots, a_k$ and distinct $v_1, \ldots, v_k\in E\cup \{v\}$. Without loss of generality, we may assume that $v\in \{v_1, \ldots, v_k\}$: if $v\notin \{v_1, \ldots, v_k\}$, we can let $a_{k+1}=0$ and $v_{k+1}=v$ and note that $u=\sum_{i=1}^k a_iv_i=\sum_{i=1}^{k+1}a_iv_i$. By reordering the terms, we may further assume $v=v_k$. Since $v\in [E]$, we can write $v=\sum_{i=1}^l b_iv_i'$ for some scalars $b_1, \ldots, b_l$ and $v_1', \ldots, v_l'\in E$. Then $$u=\sum_{i=1}^k a_iv_i=\sum_{i=1}^{k-1}a_iv_i + a_kv_k=\sum_{i=1}^{k-1}a_iv_i + \sum_{i=1}^l a_kb_iv_i'\in [E].$$
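The substitution in the final display can also be verified numerically: expand some $u$ in terms of $E\cup\{v\}$, then replace $v$ by its expansion over $E$ and confirm the two expressions agree. The vectors and coefficients below are hypothetical examples of my own.

```python
import numpy as np

e1, e2, e3 = np.eye(3)  # standard basis of R^3
E = [e1, e2, e3]
v = e1 - e2             # v ∈ [E], with coefficients b = (1, -1, 0)

# Write some u ∈ V = [E ∪ {v}] using v:
a1, a2, a3, ak = 4.0, 0.0, 5.0, 2.0
u = a1 * e1 + a2 * e2 + a3 * e3 + ak * v

# Substitute v = 1*e1 + (-1)*e2, collecting coefficients, to land back in [E]:
u_in_E = (a1 + ak * 1) * e1 + (a2 + ak * (-1)) * e2 + a3 * e3
assert np.allclose(u, u_in_E)
```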