Prove that if $\mathcal{B}_{1}\cup\mathcal{B}_{2}$ is a basis for $V$, then $V = W_{1}\oplus W_{2}$.


(a) Let $W_{1}$ and $W_{2}$ be subspaces of a vector space $V$ such that $V = W_{1}\oplus W_{2}$. If $\mathcal{B}_{1}$ and $\mathcal{B}_{2}$ are bases for $W_{1}$ and $W_{2}$, respectively, show that $\mathcal{B}_{1}\cap\mathcal{B}_{2} = \varnothing$ and $\mathcal{B}_{1}\cup\mathcal{B}_{2}$ is a basis for $V$.

(b) Conversely, let $\mathcal{B}_{1}$ and $\mathcal{B}_{2}$ be disjoint bases for subspaces $W_{1}$ and $W_{2}$, respectively, of a vector space $V$. Prove that if $\mathcal{B}_{1}\cup\mathcal{B}_{2}$ is a basis for $V$, then $V = W_{1}\oplus W_{2}$.

MY ATTEMPT

(a) Let $\mathcal{B}_{1} = \{\alpha_{1},\alpha_{2},\ldots,\alpha_{m}\}$ and $\mathcal{B}_{2} = \{\beta_{1},\beta_{2},\ldots,\beta_{n}\}$ where $\dim W_{1} = m$ and $\dim W_{2} = n$.

Let $v\in V = W_{1}\oplus W_{2}$. Then $v = w_{1} + w_{2}$ where $w_{1}\in W_{1}$ and $w_{2}\in W_{2}$.

Consequently, there are scalars $a_{1},a_{2},\ldots,a_{m}$ and $b_{1},b_{2},\ldots,b_{n}$ such that \begin{align*} \begin{cases} w_{1} = a_{1}\alpha_{1} + a_{2}\alpha_{2} + \ldots + a_{m}\alpha_{m}\\\\ w_{2} = b_{1}\beta_{1} + b_{2}\beta_{2} + \ldots + b_{n}\beta_{n} \end{cases} \end{align*} Hence \begin{align*} v = w_{1} + w_{2} = a_{1}\alpha_{1} + a_{2}\alpha_{2} + \ldots + a_{m}\alpha_{m} + b_{1}\beta_{1} + b_{2}\beta_{2} + \ldots + b_{n}\beta_{n} \end{align*}

Thus $\mathcal{B}_{1}\cup\mathcal{B}_{2}$ spans $V$. Moreover, $\mathcal{B}_{1}\cap\mathcal{B}_{2} = \varnothing$. Indeed, if this were not the case, we would have some $b\in\mathcal{B}_{1}\cap\mathcal{B}_{2}\subseteq W_{1}\cap W_{2}$ with $b\neq 0$ (basis elements are nonzero, since a basis is linearly independent), which contradicts the fact that $W_{1}\cap W_{2} = \{0\}$.

Finally, let us prove that $\mathcal{B}_{1}\cup\mathcal{B}_{2}$ is linearly independent. Indeed, if

\begin{align*} c_{1}\alpha_{1} + c_{2}\alpha_{2} + \ldots + c_{m}\alpha_{m} + d_{1}\beta_{1} + d_{2}\beta_{2} + \ldots + d_{n}\beta_{n} = 0 \end{align*} then we should have \begin{align*} c_{1}\alpha_{1} + c_{2}\alpha_{2} + \ldots + c_{m}\alpha_{m} = -d_{1}\beta_{1} - d_{2}\beta_{2} - \ldots - d_{n}\beta_{n} \end{align*} which implies that \begin{align*} c_{1}\alpha_{1} + c_{2}\alpha_{2} + \ldots + c_{m}\alpha_{m}\in W_{1}\cap W_{2} = \{0\} \end{align*}

whence we conclude that $c_{1} = c_{2} = \ldots = c_{m} = 0$. Similar reasoning shows that $d_{1} = d_{2} = \ldots = d_{n} = 0$, and the result holds.
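Not part of the proof, but part (a) is easy to sanity-check numerically: the union of bases of two complementary subspaces should have full rank. A quick NumPy sketch (the particular subspaces of $\mathbb{R}^{3}$ here are an illustrative choice, not taken from the problem):

```python
import numpy as np

# Illustrative choice: W1 = span{e1, e2}, W2 = span{(1, 1, 1)},
# so that R^3 = W1 ⊕ W2.
B1 = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
B2 = [np.array([1.0, 1.0, 1.0])]

# Stack B1 ∪ B2 as the columns of a 3x3 matrix.
M = np.column_stack(B1 + B2)

# B1 ∪ B2 is a basis for R^3 iff this matrix has full rank,
# i.e. the columns are linearly independent and span R^3.
print(np.linalg.matrix_rank(M))  # prints 3
```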

(b) With the same notation as before, let $v\in V$. Since $\mathcal{B}_{1}\cup\mathcal{B}_{2}$ is a basis for $V$, there are scalars $a_{1},a_{2},\ldots,a_{m}$ and $b_{1},b_{2},\ldots,b_{n}$ such that \begin{align*} v = a_{1}\alpha_{1} + a_{2}\alpha_{2} + \ldots + a_{m}\alpha_{m} + b_{1}\beta_{1} + b_{2}\beta_{2} + \ldots + b_{n}\beta_{n} = w_{1} + w_{2} \end{align*} where $w_{1} = a_{1}\alpha_{1} + \ldots + a_{m}\alpha_{m}\in W_{1}$ and $w_{2} = b_{1}\beta_{1} + \ldots + b_{n}\beta_{n}\in W_{2}$. Thus $V = W_{1}+W_{2}$. It remains to prove that $W_{1}\cap W_{2} = \{0\}$.

Let us assume that $w\in W_{1}\cap W_{2}$. Then \begin{align*} w = c_{1}\alpha_{1} + c_{2}\alpha_{2} + \ldots + c_{m}\alpha_{m} = d_{1}\beta_{1} + d_{2}\beta_{2} + \ldots + d_{n}\beta_{n} \end{align*} Rearranging this relation, we obtain \begin{align*} c_{1}\alpha_{1} + c_{2}\alpha_{2} + \ldots + c_{m}\alpha_{m} - d_{1}\beta_{1} - d_{2}\beta_{2} - \ldots - d_{n}\beta_{n} = 0 \end{align*} whence, by the linear independence of $\mathcal{B}_{1}\cup\mathcal{B}_{2}$, we get $c_{1} = c_{2} = \ldots = c_{m} = d_{1} = d_{2} = \ldots = d_{n} = 0$. Thus $w = 0$, and we are done.
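The decomposition in part (b) can likewise be illustrated numerically: when $\mathcal{B}_{1}\cup\mathcal{B}_{2}$ is a basis, the matrix whose columns are its vectors is invertible, so each $v$ splits uniquely as $w_{1}+w_{2}$. A small sketch, again with an illustrative choice of subspaces:

```python
import numpy as np

# Disjoint bases B1, B2 whose union is a basis of R^3
# (illustrative: W1 = span{e1, e2}, W2 = span{(1, 1, 1)}).
B1 = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]).T  # columns: basis of W1
B2 = np.array([[1.0, 1.0, 1.0]]).T                   # column: basis of W2
M = np.hstack([B1, B2])                              # columns: B1 ∪ B2

v = np.array([2.0, 3.0, 5.0])

# M is invertible, so the coefficients (a1, a2, b1) are unique.
coeffs = np.linalg.solve(M, v)
w1 = B1 @ coeffs[:2]  # component in W1
w2 = B2 @ coeffs[2:]  # component in W2

assert np.allclose(w1 + w2, v)  # v = w1 + w2, uniquely
```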

Are the provided proofs correct? Is there a neater way to rephrase my arguments? Any comments are appreciated.

BEST ANSWER

Another approach, without appealing to dimensionality.

  • Suppose that $\mathcal B_1$ is a basis for $W_1$ and that $\mathcal B_2$ is a basis for $W_2$. Since $V = W_1 \oplus W_2$, any vector in $V$ can be uniquely written as a sum of a vector in $W_1$ and a vector in $W_2$; but also, at the same time, every vector in $W_i$ can be uniquely written as a linear combination of vectors in $\mathcal B_i$. In conclusion, every vector in $V$ can be expressed in only one way as a linear combination of vectors in $\mathcal B_1 \cup \mathcal B_2$, and hence, $\mathcal B_1 \cup \mathcal B_2$ is a basis for $V$. Your explanation of why $\mathcal B_1 \cap \mathcal B_2 = \varnothing$ is fine.

  • Now, suppose that $\mathcal B_1$ and $\mathcal B_2$ are two disjoint bases, the first one for $W_1$ and the second one for $W_2$. If $\mathcal B_1 \cup \mathcal B_2$ is a basis for $V$, the same argument above, run in reverse, shows that $V = W_1 \oplus W_2$: the unique representation of each vector in terms of $\mathcal B_1 \cup \mathcal B_2$ yields a unique decomposition of it as a sum of a vector in $W_1$ and a vector in $W_2$.
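The uniqueness argument in the first bullet can be written out as a two-step chain (a sketch in the question's notation, with finite bases for readability; the general case uses finite sums):

```latex
\begin{align*}
v &= w_{1} + w_{2}
  && \text{uniquely, since } V = W_{1}\oplus W_{2},\\
  &= \underbrace{a_{1}\alpha_{1} + \cdots + a_{m}\alpha_{m}}_{w_{1}}
   + \underbrace{b_{1}\beta_{1} + \cdots + b_{n}\beta_{n}}_{w_{2}}
  && \text{uniquely, since each } \mathcal{B}_{i}
     \text{ is a basis of } W_{i}.
\end{align*}
```

Each of the two steps is unique, hence so is the composite representation of $v$ in terms of $\mathcal{B}_{1}\cup\mathcal{B}_{2}$, which is exactly the statement that $\mathcal{B}_{1}\cup\mathcal{B}_{2}$ is a basis.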

ANOTHER ANSWER

I prefer denoting a basis as a list and not a set.

(a) Let $B_{1}= (v_{i}:i\in I), B_{2}= (u_{j}:j\in J)$ (where $I, J$ are arbitrary index sets).

By hypothesis $V = W_{1} + W_{2}$ and $W_{1}\cap W_{2}=\{0\}$. Suppose $B_{1}\cap B_{2} \neq \emptyset$; then $\exists x\in B_{1}\cap B_{2}$ with $x\neq 0$ (since $B_{1}, B_{2}$ are linearly independent). But $x\in W_{1}\cap W_{2}$, so $W_{1}\cap W_{2}\neq \{0\}$, a contradiction. Hence $B_{1}\cap B_{2} = \emptyset$.

Additionally, we have $B_{1}\cup B_{2}=(v_{i}, u_{j}: i\in I, j\in J)$. Consider an arbitrary finite null combination of the elements of $B_{1}\cup B_{2}$, say $\sum\alpha_{i}v_{i} + \sum\beta_{j}u_{j} = 0$. Then $\sum\alpha_{i}v_{i} = -\sum\beta_{j}u_{j}\in W_{1}\cap W_{2} = \{0\}$, so $\sum\alpha_{i}v_{i} = \sum\beta_{j}u_{j} = 0$, and therefore $\alpha_{i} = \beta_{j}=0$ for all $i,j$, by the linear independence of $B_{1}$ and $B_{2}$. This proves the linear independence of $B_{1}\cup B_{2}$. The spanning property is immediate, since $V = W_{1} + W_{2}$.

(b) Let $v\in V$; then $v = \sum_{\text{finite}}\alpha_{i}v_{i} + \sum_{\text{finite}}\beta_{j}u_{j}$ (since $B_{1}\cup B_{2}=(v_{i}, u_{j}: i\in I, j\in J)$ is a basis of $V$). Thus $v\in W_{1} + W_{2}$.

Suppose $\exists x\in W_{1}\cap W_{2}$ with $x\neq 0$; then $x = \sum_{\text{finite}}\alpha_{i}v_{i} = \sum_{\text{finite}}\beta_{j}u_{j}$ for some $\alpha_{i}$'s, $\beta_{j}$'s. Then $\sum_{\text{finite}}\alpha_{i}v_{i} - \sum_{\text{finite}}\beta_{j}u_{j} = 0$, thus $\alpha_{i} = \beta_{j}=0$ for all $i,j$ by the linear independence of $B_{1}\cup B_{2}$, so $x = 0$, a contradiction. This proves $W_{1}\cap W_{2} = \{0\}$.