Prove that every element in $V$ is of the form $\vec{v}$


Let $u_1=\begin{pmatrix}1\\ 2\\ 3\end{pmatrix}$, $u_2=\begin{pmatrix}2\\ 3\\ 4\end{pmatrix}$, $v_1=\begin{pmatrix}1\\ 1\\ 2\end{pmatrix}$, $v_2=\begin{pmatrix}2\\ 2\\ 3\end{pmatrix}$.

Let $U=span(\vec{u_1},\vec{u_2})$ and $V=span(\vec{v_1},\vec{v_2})$.

I've already shown that $(\vec{u_1}|\vec{u_2})$ and $(\vec{v_1}|\vec{v_2})$ are both row equivalent to $\begin{pmatrix}1 & 0\\ 0 & 1\\ 0 & 0\\ \end{pmatrix}$ and therefore that $U$ and $V$ are subspaces of $\mathbb{R}^3$ of dimension 2.

Now I want to prove the following, but I could use some help (or at least some guidance on where to start):

  1. That any element in $V$ is of the form $\vec{v}=\begin{pmatrix}\alpha + 2\beta\\ \alpha + 2\beta\\ 2\alpha + 3\beta\end{pmatrix}$ where $\alpha$ and $\beta$ are real numbers.

  2. That the following system only has solutions if $\alpha+\beta=0$:

$\begin{pmatrix}1 & 2\\ 2 & 3\\ 3 & 4\end{pmatrix}\vec{x}=\begin{pmatrix}\alpha + 2\beta\\ \alpha + 2\beta\\ 2\alpha + 3\beta\end{pmatrix}$

  3. That $\begin{pmatrix}\alpha + 2\beta\\ \alpha + 2\beta\\ 2\alpha + 3\beta\end{pmatrix}=\beta\begin{pmatrix}1\\ 1\\ 1\end{pmatrix}$ when $\alpha+\beta=0$.

  4. That $U\cap V=span(\vec{v})$, where $\vec{v}=\begin{pmatrix}1 \\ 1\\ 1\\ \end{pmatrix}$.

There are 3 answers below.

BEST ANSWER

1) Since $V=span(\vec v_1,\vec v_2)$, there is nothing to prove, because by definition $$ span(\vec v_1,\vec v_2)=\left\{\alpha\vec v_1+\beta\vec v_2~:~\alpha,\beta\in\mathbb R\right\}=\left\{\begin{pmatrix}\alpha+2\beta\\\alpha+2\beta\\2\alpha+3\beta\end{pmatrix}~:~\alpha,\beta\in\mathbb R\right\}. $$

2) Write $$ \begin{pmatrix}\alpha+2\beta\\\alpha+2\beta\\2\alpha+3\beta\end{pmatrix} =(\alpha+\beta)\begin{pmatrix}1\\1\\2\end{pmatrix}+\beta\begin{pmatrix}1\\1\\1\end{pmatrix} $$ and prove(!) $\begin{pmatrix}1\\1\\1\end{pmatrix}\in U$ and $\begin{pmatrix}1\\1\\2\end{pmatrix}\notin U$.
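For the two checks marked "prove(!)", direct computations work: $$ \vec u_2-\vec u_1=\begin{pmatrix}2\\3\\4\end{pmatrix}-\begin{pmatrix}1\\2\\3\end{pmatrix}=\begin{pmatrix}1\\1\\1\end{pmatrix}\in U, $$ while $a\vec u_1+b\vec u_2=\begin{pmatrix}1\\1\\2\end{pmatrix}$ would require $a+2b=1$ and $2a+3b=1$, i.e. $a=-1$, $b=1$, whose third coordinate is $3a+4b=1\neq 2$, so $\begin{pmatrix}1\\1\\2\end{pmatrix}\notin U$.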

Now let us show: $\alpha+\beta=0\Rightarrow \begin{pmatrix}\alpha+2\beta\\\alpha+2\beta\\2\alpha+3\beta\end{pmatrix}\in U$.

Let be $\alpha+\beta=0$, then $$ \begin{pmatrix}\alpha+2\beta\\\alpha+2\beta\\2\alpha+3\beta\end{pmatrix} =\beta\begin{pmatrix}1\\1\\1\end{pmatrix}\in U, $$ because ... .

Next, we show: $\alpha+\beta\neq 0\Rightarrow \begin{pmatrix}\alpha+2\beta\\\alpha+2\beta\\2\alpha+3\beta\end{pmatrix}\notin U$.

Assume $\alpha+\beta\neq 0$ and $\begin{pmatrix}\alpha+2\beta\\\alpha+2\beta\\2\alpha+3\beta\end{pmatrix}\in U$. Then $$ (\alpha+\beta)\begin{pmatrix}1\\1\\2\end{pmatrix}=\begin{pmatrix}\alpha+2\beta\\\alpha+2\beta\\2\alpha+3\beta\end{pmatrix}-\beta\begin{pmatrix}1\\1\\1\end{pmatrix}. $$ The LHS is not in $U$ while the RHS ... .
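One way to fill in the blanks at the end: $\begin{pmatrix}\alpha+2\beta\\\alpha+2\beta\\2\alpha+3\beta\end{pmatrix}\in U$ by assumption and $\begin{pmatrix}1\\1\\1\end{pmatrix}\in U$, so the RHS, being a difference of two vectors in the subspace $U$, lies in $U$. Dividing by $\alpha+\beta\neq 0$ would then give $$ \begin{pmatrix}1\\1\\2\end{pmatrix}=\frac{1}{\alpha+\beta}\left(\begin{pmatrix}\alpha+2\beta\\\alpha+2\beta\\2\alpha+3\beta\end{pmatrix}-\beta\begin{pmatrix}1\\1\\1\end{pmatrix}\right)\in U, $$ contradicting $\begin{pmatrix}1\\1\\2\end{pmatrix}\notin U$.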

3) Use the decomposition from 2).

4) Combine the results of 1), 2) and 3).
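As a numeric sanity check of 1)–4) (not part of the original argument), one can verify the conclusion with the rank identity $\dim(U\cap V)=\dim U+\dim V-\dim(U+V)$ using NumPy:

```python
# Sanity check: U ∩ V = span((1,1,1)), via rank computations.
import numpy as np

u1, u2 = np.array([1, 2, 3]), np.array([2, 3, 4])
v1, v2 = np.array([1, 1, 2]), np.array([2, 2, 3])
w = np.array([1, 1, 1])

# U and V are both 2-dimensional.
assert np.linalg.matrix_rank(np.column_stack([u1, u2])) == 2
assert np.linalg.matrix_rank(np.column_stack([v1, v2])) == 2

# Adjoining (1,1,1) raises neither rank, so (1,1,1) ∈ U and (1,1,1) ∈ V.
assert np.linalg.matrix_rank(np.column_stack([u1, u2, w])) == 2
assert np.linalg.matrix_rank(np.column_stack([v1, v2, w])) == 2

# dim(U + V) = 3, hence dim(U ∩ V) = 2 + 2 - 3 = 1,
# and since (1,1,1) lies in the intersection, U ∩ V = span((1,1,1)).
assert np.linalg.matrix_rank(np.column_stack([u1, u2, v1, v2])) == 3
print("all checks passed")
```

This only confirms the dimensions numerically; the symbolic argument above is still needed for a proof.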

ANSWER

To start off, review the definition of a spanning set. Since $V$ is the span of $\vec v_1$ and $\vec v_2$, any vector in $V$ is a linear combination of those two vectors, so 1. follows readily. For 2., subtracting the second row from the third shows that $3x_1+4x_2=2x_1+3x_2+\alpha + \beta$ for any solution $(x_1, x_2)$, and comparing the first and second rows gives $x_1+x_2=0$; you can go from there. 3. follows easily by substituting $0$ for $\alpha+\beta$. 4. is essentially the combination of the results of 2. and 3. You should think more about these questions on your own, as examples of this kind are easily found in textbooks.
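One way to organize the row manipulations described above is a single elimination on the augmented matrix: $$ \left(\begin{array}{cc|c}1&2&\alpha+2\beta\\2&3&\alpha+2\beta\\3&4&2\alpha+3\beta\end{array}\right)\longrightarrow\left(\begin{array}{cc|c}1&2&\alpha+2\beta\\0&-1&-\alpha-2\beta\\0&0&\alpha+\beta\end{array}\right), $$ so the system is consistent precisely when $\alpha+\beta=0$.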

ANSWER

HINT

  1. This is immediate: since $V=span(\vec v_1,\vec v_2)$, every $v\in V$ can be written as $v=\alpha v_1+ \beta v_2$.

  2. Row-reduce the augmented matrix to RREF.

  3. Substitute $\alpha=-\beta$.

  4. It follows by combining 1, 2 and 3.
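A sketch of how the four hints combine: with $\alpha=-\beta$, $$ \begin{pmatrix}\alpha+2\beta\\\alpha+2\beta\\2\alpha+3\beta\end{pmatrix}=\beta\begin{pmatrix}1\\1\\1\end{pmatrix}, $$ and $\begin{pmatrix}1\\1\\1\end{pmatrix}=\vec u_2-\vec u_1\in U$, so all these vectors lie in $U\cap V$; by hint 2, no vector of $V$ with $\alpha+\beta\neq 0$ lies in $U$, hence $U\cap V=span\begin{pmatrix}1\\1\\1\end{pmatrix}$.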