Showing that v(S), the linear variety generated by subspace S, is a linear variety


The other question related to this section of the book appears to be unresolved; I just started Luenberger's Optimization by Vector Space Methods and have reached a point of confusion.

Two definitions given:

  • A linear variety $V$ in a vector space $X$ is a translate of a subspace $M$ of $X$: $V = x_0 + M$ for some $x_0 \in X$.
  • If $S$ is a nonempty subset of $X$, then the linear variety generated by $S$, written $v(S)$, is the intersection of all linear varieties in $X$ that contain $S$.

I might just be undergoing some intense bout of brain fog, but I am having trouble seeing how to prove that $v(S)$ is itself a linear variety.

If I take some vector $a \in v(S)$, then I was thinking I could find some subspace $M \subseteq X$ such that $v(S) = a + M$, which would show that $v(S)$ is a linear variety. What I thought was: take any linear variety $L$ containing $S$ (so $a \in L$, since $a \in v(S) \subseteq L$), and define $M = L - a$. We want to prove that $M$ is a subspace of $X$.

  • Since $a \in v(S) \subseteq L$ for every such $L$, we have $a - a = 0 \in M$, so $M$ contains the zero vector.
  • We want to show that if $x, y \in M$, then $cx + dy \in M$ for any scalars $c$ and $d$. Let $l_1 = x + a$ and $l_2 = y + a$, so $l_1, l_2 \in L$. Then $$cx + dy = c(l_1 - a) + d(l_2 - a) = [\text{something in } L] - a \in M.$$

I'm having trouble understanding what the $[\text{something in } L]$ would be. If that step works, then $L - a$ is a subspace for every linear variety $L$ containing $S$, and since $\bigcap_L (L - a) = \left(\bigcap_L L\right) - a = v(S) - a$, the intersection of these subspaces is the subspace $v(S) - a$, proving that $v(S)$ is a linear variety.
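Writing the expansion out as far as I can get, and grouping terms so that a single $-a$ is left over (the bracketed coefficients sum to $1$, which I suspect is the key):
$$cx + dy = c(l_1 - a) + d(l_2 - a) = \big[c\,l_1 + d\,l_2 + (1 - c - d)\,a\big] - a,$$
so it seems my question reduces to whether a linear variety is closed under such affine combinations of its points.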

Any help is appreciated!