Classifying the subspace complements of the $x$-axis in the Cartesian plane


I am self-studying the text $\textit{A (Terse) Introduction to Linear Algebra}$ by Katznelson and got stuck on the following exercise:

$\textbf{ex1.2.8}$ Describe all the complements in $\mathbb{R}^2$ of the subspace $X = \{(x,0) : x \in \mathbb{R}\}.$

The book says that for an $F$-vector space $V$ and two $F$-subspaces $W$ and $Z$ of $V$, $W$ and $Z$ are $\textbf{independent}$ if for all $w \in W$ and $z \in Z$, $w + z = 0_V$ implies $w = 0_V = z$. Further, two subspaces $W$ and $Z$ are $\textbf{complementary}$ if they are independent and $W + Z = V$, in which case we write $V = W \oplus Z$. No other properties of complementary subspaces are introduced. For example, the fact that complementary subspaces intersect only at $0_V$ is left for the exercise directly after this one, and so I presume only the definition is needed for this problem. My claim is that the complements of $X$ are exactly $Y = \{(0,y) : y \in \mathbb{R}\}$ and the lines $L_m = \{(a, ma): a \in \mathbb{R}\}$, one for each fixed $m \in \mathbb{R} - \{0\}$. That is, the complements of $X$ are exactly the non-horizontal lines through the origin (each of which contains $0_{\mathbb{R}^2}$, as any subspace must).

Indeed, it is clear that $Y$ is a subspace of $\mathbb{R}^2$ and that $X \oplus Y = \mathbb{R}^2$. Let $m \in \mathbb{R}$ be nonzero and consider $L_m$. Then $0_{\mathbb{R}^2} = (0,0) = (0, m \cdot 0) \in L_m$. If $(a,ma), (b,mb) \in L_m$ and $e \in \mathbb{R}$, then $$(a,ma) + e(b,mb) = (a + eb, ma + meb) = (a + eb, m(a+eb)) \in L_m,$$ so $L_m$ is a subspace of $\mathbb{R}^2$. Let $(a,b) \in \mathbb{R}^2$. Then $$(a,b) = \left(a - \frac{b}{m}, 0\right) + \left(\frac{b}{m}, m\frac{b}{m}\right) \in X + L_m,$$ so $X + L_m = \mathbb{R}^2$. Moreover, if $(x,0) + (a,ma) = (0,0)$, then $ma = 0$, so $a = 0$ (since $m \neq 0$) and hence $x = 0$; thus $X$ and $L_m$ are independent and $\mathbb{R}^2 = X \oplus L_m$. Therefore, any non-horizontal line passing through the origin is complementary to the $x$-axis. My trouble is showing that these are the only ones. Suppose $L$ is a complement of $X$. If $L = Y$, we are done. Suppose not; then there is $(e,f) \in L$ with $e \neq 0$ (otherwise $L \subseteq Y$, so $L = \{0_{\mathbb{R}^2}\}$ or $L = Y$, and $L = \{0_{\mathbb{R}^2}\}$ cannot satisfy $X + L = \mathbb{R}^2$). Consider $m = \frac{f}{e}$. Then I think we should get that $L = L_m$, but I don't exactly know how to show that. For example, if $(a,b) \in L$, then we would like to show that $b = ma$, presumably using independence somehow, but I don't quite see the way. Any minimal help would be greatly appreciated.
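As a quick numerical sanity check on the decomposition formula above, here is a small sketch (my own addition, not from the book; the helper `decompose` is a hypothetical name) that splits an arbitrary vector into its $X$-part and $L_m$-part:

```python
# Sanity check (not from the book): numerically verify that every (a, b)
# splits as an X-part plus an L_m-part, for a sample nonzero slope m.

def decompose(a, b, m):
    """Split (a, b) into x_part in X = {(x, 0)} and l_part in L_m = {(t, m*t)}.

    Solving (a, b) = (x, 0) + (t, m*t) forces t = b / m and x = a - b / m,
    which requires m != 0 -- exactly why the horizontal line fails to be
    a complement of the x-axis.
    """
    t = b / m
    x_part = (a - t, 0.0)
    l_part = (t, m * t)
    return x_part, l_part

# Example: decompose (3, 4) along the line of slope m = 2.
x_part, l_part = decompose(3.0, 4.0, 2.0)
assert x_part == (1.0, 0.0)       # lies in X
assert l_part == (2.0, 4.0)       # lies in L_2, since 4 = 2 * 2
# The parts sum back to the original vector:
assert (x_part[0] + l_part[0], x_part[1] + l_part[1]) == (3.0, 4.0)
```

The explicit formulas for $t$ and $x$ also show the decomposition is unique, which is the independence of $X$ and $L_m$ in disguise.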


1 Answer


Let $(a,b) \in L$ with $(a,b) \neq (0,0)$. Since $L$ is a subspace, the entire line $L_0$ through $(0,0)$ and $(a,b)$ is contained in $L$. If $L$ contained any point $(c,d)$ outside this line, then $(a,b)$ and $(c,d)$ would be linearly independent vectors in $L$, since they are not scalar multiples of each other. But then $\dim L \geq 2$, and since $X$ and $L$ are independent (so $X \cap L = \{0\}$), $\dim(X+L) = \dim X + \dim L \geq 1 + 2 = 3$. Since the two-dimensional space $\mathbb{R}^2$ cannot contain a subspace of dimension $3$, we conclude that $L$ has no point outside the line $L_0$, so $L = L_0$.
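Since the question noted that only the definitions of independence and sum are available at this point in the book, here is a sketch (my own completion, not from the text or the answer) that avoids dimension entirely:

```latex
Let $(e,f) \in L$ with $e \neq 0$ and set $m = f/e$. Every scalar multiple
$t(e,f) = (te,\, m\,te)$ lies in $L$, and as $e \neq 0$ these multiples
exhaust $L_m$, so $L_m \subseteq L$. Conversely, for $(a,b) \in L$ the
subspace property gives
\[
  (a,b) - \tfrac{a}{e}(e,f) = \bigl(0,\; b - ma\bigr) \in L .
\]
If $b \neq ma$, then $(0,\, b - ma)$ and $(e,f)$ span $\mathbb{R}^2$, so
$(1,0) \in L$; but then $(1,0) \in X$ and $(-1,0) \in L$ satisfy
$(1,0) + (-1,0) = 0_{\mathbb{R}^2}$ with neither summand zero,
contradicting the independence of $X$ and $L$. Hence $b = ma$, so
$L \subseteq L_m$ and therefore $L = L_m$.
```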