Prove that a non-zero vector cannot belong to the rowspace and nullspace of a matrix at the same time.


With research, I've mainly found that rowspace is the orthogonal complement of the nullspace, and the only vector that belongs to both spaces at the same time is {0}.

In the linear algebra course that I'm taking, we have not yet learnt about orthogonality, and we're only expected to use the material we've been given up to this point. [The specific question we are to answer, labelled (1b), was attached as an image that is not reproduced here.]

What is the proof that the condition given in (1b) is not possible?

Best answer:

In my opinion, the question is unfairly ambiguous. The rowspace of an $\ m\times n\ $ matrix comprises $\ 1\times n\ $ matrices (i.e. row vectors), so a $\ 3\times 1\ $ column vector such as $\ \mathbf{v}=\begin{bmatrix}1\\2\\3\end{bmatrix}\ $ cannot belong to it.

If, however, you interpret the question as instead asking for a matrix $\ \mathbf{A}\ $ such that:

$ \hspace{3em}\mathbf{v}^{\color{red}\top} \in\text{Row }\mathbf{A} $ and $\ \mathbf{v} \in\text{Nul }\mathbf{A}\ $ at the same time

then you can show this is impossible because it leads to a contradiction: \begin{align} \mathbf{v} \in\text{Nul }\mathbf{A}&\Rightarrow\mathbf{Av}=0\ \ \text{ and}\\ \mathbf{v}^\top \in\text{Row }\mathbf{A} &\Rightarrow \mathbf{v}^\top = \mathbf{x}^\top\mathbf{A}\ \ \text{ for some row vector }\mathbf{x}^\top\\ &\Rightarrow \mathbf{v}^\top \mathbf{v}= \mathbf{x}^\top\mathbf{A} \mathbf{v}=0\ , \end{align} which contradicts the fact that $\ \mathbf{v}^\top \mathbf{v}= 1^2+2^2+3^2=14\ $.
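As a numerical sanity check of this argument (not part of the original answer), we can pick an illustrative matrix $\mathbf{A}$, a vector $\mathbf{n}$ in its null space, and verify that every row-space vector $\mathbf{x}^\top\mathbf{A}$ has zero dot product with $\mathbf{n}$ — exactly the identity $\mathbf{x}^\top\mathbf{A}\mathbf{v}=0$ that drives the contradiction. The matrix and vectors below are my own example, not from the question:

```python
import numpy as np

# Illustrative 2x3 matrix (my own choice, not from the question).
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

# n lies in Nul A: row1 . n = 1 - 4 + 3 = 0 and row2 . n = 4 - 10 + 6 = 0.
n = np.array([1.0, -2.0, 1.0])
assert np.allclose(A @ n, 0)

# Every row-space vector has the form r = x^T A, so
# r . n = x^T (A n) = x^T 0 = 0, matching the derivation above.
rng = np.random.default_rng(0)
for _ in range(5):
    x = rng.standard_normal(2)
    r = x @ A                      # a random row-space vector
    assert np.isclose(r @ n, 0.0)

# Hence a non-zero v in both spaces would force v . v = 0,
# which is impossible since v . v > 0 whenever v != 0.
```

The check only illustrates the mechanism for one matrix; the algebraic argument above is what proves it for all matrices.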

Another answer:

Solution 1:

Here is an argument that doesn't use orthogonality. If $v$ is a non-zero vector in the row space, you can apply elementary row operations to $A$ until one of its rows is $v$: for example, if $v = a_1 \times \text{(row 1)} + \dots + a_m \times \text{(row m)}$ with $a_i \neq 0$ for some $i$, then multiply row $i$ by the non-zero number $a_i$, and add $a_k \times \text{(row $k$)}$ to it for each $k \neq i$. The new row $i$ is then the vector $v$.

Applying elementary row operations corresponds to multiplying $A$ on the left by invertible matrices, so this doesn't change the null space. If $v$ is the $i$th row of the transformed matrix (call it $B$), then the $i$th entry of $Bv$ is $v \cdot v$, the sum of the squared components of $v$, which is non-zero since $v$ is non-zero. So $Bv$ is non-zero, i.e. $v$ is not in the null space of $B$, hence it is also not in the null space of $A$.
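The row-operation construction in this answer can also be checked numerically. Below is a small sketch with an illustrative matrix of my own choosing: we build $v$ as a combination of rows with $a_1 = 2$, $a_2 = 1$, perform the corresponding row operations as a left-multiplication $B = EA$ with $E$ invertible, and confirm that the first entry of $Bv$ is $v \cdot v \neq 0$:

```python
import numpy as np

# Illustrative matrix (my own example, not from the answer).
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

# A row-space vector: v = 2*(row 1) + 1*(row 2), i.e. a_1 = 2, a_2 = 1.
v = 2 * A[0] + 1 * A[1]            # v = [6, 9, 12]

# The row operations "multiply row 1 by a_1, then add row 2" amount to
# left-multiplying by an invertible matrix E, so Nul B = Nul A.
E = np.array([[2.0, 1.0],
              [0.0, 1.0]])
assert np.linalg.det(E) != 0       # E is invertible
B = E @ A
assert np.allclose(B[0], v)        # row 1 of B is now exactly v

# The first entry of B @ v is (row 1 of B) . v = v . v > 0,
# so B @ v != 0 and v cannot lie in Nul B = Nul A.
assert np.isclose((B @ v)[0], v @ v)
assert v @ v > 0
```

The invertibility of $E$ is what guarantees the null space is preserved, which is the crux of the argument.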


Solution 2:

Once we have inner products (or the dot product, if you like), we can just use the following fact:

Claim: Let $U$ be a subspace of a vector space $V$. Let $U^\perp$ denote the orthogonal complement of $U$. Then $U \cap U^\perp = \{ 0 \}$.

Proof: Since $U$ and $U^\perp$ are both subspaces, clearly $0 \in U \cap U^\perp$. Hence $\{ 0 \} \subseteq U \cap U^\perp$. Now we show that $U \cap U^\perp \subseteq \{ 0 \}$. Let $v \in U \cap U^\perp$. Then $v$ is orthogonal to itself, i.e. $v \cdot v = 0$, so it must be the zero vector, because all nonzero vectors have positive inner product with themselves (or dot product, if you like). $\square$

This proves your result: since the row space is the orthogonal complement of the null space, they have trivial intersection.