Show that the bases of $\operatorname{Col}(A)$ and $\operatorname{Null}(A)$ together form a basis for $\mathbb{R}^n$.


Q: Given an $n\times n$ matrix $A$ such that $A^2 = A$, let $\{x_1, \ldots, x_l\}$ be a basis for $\operatorname{Null}(A)$ and $\{b_1, \ldots, b_k\}$ be a basis for $\operatorname{Col}(A).$

Show that $\{x_1, \ldots, x_l, b_1, \ldots, b_k\}$ is a basis for $\mathbb{R}^n.$

The rank-nullity theorem: $n = \dim \operatorname{Col}(A) + \dim \operatorname{Null}(A).$

So in principle I only have to show that the combined set is linearly independent, or that it spans $\mathbb{R}^n$?


One can show that any set of $n$ linearly independent vectors in $\mathbb{R}^n$ spans $\mathbb{R}^n$. Since $\dim(\operatorname{Col}(A)) = k$ and $\dim(\operatorname{Null}(A)) = l$, the rank-nullity theorem gives $n = l + k$, so the set $\{x_1, \dots, x_l, b_1, \dots, b_k\}$ contains exactly $l + k = n$ vectors. Thus it suffices to show that the vectors in $\{x_1, \dots, x_l, b_1, \dots, b_k\}$ are linearly independent.
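The dimension count can be checked numerically. The sketch below uses an assumed idempotent matrix (the orthogonal projection of $\mathbb{R}^3$ onto the $xy$-plane, which is not part of the question) and verifies that $k + l = n$:

```python
import numpy as np

# Assumed example: projection of R^3 onto the xy-plane.
# It is idempotent, i.e. A @ A == A.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0]])
assert np.allclose(A @ A, A)  # confirm idempotence

n = A.shape[0]
k = np.linalg.matrix_rank(A)  # k = dim Col(A)
l = n - k                     # l = dim Null(A), by rank-nullity
assert k + l == n             # the combined set has exactly n vectors
```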

Consider a linear combination of the vectors in $\{x_1, \dots, x_l, b_1, \dots, b_k\}$ that sums to zero: $$c_1x_1+\dots + c_lx_l + d_1b_1 + \dots + d_kb_k = 0.$$ We will show the only solution is the trivial one (i.e. all the $c_i$'s and $d_j$'s are $0$).

Notice that we can rewrite this as

$$c_1x_1+\dots + c_lx_l = -(d_1b_1 + \dots + d_kb_k).$$

Distributing the negative sign, $$c_1x_1+\dots + c_lx_l = (-d_1)b_1 + \dots + (-d_k)b_k.$$

Let $v = c_1x_1+\dots + c_lx_l$. On the left-hand side of the above equation, $v$ is a linear combination of vectors from the basis of $\operatorname{Null}(A)$, and thus $v \in \operatorname{Null}(A)$. Similarly, on the right-hand side, $v$ is a linear combination of vectors from the basis of $\operatorname{Col}(A)$, and thus $v \in \operatorname{Col}(A)$.

So $v$ lies in both the null space and the column space of $A$. This means that $$Av = 0$$ and $$Aw = v \text{ for some } w.$$

Take the second of these two equations and multiply both sides by $A$ on the left, $$AAw = Av.$$

Then we use the fact that $AA = A$ and $Av = 0$, $$Aw = (AA)w = A(Aw) = Av = 0.$$

Since $Aw = 0$ and $Aw = v$, then it must be that $v = 0$.
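This key step (a vector in both $\operatorname{Col}(A)$ and $\operatorname{Null}(A)$ must be zero) can be illustrated numerically with the same assumed projection matrix: for any $v = Aw$ in the column space, idempotence gives $Av = A(Aw) = (AA)w = Aw = v$, so if additionally $Av = 0$, then $v = 0$.

```python
import numpy as np

# Assumed example: projection onto the xy-plane, which satisfies A @ A == A.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0]])

rng = np.random.default_rng(seed=0)
w = rng.standard_normal(3)
v = A @ w                      # v is in Col(A) by construction

# Idempotence gives A v = A(Aw) = (AA)w = Aw = v:
assert np.allclose(A @ v, v)

# Hence if v were also in Null(A) (i.e. A v = 0), then v = A v = 0.
```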

Going back to how $v$ is defined, $$0 = v = c_1x_1+\dots + c_lx_l.$$

Since $\{x_1, \dots, x_l\}$ is a basis and thus linearly independent, it must be that $c_i = 0$ for $i=1, \dots, l$.

We also have that, $$0 = v = (-d_1)b_1 + \dots + (-d_k)b_k,$$

so similarly, since $\{b_1, \dots, b_k\}$ is a basis and thus linearly independent, it must be that $-d_j = 0 \implies d_j=0$ for $j=1, \dots, k$.

We have thus shown that each of the $c_i$'s and $d_j$'s must be zero, so the only linear combination of the vectors in $\{x_1, \dots, x_l, b_1, \dots, b_k\}$ that yields the zero vector is the trivial one. Hence $\{x_1, \dots, x_l, b_1, \dots, b_k\}$ is linearly independent, and since it contains $n$ vectors, it is a basis of $\mathbb{R}^n$.
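As a final sanity check, the whole claim can be verified numerically for an assumed concrete idempotent matrix: extract orthonormal bases of $\operatorname{Col}(A)$ and $\operatorname{Null}(A)$ from the SVD, stack them into an $n \times n$ matrix, and confirm it has full rank (a set of $n$ vectors in $\mathbb{R}^n$ is a basis exactly when that matrix has rank $n$).

```python
import numpy as np

# Assumed example: projection of R^3 onto the xy-plane (A @ A == A).
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0]])
n = A.shape[0]

# From the SVD A = U S V^T: the first r columns of U span Col(A),
# and the last n - r columns of V span Null(A).
U, s, Vt = np.linalg.svd(A)
r = int((s > 1e-10).sum())
col_basis = U[:, :r]          # orthonormal basis of Col(A), k = r vectors
null_basis = Vt[r:, :].T      # orthonormal basis of Null(A), l = n - r vectors

# Stack the l + k = n basis vectors side by side and check full rank.
combined = np.hstack([null_basis, col_basis])
assert combined.shape == (n, n)
assert np.linalg.matrix_rank(combined) == n   # the combined set is a basis
```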