Finding a basis for the column space of a matrix


Given the matrix $$A=\begin{bmatrix}1&2&2\\3&1&4\\4&3&6\end{bmatrix},$$

find a basis for the column space of A. Then express the 3rd column of A in that basis.

My first thought is to use RREF: row-reducing the augmented matrix $[A\mid 0]$ gives

$$\left[\begin{array}{ccc|c}1&0&6/5&0\\0&1&2/5&0\\0&0&0&0\end{array}\right].$$

From here I get confused and am most likely wrong. Since the leading 1s are in rows 1 and 2, the basis of this matrix would then be the vectors $\{[1, 0, 6/5],\ [0, 1, 2/5]\}$, with $x_1=-\frac{6}{5}x_3$ and $x_2=-\frac{2}{5}x_3$, where $x_3$ is arbitrary. If I made a mistake anywhere, let me know; I do not know how to finish this problem.
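As a sanity check (a sketch using SymPy, which is my own choice of tool, not part of the original question), the RREF above can be verified programmatically; `rref()` also reports the pivot columns, which is the key to the column-space question:

```python
from sympy import Matrix, Rational

A = Matrix([[1, 2, 2],
            [3, 1, 4],
            [4, 3, 6]])

# rref() returns the reduced row echelon form together with
# the indices of the pivot columns.
R, pivots = A.rref()
print(R)       # Matrix([[1, 0, 6/5], [0, 1, 2/5], [0, 0, 0]])
print(pivots)  # (0, 1) -- columns 1 and 2 of A are the pivot columns
```

The pivot indices `(0, 1)` point at columns of the original matrix $A$, which is exactly what identifies a basis for the column space.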


There are 2 answers below.

BEST ANSWER:

Hint: $$ \frac{6}{5} \begin{bmatrix} 1\\3\\4 \end{bmatrix} + \frac{2}{5} \begin{bmatrix} 2\\1\\3 \end{bmatrix}= \begin{bmatrix} 2\\4\\6 \end{bmatrix} $$ so the first two columns are a basis for the column space and this is the expression of the third column in such basis.
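The coefficients in the hint can be checked numerically; here is a short sketch with NumPy (my addition, not part of the original answer):

```python
import numpy as np

A = np.array([[1.0, 2.0, 2.0],
              [3.0, 1.0, 4.0],
              [4.0, 3.0, 6.0]])

# Express the third column as (6/5)*c1 + (2/5)*c2,
# where c1, c2 are the first two columns of A.
combo = (6 / 5) * A[:, 0] + (2 / 5) * A[:, 1]
print(combo)                         # [2. 4. 6.]
print(np.allclose(combo, A[:, 2]))   # True
```

So the coordinate vector of the third column relative to the basis $\{c_1, c_2\}$ is $(6/5,\ 2/5)$.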

ANSWER:

To find a basis for the column space of $A$, you can certainly apply row reduction, but I find it easier to column-reduce the matrix instead. The non-zero columns of the resulting matrix are a basis for the column space. In this case, after column reduction we end up with $$\pmatrix{1&0&0\\0&1&0\\1&1&0},$$ so a basis for the column space is $$\left\{\pmatrix{1\\0\\1},\pmatrix{0\\1\\1}\right\}.$$ The second part of the problem should be easy once you have that.
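Column-reducing $A$ is the same as row-reducing $A^{T}$ and transposing back. A sketch with SymPy (my choice of tool; the original answer works by hand):

```python
from sympy import Matrix

A = Matrix([[1, 2, 2],
            [3, 1, 4],
            [4, 3, 6]])

# Column operations on A are row operations on A^T, so
# column-reduce by taking the rref of the transpose and transposing back.
R, _ = A.T.rref()
col_reduced = R.T
print(col_reduced)  # Matrix([[1, 0, 0], [0, 1, 0], [1, 1, 0]])

# The nonzero columns of the column-reduced matrix form a
# basis for the column space of A.
basis = [col_reduced[:, j] for j in range(col_reduced.cols)
         if any(col_reduced[:, j])]
```

This reproduces the basis $\{(1,0,1)^{T},\ (0,1,1)^{T}\}$ given in the answer.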