Linear independence of rows in a matrix


Let $A = (a_{ij})$, $B = (b_{ij}) \in M_n$, and let $C \in M_{n,2n}$ be the matrix formed by joining the matrices $A$ and $B$ like this:

$$C= \begin{pmatrix} a_{11} & \cdots & a_{1n} & b_{11} & \cdots & b_{1n}\\ \vdots & & \vdots &\vdots & & \vdots\\ a_{n1} & \cdots & a_{nn} & b_{n1} & \cdots & b_{nn} \\ \end{pmatrix}. $$

Prove or disprove the following implication:

The rows of $A$ are linearly independent $\implies$ the rows of $C$ are linearly independent.

I think this implication is true, but I don't know how to prove it. If the rows of $A$ are independent, then no matter which columns you append to the matrix, the rows should remain independent; but how do I prove that?


2 Answers

Best answer:

Prove the contrapositive: "the rows of $C$ are linearly dependent $\implies$ the rows of $A$ are linearly dependent."

But this follows directly from the definition of linear dependence: if the rows of $C$ are linearly dependent, then some row of $C$ is a linear combination of the other rows of $C$. Since $C = [A \mid B]$, restricting that linear combination to the first $n$ coordinates shows that the corresponding row of $A$ is the same linear combination of the other rows of $A$. Therefore the rows of $A$ are linearly dependent.
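As a sanity check of the contrapositive (not part of the original answer), here is a small NumPy sketch with a hypothetical $3 \times 3$ example: the third row of $C$ is built as row 1 plus twice row 2, and the same dependence relation then shows up in the $A$-part alone.

```python
import numpy as np

# Hypothetical example, n = 3: force row 3 = row 1 + 2 * row 2 in both A and
# B, so the same relation holds for the rows of C = [A | B].
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [0.0, 0.0, 0.0]])
A[2] = A[0] + 2 * A[1]
B = np.array([[4.0, 0.0, 1.0],
              [2.0, 2.0, 0.0],
              [0.0, 0.0, 0.0]])
B[2] = B[0] + 2 * B[1]
C = np.hstack([A, B])  # 3 x 6 block matrix from the question

# Rows of C are dependent (rank 2 < 3), and restricting the dependence
# relation to the first n coordinates makes the rows of A dependent too.
print(np.linalg.matrix_rank(C))  # 2
print(np.linalg.matrix_rank(A))  # 2
```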

Another answer:

Take scalars $\lambda_1,\ldots,\lambda_n$ such that
$$\lambda_1(a_{11},\ldots,a_{1n},b_{11},\ldots,b_{1n})+\cdots+\lambda_n(a_{n1},\ldots,a_{nn},b_{n1},\ldots,b_{nn})=0.$$
Then, in particular, looking only at the first $n$ coordinates,
$$\lambda_1(a_{11},\ldots,a_{1n})+\cdots+\lambda_n(a_{n1},\ldots,a_{nn})=0.$$
Since the rows of $A$ are linearly independent, every $\lambda_i$ equals $0$, and therefore the rows of $C$ are linearly independent.
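The direct argument can also be illustrated numerically (a hypothetical NumPy example, not from the original answers): pick an invertible $A$, so its rows are independent, and append any $B$; the row rank of $C = [A \mid B]$ stays full.

```python
import numpy as np

# Hypothetical 3x3 example: A is invertible (det = -5), so its rows are
# linearly independent; B is deliberately rank-1 to show that appending
# "bad" columns cannot destroy row independence.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0],
              [3.0, 0.0, 1.0]])
B = np.full((3, 3), 5.0)   # rank-1 matrix of all fives
C = np.hstack([A, B])      # the 3 x 6 matrix C = [A | B]

# rank(A) = 3 means the rows of A are independent; the argument above
# predicts that the rows of C are then independent as well: rank(C) = 3.
print(np.linalg.matrix_rank(A))  # 3
print(np.linalg.matrix_rank(C))  # 3
```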