I showed that $AX = b_1$ has a unique solution. Does this mean that the columns of $A$ are linearly independent?


I know that if $AX = b_1$ has infinitely many solutions, then $AX = b_1 \iff A(X_{inh} + X_{hom}) = b_1$, which implies that $AX_{hom} = 0$ has infinitely many solutions, and thus the columns of $A$ are linearly dependent.

I know I can't infer "$AX_{hom} = 0$ has a unique solution $(X_{hom} = 0)$" from "$AX = b_1$ has a unique solution", since I only showed it for $b_1$ and not for all $b \in \mathbb{R}^n$. In other words, what if there is some $b_i$, $i\neq1$, for which $AX = b_i$ has infinitely many solutions?

Intuitively I feel that no such $b_i$ exists, and therefore "$AX = b_1$ has a unique solution" implies "the columns of $A$ are linearly independent". But how do I show it?


There are 3 answers below.

BEST ANSWER

Yes. You can work with the contrapositive statement instead. Suppose the columns of $A$ are linearly dependent; then $\operatorname{rank}(A) < n$, where $n$ is the number of columns of $A$, or equivalently the number of variables in the system of equations furnished by $AX = b_1$. Now $\operatorname{rank}(A) < n$ implies either infinitely many solutions or no solution, i.e. non-uniqueness in either case.
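A quick numerical sketch of this contrapositive (the matrix and vectors below are my own illustration, not part of the answer): a matrix with dependent columns has deficient rank, and any solution of $AX = b_1$ can be shifted by a nonzero kernel vector, so it is never unique.

```python
import numpy as np

# A has linearly dependent columns (col3 = col1 + col2), so rank(A) < n.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])
n = A.shape[1]
assert np.linalg.matrix_rank(A) < n   # dependent columns => deficient rank

# If A x = b_1 has one solution, adding a nonzero kernel vector x_hom
# (here (1, 1, -1)) yields a second, distinct solution.
x1 = np.array([1.0, 2.0, 0.0])
b1 = A @ x1
x_hom = np.array([1.0, 1.0, -1.0])
assert np.allclose(A @ x_hom, 0)      # x_hom lies in the null space
x2 = x1 + x_hom
assert np.allclose(A @ x2, b1)        # x2 also solves A x = b_1
assert not np.allclose(x1, x2)        # so the solution is not unique
```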

ANSWER

Sure. Suppose you have two solutions $X_1\ne X_2$, so that $AX_1=AX_2=b_1$. Then $A(X_1-X_2)=0$ with $X_1-X_2\ne 0$, which is a nontrivial linear dependence among the columns of $A$. Contrapositively, if the columns of $A$ are independent, then $X_1-X_2$ must be zero and, therefore, $X_1=X_2$.
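The same argument can be checked numerically (again with a matrix of my own choosing): two distinct solutions of $AX=b$ produce, by subtraction, a nonzero vector in the null space of $A$, certifying that the columns are dependent.

```python
import numpy as np

# Two distinct solutions of A x = b exist because col3 = col1 + col2.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 9.0]])
x1 = np.array([1.0, 1.0, 1.0])
x2 = np.array([2.0, 2.0, 0.0])        # differs from x1 by (1, 1, -1)
b = A @ x1
assert np.allclose(A @ x2, b)         # both solve A x = b

d = x1 - x2                           # nonzero difference of solutions
assert not np.allclose(d, 0)
assert np.allclose(A @ d, 0)          # A(x1 - x2) = 0: columns dependent
```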

ANSWER

To be more explicit: since we have already proved that

if $A\underline{X}=\underline{b}_1$ has a unique solution, then $A\underline{X}=\underline{0}$ iff $\underline{X}=\underline{0}$,

we can proceed as follows:

Let $$A= \left (\begin{matrix} \underline{a_1} & \underline{a_2} & \cdots & \underline{a_n} \\ \end{matrix} \right) $$ where $\underline{a_1}, \underline{a_2}, \cdots, \underline{a_n}$ are the column vectors of $A$ and $$\underline{X}= \left( \begin{matrix} x_1 \\ x_2 \\ \vdots \\ x_n \\ \end{matrix} \right)$$

Then $$x_1\underline{a_1}+x_2\underline{a_2}+\cdots+x_n\underline{a_n}=\underline{0}$$

$$\implies \left (\begin{matrix} \underline{a_1} & \underline{a_2} & \cdots & \underline{a_n} \\ \end{matrix} \right)\left( \begin{matrix} x_1 \\ x_2 \\ \vdots \\ x_n \\ \end{matrix} \right)=\underline{0}$$

$$\implies A\underline{X}=\underline{0}$$

$$\implies \underline{X}=\underline{0}$$

$$\implies x_1=x_2=\cdots=x_n=0$$

Hence $\underline{a_1}, \underline{a_2}, \cdots, \underline{a_n}$ are linearly independent.
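To close the loop, here is a small numerical check of the conclusion (the example matrix is my own): when the columns of $A$ are linearly independent, the null space is trivial, so a solution of $A\underline{X}=\underline{b}_1$, when it exists, is unique and can be recovered exactly.

```python
import numpy as np

# A tall matrix with two linearly independent columns.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
assert np.linalg.matrix_rank(A) == A.shape[1]   # full column rank

# Trivial null space: its dimension is n - rank(A) = 0,
# so no nonzero x_hom with A x_hom = 0 exists.
ns_dim = A.shape[1] - np.linalg.matrix_rank(A)
assert ns_dim == 0

# Hence the solution of A x = b_1 is unique: least squares
# recovers exactly the x we started from.
x = np.array([3.0, -2.0])
b1 = A @ x
x_rec, *_ = np.linalg.lstsq(A, b1, rcond=None)
assert np.allclose(x_rec, x)                    # the one and only solution
```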