$A\vec x = \vec b$
I need to brush up my understanding of linear combinations, linear independence, and linear dependence, so I have searched and read about them.
Here $\vec b$ is a linear combination of the columns of $A$.
I found this question:
$\begin{bmatrix}1&a&b\\1&a^2&b^2\\1&a^3&b^3\end{bmatrix}\begin{bmatrix} x\\ y\\z\end{bmatrix}=\begin{bmatrix} 1\\ 2\\3\end{bmatrix}$
$A\vec x = \vec b$
Find the conditions on $a$ and $b$ so that the system has exactly one solution, given that $a$ and $b$ are distinct numbers.
What I'm trying to understand:
The set of columns of $A$ being linearly independent means: suppose $\vec b$ is $\begin{bmatrix} 0\\ 0\\0\end{bmatrix}$.
So the question is: is the only $\vec x$ satisfying this the zero vector $\begin{bmatrix} 0\\ 0\\0\end{bmatrix}$, or not? Right?
If only $\begin{bmatrix} 0\\ 0\\0\end{bmatrix}$ satisfies it, then the columns are linearly independent; if not, they are linearly dependent.
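To check this numerically, I tried the following sketch (the values $a=2$, $b=3$ are my own illustrative choices, assuming numpy): the columns are linearly independent exactly when the square matrix $A$ has full rank, i.e. rank $3$.

```python
import numpy as np

# Illustrative values for a and b (not from the original question).
a, b = 2.0, 3.0
A = np.array([[1, a,    b   ],
              [1, a**2, b**2],
              [1, a**3, b**3]])

# The columns are linearly independent iff A x = 0 forces x = 0,
# which for a square matrix is equivalent to full rank (rank 3).
print(np.linalg.matrix_rank(A))  # 3 -> columns are linearly independent
```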
$\begin{bmatrix}1&a&b\\1&a^2&b^2\\1&a^3&b^3\end{bmatrix}\begin{bmatrix} x\\ y\\z\end{bmatrix}=\begin{bmatrix} 1\\ 2\\3\end{bmatrix}$
$A\vec x = \vec b$
But in this question, does $\vec b = \begin{bmatrix} 1\\ 2\\3\end{bmatrix}$ mean that it is a linear combination of the columns of $A=\begin{bmatrix}1&a&b\\1&a^2&b^2\\1&a^3&b^3\end{bmatrix}$?
And is it possible for the system to have one solution, no solution, or infinitely many solutions?
If it has exactly one solution, does that mean the columns of matrix $A$ are linearly independent, and there is only one $(x,y,z)$ as the solution?
If it has many solutions, does that mean the columns of matrix $A$ are linearly dependent, so we can express $x$ in terms of $y$ and $z$, for example?
So does this imply that, whether the columns are linearly dependent or linearly independent, as long as a solution exists, $\vec b$ is a linear combination of the columns of matrix $A$? Am I right?
If there is no solution, i.e. the system is inconsistent, then $\vec b$ is not a linear combination of the columns of matrix $A$?
If I am asked about a linear combination, does that mean I need to find $\vec b$? And if I am asked about a solution, do I need to find $\vec x$? Thanks!
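I also tried checking these cases numerically (the values below are my own illustrative choices, assuming numpy):

```python
import numpy as np

bvec = np.array([1.0, 2.0, 3.0])

def make_A(a, b):
    # The coefficient matrix from the question, for given a and b.
    return np.array([[1, a,    b   ],
                     [1, a**2, b**2],
                     [1, a**3, b**3]], dtype=float)

# Case 1: a = 2, b = 3 -> the matrix has full rank, so the system
# has exactly one solution and bvec is a combination of the columns.
A = make_A(2, 3)
x = np.linalg.solve(A, bvec)
print(np.allclose(A @ x, bvec))      # True

# Case 2: a = b = 2 -> two columns coincide, so the columns are
# linearly dependent and the matrix is singular (rank < 3).
A_dep = make_A(2, 2)
print(np.linalg.matrix_rank(A_dep))  # 2
```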
I always prefer introducing linear independence without the use of matrices, personally, particularly because linear independence is also an important notion outside of finite dimensions.
For example, in the infinite-dimensional vector space of functions from $\mathbb{R}$ to $\mathbb{R}$, the set $\lbrace \sin^2, \cos^2, 1\rbrace$ is linearly dependent, since $$\sin^2 + \cos^2 - 1 = 0,$$ where $1$ and $0$ denote the constant functions taking the values $1$ and $0$ respectively.
In the context of column vectors of matrices, it's helpful to remember that matrix multiplication on the right by a column vector simply takes a linear combination of the column vectors. In our example, $$\begin{pmatrix} 1 & a & b \\ 1 & a^2 & b^2 \\ 1 & a^3 & b^3 \end{pmatrix}\begin{pmatrix} x \\ y \\ z \end{pmatrix} = x\begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix} + y\begin{pmatrix} a \\ a^2 \\ a^3 \end{pmatrix} + z\begin{pmatrix} b \\ b^2 \\ b^3 \end{pmatrix}.$$ This means that several theorems about linear independence apply directly to matrix equations $Ax = b$:

- $Ax = b$ has a solution if and only if $b$ is a linear combination of the columns of $A$;
- the columns of $A$ are linearly independent if and only if $Ax = 0$ has only the trivial solution $x = 0$;
- if the columns of $A$ are linearly independent, then a solution of $Ax = b$, when it exists, is unique.
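This column-combination identity can be checked numerically; a minimal sketch, assuming numpy and my own illustrative values for $a$, $b$, $x$, $y$, $z$:

```python
import numpy as np

# Illustrative values (not from the original question).
a, b = 2.0, 3.0
A = np.array([[1, a,    b   ],
              [1, a**2, b**2],
              [1, a**3, b**3]])
x, y, z = 1.0, -2.0, 0.5

# A @ (x, y, z) is exactly x*(column 1) + y*(column 2) + z*(column 3).
lhs = A @ np.array([x, y, z])
rhs = x * A[:, 0] + y * A[:, 1] + z * A[:, 2]
print(np.allclose(lhs, rhs))  # True
```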
I hope some of this helps.
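As for the specific condition asked about: the system has exactly one solution if and only if $\det A \neq 0$, which can be explored symbolically (a sketch assuming sympy):

```python
import sympy as sp

a, b = sp.symbols('a b')
A = sp.Matrix([[1, a,    b   ],
               [1, a**2, b**2],
               [1, a**3, b**3]])

# The square system has exactly one solution iff det(A) != 0.
d = sp.factor(A.det())
print(d)  # equals a*b*(a - 1)*(b - 1)*(b - a)
```

So, in addition to $a \neq b$, the condition is $a, b \notin \{0, 1\}$.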