Basis and solutions of a system of equations


Let $V$ be an $n$-dimensional vector space over the field $F$, with fixed basis $\{\alpha_1, \ldots, \alpha_n\}$. A system of linear equations $$a_{11}x_1+a_{12}x_2+ \cdots +a_{1n}x_n=0$$ $$a_{21}x_1+a_{22}x_2+ \cdots +a_{2n}x_n=0$$ $$\vdots$$ $$a_{k1}x_1+a_{k2}x_2+ \cdots +a_{kn}x_n=0$$

is independent if and only if the collection of vectors $$v_1=\sum a_{1j}\alpha_j, \quad v_2=\sum a_{2j}\alpha_j, \quad \ldots, \quad v_k=\sum a_{kj}\alpha_j$$

in $V$ is linearly independent.

So, if I am not wrong, the elements of the basis are solutions to the linear system, since each $v_i$ is a solution and is a linear combination of the coefficients $a_{ij}$ and the basis vectors $\alpha_j$.

Can anyone explain how the fixed basis becomes a solution of the linear system?

I am new to this, so please teach me. Thanks.

I got this from a lecture.

There are 2 answers below.

BEST ANSWER

A system of $n$ linear equations in $n$ unknowns is independent exactly when it has a unique solution. Treating each row of coefficients $a_i$ as a vector, the system has exactly one solution precisely when these vectors are linearly independent.
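This claim can be checked numerically. Below is a minimal sketch with numpy, using a made-up $3\times 3$ coefficient matrix whose rows happen to be independent; the homogeneous system then has only the trivial solution.

```python
import numpy as np

# Hypothetical 3x3 coefficient matrix whose rows a_1, a_2, a_3
# are linearly independent (its determinant is nonzero).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

# The rows are independent iff the rank equals the number of rows.
assert np.linalg.matrix_rank(A) == 3

# For the homogeneous system A x = 0, independence of the rows
# forces the unique solution x = 0.
x = np.linalg.solve(A, np.zeros(3))
```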

Working this out in three dimensions, we have

$a_1 = (a_{11},a_{12},a_{13}),a_2 = (a_{21},a_{22},a_{23}), a_3 = (a_{31},a_{32},a_{33})$

Here the basis is $\{\alpha_1,\alpha_2,\alpha_3\}$, and we write each of these vectors as

$\alpha_1 = (\alpha_{11},\alpha_{12},\alpha_{13}),\alpha_2 = (\alpha_{21},\alpha_{22},\alpha_{23}), \alpha_3 = (\alpha_{31},\alpha_{32},\alpha_{33})$

(All of the vectors above are written out in their Cartesian components.)

$v_1=\sum a_{1j}\alpha_j, v_2=\sum a_{2j}\alpha_j, v_3=\sum a_{3j}\alpha_j$, so

$v_1=((a_{11}\alpha_{11}+a_{12}\alpha_{21}+a_{13}\alpha_{31}), (a_{11}\alpha_{12}+a_{12}\alpha_{22}+a_{13}\alpha_{32}),(a_{11}\alpha_{13}+a_{12}\alpha_{23}+a_{13}\alpha_{33}))$

$v_2=((a_{21}\alpha_{11}+a_{22}\alpha_{21}+a_{23}\alpha_{31}), (a_{21}\alpha_{12}+a_{22}\alpha_{22}+a_{23}\alpha_{32}),(a_{21}\alpha_{13}+a_{22}\alpha_{23}+a_{23}\alpha_{33}))$

$v_3=((a_{31}\alpha_{11}+a_{32}\alpha_{21}+a_{33}\alpha_{31}), (a_{31}\alpha_{12}+a_{32}\alpha_{22}+a_{33}\alpha_{32}),(a_{31}\alpha_{13}+a_{32}\alpha_{23}+a_{33}\alpha_{33}))$

Now $v_1, v_2, v_3$ are independent when $k_1\overrightarrow{v_1}+k_2\overrightarrow{ v_2}+k_3\overrightarrow{v_3} = 0$ $\textbf{only when}$ $k_1=0,k_2=0,k_3=0$. Using the expressions above, after simplification we get

$k_1\overrightarrow{ v_1}+k_2\overrightarrow{ v_2}+k_3\overrightarrow{v_3} = \\(k_1a_{11}+k_2a_{21}+k_3a_{31})\overrightarrow{\alpha_1}+(k_1a_{12}+k_2a_{22}+k_3a_{32})\overrightarrow{\alpha_2}+(k_1a_{13}+k_2a_{23}+k_3a_{33})\overrightarrow{\alpha_3}$

Since $\alpha_1,\alpha_2,\alpha_3$ are basis vectors, and hence linearly independent, this combination can be zero $\textbf{only when}$ each coefficient vanishes:

$k_1a_{11}+k_2a_{21}+k_3a_{31} = 0$

$k_1a_{12}+k_2a_{22}+k_3a_{32} = 0$

$k_1a_{13}+k_2a_{23}+k_3a_{33} = 0$

By our independence assumption, this set of equations holds only when $k_1=0,k_2=0,k_3=0$, and not otherwise.

These three equations can be written compactly as $k_1\overrightarrow{ a_1}+k_2\overrightarrow{a_2}+k_3\overrightarrow{a_3}=0$.

This shows that $a_1,a_2,a_3$ are independent vectors. So, as mentioned at the beginning, when $a_1,a_2,a_3$ are independent we have one unique solution (and you can actually see the solution), so the system of equations is independent.
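The chain of equivalences above can be sketched numerically: with an invertible matrix $B$ whose rows play the role of the basis $\alpha_1,\alpha_2,\alpha_3$, the vectors $v_i=\sum_j a_{ij}\alpha_j$ are the rows of $AB$, and they are independent exactly when the rows of $A$ are. Both matrices here are made up for illustration.

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])   # coefficient rows a_1, a_2, a_3
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])   # rows alpha_1..alpha_3, an (assumed) basis

V = A @ B  # row i of V is v_i = sum_j a_ij alpha_j

# Since B is invertible, multiplying by it preserves rank, so the
# v_i are independent exactly when the a_i are.
assert np.linalg.matrix_rank(V) == np.linalg.matrix_rank(A) == 3
```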


Your understanding is incorrect, but that's not too surprising. That definition of independence of systems of linear equations seems needlessly confusing, IMHO. There's no need for an abstract $n$-dimensional vector space $V$, nor is there a need for an abstract basis $(\alpha_1, \ldots, \alpha_n)$.

I think a better (and equivalent) definition would be to say: the system $$a_{11}x_1+a_{12}x_2+a_{13}x_3+ \cdots +a_{1n}x_n=0$$ $$a_{21}x_1+a_{22}x_2+a_{23}x_3+ \cdots +a_{2n}x_n=0$$ $$\vdots$$ $$a_{k1}x_1+a_{k2}x_2+a_{k3}x_3+ \cdots +a_{kn}x_n=0$$ is independent if the row vectors of the coefficient matrix $$\begin{pmatrix} a_{11} & a_{12} & a_{13} & \cdots & a_{1n} \\ a_{21} & a_{22} & a_{23} & \cdots & a_{2n} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ a_{k1} & a_{k2} & a_{k3} & \cdots & a_{kn} \end{pmatrix}$$ are linearly independent in $F^n$.
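Under this definition, checking independence reduces to a rank computation on the coefficient matrix. A minimal sketch with numpy, using made-up coefficients for a $k=2$, $n=3$ system:

```python
import numpy as np

# Illustrative k=2, n=3 system: the definition only asks whether the
# k row vectors of the coefficient matrix are independent in F^n,
# i.e. whether the matrix has full row rank k.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 4.0]])
rank_A = np.linalg.matrix_rank(A)   # equals k = 2, so the system is independent

# Appending a row that is the sum of the first two makes the rows
# dependent: k becomes 3 but the rank stays at 2.
A_dep = np.vstack([A, A[0] + A[1]])
rank_dep = np.linalg.matrix_rank(A_dep)
```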

Why are they equivalent? Fix the standard basis $(e_1, \ldots, e_n)$ in $F^n$. We can uniquely define a linear transformation $T : V \to F^n$ by making $T\alpha_i = e_i$ for all $i$. Because this linear transformation maps a basis to a basis, it is a bijective linear transformation (as a matter of fact, it is easily seen to be the coordinate vector map for the basis of $\alpha$s). As such, $T$ preserves the property of linear independence.

Tracking the logic, the system of equations is independent if and only if the vectors $$v_l = a_{l1}\alpha_1 + a_{l2}\alpha_2 + \ldots + a_{ln} \alpha_n, \quad l=1, \ldots, k$$ are linearly independent. Since $T$ is linear and preserves linear independence, this set is linearly independent if and only if the vectors $$T(v_l) = a_{l1}T(\alpha_1) + a_{l2}T(\alpha_2) + \ldots + a_{ln} T(\alpha_n), \quad l=1, \ldots, k$$ are linearly independent. But $T(\alpha_j) = e_j$, so \begin{align*} T(v_l) &= a_{l1}(1, 0, \ldots, 0) + a_{l2}(0, 1, \ldots, 0) + \ldots + a_{ln} (0, 0, \ldots, 1) \\ &= (a_{l1}, a_{l2}, \ldots, a_{ln}), \end{align*} which is the $l$th row of the coefficient matrix. Hence, the system is independent if and only if the row vectors of the coefficient matrix are linearly independent in $F^n$.
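The step "$T(v_l)$ is the $l$th row of the coefficient matrix" can be checked concretely. A sketch with numpy: identify $V$ with $F^3$ via an (assumed, made-up) invertible basis matrix $B$ whose rows stand in for $\alpha_1,\alpha_2,\alpha_3$; then $T$ is the coordinate map $w \mapsto wB^{-1}$, and applying it to the rows of $AB$ recovers $A$.

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 4.0]])          # k=2 coefficient rows
B = np.array([[2.0, 0.0, 1.0],
              [1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])          # rows alpha_1..alpha_3, an invertible basis

V = A @ B                       # row l is v_l written in the standard basis
T_of_V = V @ np.linalg.inv(B)   # coordinate map: T(alpha_j) = e_j

# T(v_l) is exactly the l-th row of the coefficient matrix A.
assert np.allclose(T_of_V, A)
```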

Note how much simpler this definition is! There's really no need for an abstract $n$-dimensional vector space $V$ over $F$, or an abstract basis. Any set of $n$ linearly independent vectors will do, from any space!

So, to address your actual question: the basis vectors $\alpha_i$ will not, in general, be solutions to the system, simply because they may be quite abstract vectors over $F$, whereas a solution is a tuple $(x_1, \ldots, x_n) \in F^n$, not an element of $V$. The actual elements of $V$ (including the $\alpha$s) matter very little to this definition.