Given the linearly independent set $\{a_1,a_2,\dots,a_k\}$, prove that $\{b,a_1,a_2,\dots,a_k\}$ is also linearly independent.


Given the linearly independent set $\{a_1,a_2,\dots,a_k\}$, $k\in \Bbb N$, in a vector space $V$, and the vector $b\in V\setminus[\{a_1,a_2,\dots,a_k\}]$ (where $[S]$ denotes the span of $S$), prove that the set $\{b,a_1,a_2,\dots,a_k\}$ is also linearly independent.

I don't even know where to start. Should I assume the opposite, that $\{b,a_1,a_2,\dots,a_k\}$ is linearly dependent, and then derive a contradiction?

If $\{b,a_1,a_2,\dots,a_k\}$ is linearly dependent, that means there are scalars, not all $0$, with

$$\alpha_0 b+\alpha_1 a_1+\cdots+\alpha_k a_k=0.$$

Is this the proper way to start? If so, where do I go from here?


Best answer:

$$\alpha_0b+\alpha_1a_1+\cdots+\alpha_ka_k=0 \\ \alpha_1a_1+\cdots+\alpha_ka_k=-\alpha_0 b$$ If $\alpha_0\neq 0$, we could divide by $-\alpha_0$ and write $b$ as a linear combination of the $a_i$. But $b$ is not in the span of the $a_i$, so we must have $\alpha_0=0$. The remaining $\alpha_i$ must then all be $0$ as well, because the $a_i$ are linearly independent.
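As a sanity check, here is the same argument run in a concrete case of my own choosing (not part of the original answer): $V=\Bbb R^3$, $a_1=e_1$, $a_2=e_2$, $b=e_3$, so that $b$ lies outside the span of the $a_i$.

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Concrete instance: a_1 = (1,0,0), a_2 = (0,1,0), b = (0,0,1);
% b lies outside span{a_1, a_2}, which is the xy-plane.
Suppose $\alpha_0 b + \alpha_1 a_1 + \alpha_2 a_2 = 0$. Coordinatewise,
\[
\alpha_0(0,0,1) + \alpha_1(1,0,0) + \alpha_2(0,1,0)
  = (\alpha_1, \alpha_2, \alpha_0) = (0,0,0),
\]
% Comparing coordinates forces every coefficient to vanish,
% exactly as the general proof predicts.
so $\alpha_0 = \alpha_1 = \alpha_2 = 0$ and $\{b, a_1, a_2\}$ is
linearly independent.
\end{document}
```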

Another answer:

Suppose that the set is not linearly independent.

Then there exists $(\lambda,\lambda_1,\dots,\lambda_k)\neq(0,0,\dots,0)$ with: $$\lambda b+\lambda_1 a_1+\cdots+\lambda_k a_k=0$$

The assumption $\lambda=0$ reduces the relation to $\lambda_1 a_1+\cdots+\lambda_k a_k=0$ with not all $\lambda_i$ zero, i.e. to the conclusion that $a_1,\dots, a_k$ are not linearly independent, so it must be rejected.

But the assumption $\lambda\neq0$ allows us to solve for $b$: $$b=-\frac{\lambda_1}{\lambda}a_1-\cdots-\frac{\lambda_k}{\lambda}a_k\in[\{a_1,\dots,a_k\}],$$ which contradicts the choice of $b$, so it must also be rejected.

Either way we have a contradiction, so $\{b,a_1,\dots,a_k\}$ is linearly independent.
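For reference, here is the whole argument consolidated into a compilable LaTeX fragment (a sketch in my own wording, following the two answers above):

```latex
\documentclass{article}
\usepackage{amsmath,amssymb,amsthm}
\newtheorem{theorem}{Theorem}
\begin{document}
\begin{theorem}
Let $\{a_1,\dots,a_k\}$ be linearly independent in a vector space $V$,
and let $b\in V\setminus[\{a_1,\dots,a_k\}]$, where $[S]$ denotes the
span of $S$. Then $\{b,a_1,\dots,a_k\}$ is linearly independent.
\end{theorem}
\begin{proof}
Suppose $\lambda b+\lambda_1 a_1+\cdots+\lambda_k a_k=0$.
If $\lambda\neq 0$, then
$b=-\tfrac{\lambda_1}{\lambda}a_1-\cdots-\tfrac{\lambda_k}{\lambda}a_k
\in[\{a_1,\dots,a_k\}]$, a contradiction; hence $\lambda=0$.
The relation then reduces to $\lambda_1 a_1+\cdots+\lambda_k a_k=0$,
which forces $\lambda_1=\cdots=\lambda_k=0$ by the linear
independence of the $a_i$. So only the trivial relation exists.
\end{proof}
\end{document}
```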