Re-checking my linear algebra, using Lax textbook Linear algebra and its applications.
I want to know if my attempt at proving the following theorem is okay (it has been a while since I attended this lecture).
Show that if the vectors $x_1, x_2,\dots,x_j$ are linearly independent, then none of them is the zero vector.
As far as I understand, this is a proof by contradiction: I have to exhibit a nontrivial linear combination that fulfills
$$ k_1x_1+k_2x_2+\dots+k_jx_j=0, $$
where the coefficients $k_i\in K$ are elements of the field $K$.
Without loss of generality, suppose that $x_1$ is the zero vector. Then
$$ k_1x_1+\dots+k_jx_j=0, $$
where all the $k_i$ must be zero. We can add the term $\tilde{k}x_1$ (where $\tilde{k}\in K$) to both sides, and by Eq. (10), $0x_1=0$. Therefore
$$ (\tilde{k}+k_1)x_1+\dots+k_jx_j=\tilde{k}x_1+\dots+k_jx_j=0, $$
and I have shown that there is a non-trivial linear combination equal to the zero vector, which is a contradiction.
Unfortunately, as it stands, this is not a good proof.
You never introduce your assumption. You are assuming that the set is linearly independent, and that is why you can conclude something about the $k_i$.
You don't say what $\tilde{k}$ is; you need $\tilde{k}$ to be non-zero, or else you've concluded nothing.
The proof is pretty circuitous and comes to the correct conclusion in a very roundabout way.
The key idea is just your last line: exhibit a linear combination, equal to zero, that has a non-zero coefficient. Taking $k_1 = 1$ and $k_i = 0$ for $i > 1$ is good enough. And this gives a proof by contrapositive, which is quite a bit cleaner than the contradiction proof you were going for.
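For concreteness, here is one way the contrapositive argument could be written out (a sketch, using the question's notation, with the zero vector placed at an arbitrary position $m$ rather than position $1$):

```latex
\begin{proof}[Sketch (contrapositive)]
Suppose some $x_m$ ($1 \le m \le j$) is the zero vector. Choose
$k_m = 1$ and $k_i = 0$ for all $i \neq m$. Then
\[
  k_1 x_1 + \dots + k_j x_j = 1 \cdot x_m = 1 \cdot 0 = 0,
\]
a vanishing linear combination whose coefficients are not all zero.
Hence $x_1, \dots, x_j$ are linearly dependent. By contraposition,
if $x_1, \dots, x_j$ are linearly independent, none of them is the
zero vector.
\end{proof}
```

Note that no assumption of independence is needed here; the dependence relation is exhibited directly from the hypothesis that some $x_m = 0$.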