Show that for two vectors $v,w \in \mathbb{R}^n $, the conditions below are equivalent:
(i) $v \neq 0$, and there exists no $\rho \in \mathbb{R}$ with $w = \rho \cdot v$
(ii) $w \neq 0$, and there exists no $\rho \in \mathbb{R}$ with $v = \rho \cdot w$
(iii) If $\lambda, \mu \in \mathbb{R}$ satisfy $\lambda v + \mu w = 0$, then necessarily $\lambda = \mu = 0$
It's obviously about linear independence. I know what linear independence means: no vector in the set is a linear combination of the others. Equivalently, the only solution of $\alpha \cdot \text{vector1} + \beta \cdot \text{vector2} = 0$ is $\alpha = \beta = 0$. This is exactly condition (iii).
For two vectors, linear independence means that neither vector is a scalar multiple of the other, which is what conditions (i) and (ii) seem to express.
But do I need to prove that formally somehow? It makes intuitive sense, but I'm a bit confused about what they expect.
(i) $\implies$ (ii): First, $w \neq 0$: otherwise $w = 0 \cdot v$, contradicting (i). Now suppose $v = \rho w$ for some $\rho \in \Bbb R$. If $\rho = 0$, then $v = 0$, contradicting (i). If $\rho \neq 0$, then since $\Bbb R$ is a field, $\rho$ has an inverse $\frac{1}{\rho} \in \Bbb R$, and $w = \frac{1}{\rho} v$, again contradicting (i).
(ii) $\implies$ (iii): Similar to what we did before. Suppose $\lambda v + \mu w = 0$. If $\lambda \neq 0$, then $v = -\frac{\mu}{\lambda}w$, contradicting (ii); hence $\lambda = 0$. But then $\mu w = 0$, and since $w \neq 0$ by (ii), we get $\mu = 0$.
I think by now you get the point; if you now prove (iii) $\implies$ (i), you will have shown that the three conditions are equivalent.
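In case you get stuck, here is one possible outline of the remaining step (a sketch only, not the unique way to do it): assuming (iii), each failure of (i) produces a nontrivial vanishing combination,

$$v = 0 \;\Longrightarrow\; 1 \cdot v + 0 \cdot w = 0 \quad \text{with } \lambda = 1 \neq 0,$$
$$w = \rho v \;\Longrightarrow\; \rho \cdot v + (-1) \cdot w = 0 \quad \text{with } \mu = -1 \neq 0,$$

and either line contradicts (iii). Filling in the words around these two displays is exactly the kind of formal writing practice the exercise is after.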
Edit: IMO proofs of this type may seem trivial, but they are a good exercise in writing simple things formally, so that you can write much more complicated things later!