Standard definition. Let $x_{1},x_{2},...,x_{n}$ be vectors. If $x_{1},x_{2},...,x_{n}$ are linearly independent, then for all $\alpha _{1},\alpha _{2},...,\alpha _{n}$ such that
$\alpha _{1} x_{1}+...+\alpha _{n} x_{n}=0$
we must have $\alpha _{1}=...=\alpha _{n}=0$.
My definition. Let $x_{1},x_{2},...,x_{n}$ be vectors. If $x_{1},x_{2},...,x_{n}$ are linearly independent, then there are $\alpha _{1},\alpha _{2},...,\alpha _{n}$ in $\mathbb{R}$ such that
$\alpha _{1} x_{1}+...+\alpha _{n} x_{n}=0$
implies $\alpha _{1}=...=\alpha _{n}=0$.
Is there a difference between the standard definition and my definition?
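To make the quantifier placement explicit, here is a minimal sketch of both statements in Lean 4 (no outside libraries), specialized to two vectors. The names `standardDef`, `myDef`, `smul`, `addV`, `zeroR`, and `zeroV` are placeholders of mine for the two definitions, scalar multiplication, vector addition, and the zero scalar and zero vector; none of them come from the question itself.

```lean
-- Minimal sketch over an abstract scalar type R and vector type V;
-- all operation names below are placeholders, not a real library API.
variable {R V : Type} (zeroR : R) (zeroV : V)
variable (smul : R → V → V) (addV : V → V → V)

-- Standard definition (two-vector case): FOR ALL α₁ α₂, if the
-- combination vanishes, then both coefficients are zero.
def standardDef (x₁ x₂ : V) : Prop :=
  ∀ α₁ α₂ : R,
    addV (smul α₁ x₁) (smul α₂ x₂) = zeroV → α₁ = zeroR ∧ α₂ = zeroR

-- "My definition" (two-vector case): THERE EXIST α₁ α₂ for which
-- that same implication holds.
def myDef (x₁ x₂ : V) : Prop :=
  ∃ α₁ α₂ : R,
    addV (smul α₁ x₁) (smul α₂ x₂) = zeroV → α₁ = zeroR ∧ α₂ = zeroR
```

The two `Prop`s differ only in the leading quantifier, which is exactly the point at issue.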
Yes, there is a difference, and it lies in the quantifier: the standard definition requires the implication to hold for all choices of $\alpha_1,...,\alpha_n$, while yours only requires it to hold for some choice. By your definition, $x_1 = (1,0)$ and $x_2 = (2,0)$ (and any set of vectors, for that matter) are linearly independent. In particular, we can take $\alpha_1 = \alpha_2 = 0$. For these particular values, the statements $$\alpha_1 x_1 + \alpha_2 x_2 = 0 \quad\text{and}\quad \alpha_1 = \alpha_2 = 0$$ are both true. So, for these particular values of $\alpha_i$, "$\alpha_1 x_1 + \alpha_2 x_2 = 0$" implies "$\alpha_1 = \alpha_2 = 0$", since True $\implies$ True is a true implication.
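The argument is short enough to check mechanically. Continuing the hypothetical sketch above (declarations repeated so the block runs on its own), the witness $\alpha_1 = \alpha_2 = 0$ makes the conclusion true outright, so the implication, and hence the existential, holds for any pair of vectors:

```lean
-- Same placeholder setup as before, repeated for self-containment.
variable {R V : Type} (zeroR : R) (zeroV : V)
variable (smul : R → V → V) (addV : V → V → V)

-- ANY two vectors satisfy "my definition": witness the existential
-- with α₁ = α₂ = zeroR. Both conjuncts of the conclusion
-- `zeroR = zeroR ∧ zeroR = zeroR` then hold by `rfl`, regardless of
-- the hypothesis; a true conclusion makes the implication true.
theorem anyPair_satisfies_myDef (x₁ x₂ : V) :
    ∃ α₁ α₂ : R,
      addV (smul α₁ x₁) (smul α₂ x₂) = zeroV → α₁ = zeroR ∧ α₂ = zeroR :=
  ⟨zeroR, zeroR, fun _ => ⟨rfl, rfl⟩⟩
```

Note that the proof never inspects $x_1$ or $x_2$, which is why the argument applies to any set of vectors whatsoever.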