I've encountered a problem with two approaches, leading to contradictory results.
Approach 1:
Using the definition of linear independence, consider the equation $c_1 \begin{bmatrix} 1 \\ 0 \end{bmatrix} + c_2 \begin{bmatrix} i \\ 0 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}$, which reduces to $c_1 + c_2 i = 0$. Since the scalars $c_1, c_2$ are real, comparing real and imaginary parts gives $c_1 = 0$ and $c_2 = 0$, indicating that these vectors are linearly independent.
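A quick SymPy sketch of this approach (assuming, as above, that the scalars are drawn from $\mathbb{R}$): a complex number vanishes iff its real and imaginary parts both vanish, so we solve for those two parts.

```python
from sympy import I, im, re, solve, symbols

# Scalars are REAL: the space is C^2 viewed as a vector space over R
c1, c2 = symbols("c1 c2", real=True)

# c1*(1, 0) + c2*(i, 0) = (0, 0) reduces to c1 + c2*i = 0
expr = c1 + c2 * I

# A complex number is zero iff both its real and imaginary parts are zero
sol = solve([re(expr), im(expr)], [c1, c2])
print(sol)  # {c1: 0, c2: 0} -- only the trivial solution
```

With real symbols, `re(expr)` is just $c_1$ and `im(expr)` is $c_2$, so the only solution is the trivial one, matching the hand computation.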
Approach 2:
Consider the system $\begin{bmatrix} 1 & i \\ 0 & 0 \end{bmatrix} \begin{bmatrix} c_1 \\ c_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}$, i.e. $A\mathbf{c} = \mathbf{0}$ with $A = \begin{bmatrix} 1 & i \\ 0 & 0 \end{bmatrix}$. Since the second row is zero, $\text{rank}(A) = 1 < 2$, so by rank-nullity the system has infinitely many solutions for $c_1, c_2$ besides the trivial one. Therefore, the vectors are linearly dependent.
Which of these approaches is correct?
The vectors are linearly independent, and your first approach is correct.
The second approach fails because the rank argument counts solutions over $\mathbb{C}$: rank-nullity does guarantee infinitely many complex solutions, but none of them is real except the trivial one $(c_1, c_2) = (0, 0)$. Since the scalars here come from $\mathbb{R}$, those complex solutions don't witness linear dependence.
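You can see this concretely with SymPy (a sketch using the matrix $A$ from the second approach): the null space computed over $\mathbb{C}$ is nontrivial, but its basis vector is genuinely non-real.

```python
from sympy import I, Matrix, zeros

# Coefficient matrix from the second approach
A = Matrix([[1, I], [0, 0]])
print(A.rank())  # 1, so rank-nullity gives a nontrivial null space over C

# Basis of the null space, computed over the COMPLEX numbers
v = A.nullspace()[0]
print(v)      # Matrix([[-I], [1]]) -- a nonzero, non-real solution
print(A * v)  # Matrix([[0], [0]]) -- it does solve the system

# Every nontrivial solution is a complex multiple of v, hence never real;
# restricted to real scalars, only the trivial solution remains.
```

The vector $(-i, 1)$ solves the system over $\mathbb{C}$, which is exactly the dependence the rank argument detects, but it is not an admissible pair of scalars when the field is $\mathbb{R}$.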