If a linear combination of vectors is zero, and every linear combination formed by raising the coefficients to the same positive integer power is also zero, what properties can we infer about the coefficients?
$a_0 \vec{v_0}+a_1 \vec{v_1}+a_2 \vec{v_2}+ \cdots + a_i \vec{v_i} + \cdots + a_n \vec{v_n}=\vec{0}$
$a_0^2 \vec{v_0}+a_1^2 \vec{v_1}+a_2^2 \vec{v_2}+ \cdots + a_i^2 \vec{v_i} + \cdots + a_n^2 \vec{v_n}=\vec{0}$
...
$a_0^k \vec{v_0}+a_1^k \vec{v_1}+a_2^k \vec{v_2}+ \cdots + a_i^k \vec{v_i} + \cdots + a_n^k \vec{v_n}=\vec{0}$
This holds for every positive integer $k$.
For example, we can say $a_i=a_j$ for all nonzero $a_i,a_j$: otherwise, if one coefficient were bigger than all the others, its term would dominate the sum for very large $k$, so the sum would be nonzero.
My main problem is how to derive such properties systematically. If the $\vec{v_i}$ were scalars, we could have formed a matrix. What can we do in this case?
Also, if anyone could tell me how to write

"if one coefficient were bigger than all the others, its term would dominate the sum for very large $k$, so the sum would be nonzero"

in proper mathematical language, or give another proof that $a_i=a_j$, I would appreciate that too.
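One way the dominance argument might be phrased precisely (a sketch, assuming a single coefficient of strictly largest absolute value): suppose $|a_m| > |a_i|$ for every $i \neq m$. Dividing the $k$-th equation by $a_m^k$ gives

$$\vec{v_m} + \sum_{i \neq m} \left(\frac{a_i}{a_m}\right)^k \vec{v_i} = \vec{0}.$$

Since $|a_i/a_m| < 1$ for each $i \neq m$, every term $(a_i/a_m)^k \vec{v_i}$ tends to $\vec{0}$ as $k \to \infty$, so taking the limit forces $\vec{v_m} = \vec{0}$. Hence, if all the $\vec{v_i}$ are nonzero, no single coefficient can strictly dominate the others in absolute value.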
These equations can be written in matrix form as $VA=0$, where the $j$th column of $V$ is $\vec{v_j}$ and $A$ is the product of a diagonal matrix and a Vandermonde matrix. For example, if $\vec{v_0}=[3,3,3]$, $\vec{v_1}=[4,5,6]$, $\vec{v_2}=[-7,-8,-9]$, it looks like this: $\begin{bmatrix} 3 & 4 & -7\\ 3 & 5 & -8\\ 3 & 6 & -9\\ \end{bmatrix} \begin{bmatrix} a_0 & a_0^2 & a_0^3 & a_0^4 & \cdots\\ a_1 & a_1^2 & a_1^3 & a_1^4 & \cdots\\ a_2 & a_2^2 & a_2^3 & a_2^4 & \cdots\\ \end{bmatrix}=0$
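As a quick numerical sanity check of this factorization (the post does not specify coefficients, so I take the hypothetical choice $a_0=a_1=a_2=1$, which happens to work here because $\vec{v_0}+\vec{v_1}+\vec{v_2}=\vec{0}$):

```python
import numpy as np

# Columns of V are the example vectors v0, v1, v2 from the post.
V = np.array([[3, 4, -7],
              [3, 5, -8],
              [3, 6, -9]])

# Hypothetical coefficients (not given in the post): a = (1, 1, 1)
# is a valid choice here, since v0 + v1 + v2 = 0.
a = np.array([1, 1, 1])

# Build A: column k holds the k-th powers a_i^k, for k = 1..4.
A = np.column_stack([a**k for k in range(1, 5)])

# A factors as diag(a) times the Vandermonde matrix whose
# rows are (1, a_i, a_i^2, a_i^3).
W = np.vander(a, 4, increasing=True)
assert np.array_equal(A, np.diag(a) @ W)

# V A = 0 encodes all the power equations at once.
print(V @ A)  # → 3x4 zero matrix
```

Each column of $VA$ is one of the power equations, so the single matrix product packages the whole infinite family (truncated here at $k=4$).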
After this we can use determinants and related tools to derive properties, somehow (I will edit the answer when I figure it out).
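Here is one way the determinant could enter (a sketch, assuming the nonzero coefficients take the distinct values $c_1, \dots, c_r$): let $\vec{w_t} = \sum_{a_i = c_t} \vec{v_i}$ be the sum of the vectors sharing the value $c_t$. The first $r$ power equations become

$$\sum_{t=1}^{r} c_t^k \, \vec{w_t} = \vec{0}, \qquad k = 1, \dots, r,$$

a square linear system in the $\vec{w_t}$ whose coefficient matrix $(c_t^k)$ factors as a Vandermonde matrix times $\operatorname{diag}(c_1,\dots,c_r)$, with determinant $c_1 \cdots c_r \prod_{s<t}(c_t - c_s) \neq 0$. Inverting it gives $\vec{w_t} = \vec{0}$ for every $t$: the vectors attached to each common coefficient value must sum to zero.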
Thanks to user8675309 for the comment