Here is a statement that seems prima facie obvious, but when I try to prove it, I am lost.
Let $x_1 , x_2, \dots, x_k$ be complex numbers satisfying:
$$x_1 + x_2+ \dots + x_k = 0$$
$$x_1^2 + x_2^2+ \dots + x_k^2 = 0$$
$$x_1^3 + x_2^3+ \dots + x_k^3 = 0$$
$$\vdots$$
Then $x_1 = x_2 = \dots = x_k = 0$.
The statement seems obvious because we have more than $k$ constraints (constraints that are, in some sense, "independent") on $k$ variables, so they should determine the variables uniquely. But my attempts so far at formalizing this intuition have failed. So, how do you prove this statement? Is there a generalization of my intuition?
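One way to see why the first $k$ equations already suffice is Newton's identities, which recover the elementary symmetric polynomials $e_1, \dots, e_k$ from the power sums $p_1, \dots, p_k$: if all $p_i$ vanish, so do all $e_i$, and then the $x_i$ are the roots of $t^k = 0$. A small numerical sketch (the helper function below is illustrative, not from the thread):

```python
def elementary_from_power_sums(p):
    """Newton's identities (characteristic 0):
    e_i = (1/i) * sum_{j=1}^{i} (-1)^(j-1) * e_{i-j} * p_j,
    recovering e_1..e_k from the power sums p_1..p_k."""
    k = len(p)
    e = [1.0]  # e_0 = 1
    for i in range(1, k + 1):
        s = sum((-1) ** (j - 1) * e[i - j] * p[j - 1] for j in range(1, i + 1))
        e.append(s / i)
    return e[1:]

# Sanity check on known roots {1, 2}: p_1 = 3, p_2 = 5 gives e_1 = 3, e_2 = 2,
# i.e. the monic polynomial t^2 - 3t + 2 = (t - 1)(t - 2).
assert elementary_from_power_sums([3.0, 5.0]) == [3.0, 2.0]

# If the first k power sums all vanish, every e_i vanishes,
# so the x_i are the roots of t^k = 0, i.e. all zero.
assert elementary_from_power_sums([0.0] * 4) == [0.0] * 4
```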
Here is a method slightly different from Potato's second answer (though the idea is essentially the same):
Without loss of generality, the system of equations can be written as: $$\left\{ \begin{array}{lcl} \lambda_1x_1 + \lambda_2x_2+ \dots + \lambda_k x_k &= &0 \\ \lambda_1x_1^2 + \lambda_2x_2^2+ \dots + \lambda_k x_k^2 & = & 0 \\ & \vdots & \\ \lambda_1x_1^k + \lambda_2x_2^k+ \dots + \lambda_k x_k^k & = & 0 \end{array} \right.$$
where the $\lambda_i$ are positive integers (multiplicities), $x_i \neq 0$, and $x_i \neq x_j$ for $i \neq j$. Indeed, if $x_i = x_j$, merge the two terms into a single term with coefficient $\lambda_i + \lambda_j$, and if $x_i = 0$, simply drop the term. Suppose, for contradiction, that not all the original variables are zero; then after this reduction we have $k \geq 1$.
Now, all $k$ vectors of the family $\{ (\lambda_1 x_1^j, \dots , \lambda_k x_k^j) \mid 1 \leq j \leq k \}$ lie in the hyperplane $\{(y_1,\dots, y_k) \mid y_1+ \dots+ y_k=0 \}$, which has dimension $k-1$, so the family is linearly dependent. Therefore, the matrix
$$A:=\left( \begin{matrix} \lambda_1x_1 & \lambda_2x_2 & \dots & \lambda_k x_k \\ \lambda_1x_1^2 & \lambda_2x_2^2 & \dots & \lambda_k x_k^2 \\ \vdots & \vdots & & \vdots \\ \lambda_1x_1^k & \lambda_2x_2^k & \dots & \lambda_k x_k^k \end{matrix} \right)$$
is not invertible. Factoring $\lambda_j x_j$ out of the $j$-th column leaves a Vandermonde matrix, so
$$0= \det(A)= \prod\limits_{i=1}^k \lambda_i \cdot \prod\limits_{i=1}^k x_i \cdot \prod\limits_{i<j} (x_j-x_i),$$
which is nonzero since every $\lambda_i > 0$, every $x_i \neq 0$, and the $x_i$ are pairwise distinct. This is a contradiction, so $k = 0$ and every original variable is zero.
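As a sanity check (not part of the proof), the determinant factorization can be verified numerically; the sample values below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
k = 5
# Arbitrary distinct nonzero complex x_i and positive weights lambda_i
x = rng.standard_normal(k) + 1j * rng.standard_normal(k)
lam = rng.uniform(0.5, 2.0, size=k)

# A[i-1, j-1] = lam_j * x_j**i  for i = 1..k  (broadcasting builds the matrix)
A = lam * x ** np.arange(1, k + 1)[:, None]

det_A = np.linalg.det(A)

# Product formula: prod(lam_i) * prod(x_i) * prod_{i<j} (x_j - x_i)
vdm = np.prod([x[j] - x[i] for i in range(k) for j in range(i + 1, k)])
formula = np.prod(lam) * np.prod(x) * vdm

assert np.allclose(det_A, formula)
```

Since the factors $\lambda_j x_j$ are pulled out column by column, what remains is exactly the standard Vandermonde matrix in the $x_j$, whose determinant is $\prod_{i<j}(x_j - x_i)$.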