I have an assertion about a sequence of vectors which I have tested on a computer but which I have been unable to prove. The assertion is that when the vectors defined below are of length $N$, then the $N^{\text{th}}$ vector in the sequence must be zero. Can anyone prove it?
The sequence of vectors is defined as follows. Denote the first vector in the sequence by $\mathbf{v}=\left( v_{1},v_{2},\cdots ,v_{N}\right) $, where the sum of the $v_{i}$ is zero, i.e. \begin{equation} \sum_{i=1}^{N}v_{i}=0 \end{equation} Then define vectors $\mathbf{u}^{\left( 1\right) }\left( \mathbf{v}\right) $, $\mathbf{u}^{\left( 2\right) }\left( \mathbf{v}\right) $, $\ldots$ recursively as follows. If the elements of $\mathbf{u}^{\left( q\right) }\left( \mathbf{v}\right) $ are $u_{i}^{\left( q\right) }$ for $1\leqslant i\leqslant N$, then for $q=1$ define \begin{equation} u_{i}^{\left( 1\right) }=v_{i} \end{equation} and for $q\geqslant 2$ define \begin{equation} u_{i}^{\left( q\right) }=u_{i}^{\left( q-1\right) }v_{i}-\frac{1}{q}\sum\limits_{j=1}^{N}v_{j}u_{j}^{\left( q-1\right) } \end{equation} The assertion is that as long as the sum of the $v_{i}$ is zero, then for vectors $\mathbf{v}$ of length $N$ we have $u_{i}^{\left( N\right) }=0$ for all $i$.
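For anyone who wants to experiment, here is a minimal NumPy sketch of the recursion (the function name `u_sequence` is just my own choice) that checks the claim numerically for a random zero-sum vector:

```python
import numpy as np

def u_sequence(v, q_max):
    """Return the list [u^(1), ..., u^(q_max)] for the recursion above."""
    v = np.asarray(v, dtype=float)
    us = [v.copy()]
    for q in range(2, q_max + 1):
        prev = us[-1]
        # u_i^(q) = u_i^(q-1) v_i - (1/q) sum_j v_j u_j^(q-1)
        us.append(prev * v - np.dot(v, prev) / q)
    return us

rng = np.random.default_rng(0)
N = 6
v = rng.standard_normal(N)
v -= v.mean()                  # enforce sum(v) = 0
u = u_sequence(v, N)
print(np.max(np.abs(u[-1])))   # ~ 0 (rounding error only)
```

Running this for various $N$ and random seeds always gives $\mathbf{u}^{(N)}$ at the level of floating-point rounding, consistent with the assertion.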
The first few $u_{i}^{\left( q\right) }$ are as follows \begin{equation} \begin{array}{l} u_{i}^{(1)}=v_{i} \\ u_{i}^{(2)}=v_{i}^{2}-\frac{1}{2}s_{2} \\ u_{i}^{(3)}=v_{i}^{3}-\frac{1}{2}v_{i}s_{2}-\frac{1}{3}s_{3} \\ u_{i}^{(4)}=v_{i}^{4}-\frac{1}{2}v_{i}^{2}s_{2}-\frac{1}{3} v_{i}s_{3}-\frac{1}{4}\left( s_{4}-\frac{1}{2}s_{2}^{2}\right) \\ u_{i}^{(5)}=v_{i}^{5}-\frac{1}{2}v_{i}^{3}s_{2}-\frac{1}{3} v_{i}^{2}s_{3}-\frac{1}{4}v_{i}\left( s_{4}-\frac{1}{2}s_{2}^{2}\right) - \frac{1}{5}\left( s_{5}-\frac{5}{6}s_{2}s_{3}\right) \\ u_{i}^{(6)}=v_{i}^{6}-\frac{1}{2}v_{i}^{4}s_{2}-\frac{1}{3} v_{i}^{3}s_{3}-\frac{1}{4}v_{i}^{2}\left( s_{4}-\frac{1}{2}s_{2}^{2}\right) - \frac{1}{5}v_{i}\left( s_{5}-\frac{5}{6}s_{2}s_{3}\right) -\frac{1}{6}\left( s_{6}-\frac{1}{3}s_{3}^{2}+\frac{1}{8}s_{2}^{3}-\frac{3}{4}s_{2}s_{4}\right) \end{array} \end{equation} where the $s_{q}$ are the power-sum symmetric polynomials defined by \begin{equation} s_{q}=\sum_{i=1}^{N}v_{i}^{q} \end{equation} The assertion is trivial for $N=1$: the vector has a single element $v_{1}$, and since the sum of the elements is zero, that element must itself be zero.
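As a quick sanity check that the closed forms above agree with the recursion (they are derived assuming $s_1=0$), one can compare the two numerically; here I check the $u_i^{(4)}$ formula for a random zero-sum vector:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 8
v = rng.standard_normal(N)
v -= v.mean()                                  # s_1 = 0
s = {q: np.sum(v**q) for q in (2, 3, 4)}

# closed form for u^(4) as listed above (valid when s_1 = 0)
u4_closed = v**4 - v**2*s[2]/2 - v*s[3]/3 - (s[4] - s[2]**2/2)/4

# the recursion
u = v.copy()
for q in range(2, 5):
    u = u*v - np.dot(v, u)/q

print(np.max(np.abs(u - u4_closed)))           # ~ 0
```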
When $N=2$, then $v_{1}+v_{2}=0$, so $v_{1}=-v_{2}$ and $s_{2}=2v_{1}^{2}=2v_{2}^{2}$, which again gives $u_{1}^{\left( 2\right) }=u_{2}^{\left( 2\right) }=0$.
When $N=3$, then $v_{1}+v_{2}+v_{3}=0$ so \begin{eqnarray*} s_{3} &=&v_{1}^{3}+v_{2}^{3}+v_{3}^{3} \\ &=&v_{1}^{3}+v_{2}^{3}-\left( v_{1}^{3}+3v_{1}^{2}v_{2}+3v_{1}v_{2}^{2}+v_{2}^{3}\right) \\ &=&-3v_{1}v_{2}\left( v_{1}+v_{2}\right) =3v_{1}v_{2}v_{3} \end{eqnarray*} Then without loss of generality, consider $u_{i}^{\left( 3\right) }$ for $i=1$, so (using $v_{1}^{2}=\left( v_{2}+v_{3}\right) ^{2}$ in the third line) \begin{eqnarray} u_{1}^{\left( 3\right) } &=&v_{1}^{3}-\frac{1}{2}v_{1}s_{2}-\frac{1}{3}s_{3} \\ &=&v_{1}^{3}-\frac{1}{2}v_{1}\left( v_{1}^{2}+v_{2}^{2}+v_{3}^{2}\right) -v_{1}v_{2}v_{3} \notag \\ &=&\frac{1}{2}v_{1}\left( \left( v_{2}+v_{3}\right) ^{2}-v_{2}^{2}-v_{3}^{2}\right) -v_{1}v_{2}v_{3} \notag \\ &=&v_{1}v_{2}v_{3}-v_{1}v_{2}v_{3}=0 \end{eqnarray}
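The same $N=3$ computation can be confirmed symbolically, e.g. with SymPy, by eliminating $v_3$ via the zero-sum constraint:

```python
import sympy as sp

v1, v2 = sp.symbols('v1 v2')
v3 = -v1 - v2                                  # enforces v1 + v2 + v3 = 0
s2 = v1**2 + v2**2 + v3**2
s3 = v1**3 + v2**3 + v3**3
u1_3 = v1**3 - sp.Rational(1, 2)*v1*s2 - sp.Rational(1, 3)*s3
print(sp.expand(u1_3))                         # 0
```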
Perhaps there's a recursive proof, but if so, I haven't been able to find it.
Note about symmetry: A comment mentioned symmetric polynomials. Although each individual $u_{i}^{(q)}$ isn't symmetric in all the $\{v_{j}\}$, because of the dependence on $v_{i}$, the vectors $\mathbf{u}^{(q)}$ are equivariant under reordering of the $v_{j}$, i.e. they satisfy $$\mathbf{M}^{ij}\mathbf{u}^{(q)}\left( \mathbf{v}\right) =\mathbf{u}^{(q)}\left( \mathbf{M}^{ij}\mathbf{v}\right)$$ where $\mathbf{M}^{ij}$ is the $N\times N$ matrix that swaps the $i^{\text{th}}$ and $j^{\text{th}}$ elements of a vector of length $N$, leaving all other elements unchanged.
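This equivariance is also easy to check numerically; here is a small sketch (the helper `u_q` is my own name) comparing both sides of the displayed identity for one particular swap $\mathbf{M}^{14}$ at $q=4$:

```python
import numpy as np

def u_q(v, q):
    """u^(q)(v) from the recursion in the question."""
    u = np.array(v, dtype=float)
    for n in range(2, q + 1):
        u = u * v - np.dot(v, u) / n
    return u

rng = np.random.default_rng(3)
v = rng.standard_normal(5)
v -= v.mean()

swapped = v.copy()
swapped[[0, 3]] = swapped[[3, 0]]   # M^{14} v
lhs = u_q(v, 4)
lhs[[0, 3]] = lhs[[3, 0]]           # M^{14} u^(4)(v)
rhs = u_q(swapped, 4)               # u^(4)(M^{14} v)
print(np.max(np.abs(lhs - rhs)))    # ~ 0
```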
This does fall out using Newton's identities, much as I suspected in my first comment.
Newton's identity is pretty, if a little hard to take in: $$ e_n = {1\over n!} \sum_{(a)\in S_n} \operatorname{sgn}(a)\, |(a)| \prod_{i=1}^n p_i^{a_i} $$ where the sum is over the conjugacy classes of $S_n$, equivalently the partitions of $n$. The class $(a)$ is composed of permutations with $a_i$ $i$-cycles for each $i$, $|(a)|$ is the number of permutations in the class, and $\operatorname{sgn}(a)=(-1)^{n-\sum_i a_i}$ is their common sign. (Without the sign factor the same class sum gives the complete homogeneous polynomial $h_n$ instead.) $e_n$ is the elementary symmetric polynomial of degree $n$, and similarly $p_n$ is the power-sum polynomial.
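For concreteness, at $n=3$ the class sum (with the signs of the permutations in each class) gives $e_3 = \frac{1}{6}\left(p_1^3 - 3p_1p_2 + 2p_3\right)$, which SymPy confirms against the direct definition in, say, four variables:

```python
import itertools
import sympy as sp

v = sp.symbols('v1:5')             # four variables; any number works
p = lambda k: sum(x**k for x in v)

# conjugacy classes of S_3: identity (sign +, 1 element, p1^3),
# transpositions (sign -, 3 elements, p1*p2), 3-cycles (sign +, 2 elements, p3)
e3_classes = sp.Rational(1, 6)*(p(1)**3 - 3*p(1)*p(2) + 2*p(3))
e3_direct = sum(a*b*c for a, b, c in itertools.combinations(v, 3))
print(sp.expand(e3_classes - e3_direct))   # 0
```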
We observe that in the expressions for $u_i^{(n)}$ the coefficient of $v_i^r$ is exactly $(-1)^{n-r}e_{n-r}$, as given by Newton's identity in degree $n-r$, by virtue of the initial condition that $p_1 = 0$, which kills every class containing a $1$-cycle. [This needs a little work to prove, but the pattern is very clear!]
So substituting into the expressions for $u_i^{(n)}$, $$ u_i^{(n)} = \sum_{r=0}^n (-1)^{n-r}e_{n-r} v_i^r $$ If we view this as a polynomial in $v_i$, what are its roots? This is pretty trivial at $n=N$, as it is just the standard expansion of the monic polynomial whose roots are $v_1,\dots,v_N$. Therefore $$ u_i^{(N)} = \prod_{j=1}^N (v_i - v_j) = 0 $$ since the factor with $j=i$ is $v_i - v_i = 0$. And we are done :-)
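The whole argument can be checked numerically: compare the recursion against $\sum_{r=0}^n (-1)^{n-r}e_{n-r}v_i^r$ for every $n$ up to $N$ on a random zero-sum vector (the $e_k$ are read off from `np.poly`, whose coefficients for $\prod_j(x-v_j)$ are, highest degree first, $(-1)^k e_k$):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 7
v = rng.standard_normal(N)
v -= v.mean()                         # p_1 = 0

# elementary symmetric polynomials e_k of the v_j
coeffs = np.poly(v)                   # coeffs[k] = (-1)^k e_k
e = [(-1)**k * coeffs[k] for k in range(N + 1)]

u = v.copy()
errs = []
for n in range(2, N + 1):
    u = u * v - np.dot(v, u) / n      # the recursion
    closed = sum((-1)**(n - r) * e[n - r] * v**r for r in range(n + 1))
    errs.append(np.max(np.abs(u - closed)))

print(max(errs), np.max(np.abs(u)))   # both ~ 0
```

The first printed number supports the claimed closed form at every stage, and the second confirms $u^{(N)}$ vanishes.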