You are given $n$ integers $x_1,x_2,\dots,x_n$ satisfying $\sum_{i=1}^n x_i=0$. A legal move is to choose an integer $a$ and two indices $i,j$, and to increase $x_i$ by $a$ and decrease $x_j$ by $a$. The goal is to have $x_1=x_2=\dots=x_n=0$.
What is an algorithm which achieves this goal in the fewest number of moves?
There is a simple algorithm which always takes at most $n-1$ moves: at the $i^{th}$ step, decrease $x_i$ by $x_i$ (zeroing it out) and increase $x_{i+1}$ by $x_i$. This is not always optimal: if $n$ is even, half of the $x_i$ equal $1$, and the other half equal $-1$, then $n/2$ moves suffice.
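Here is a short Python sketch of this chain algorithm (the function name and move encoding are my own):

```python
def chain_moves(x):
    """Zero out x with at most n-1 moves.

    Returns a list of moves (i, j, a), each meaning: decrease x[i] by a
    and increase x[j] by a (i.e. person i passes a units to person j).
    """
    x = list(x)  # don't mutate the caller's list
    moves = []
    for i in range(len(x) - 1):
        a = x[i]
        if a != 0:  # skip trivial moves, so we may use fewer than n-1
            moves.append((i, i + 1, a))
            x[i] -= a       # x[i] becomes 0
            x[i + 1] += a   # the surplus is pushed along the chain
    return moves
```

Since $\sum_i x_i = 0$, after the last step $x_n$ is automatically $0$ as well.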
The motivation for this problem is the situation where $n$ friends have gone out to dinner, have each contributed some amount of money toward the bill, and now want to make it so that everyone has paid an equal amount.
Suppose we have a minimum-size set of payments. I claim the resulting (undirected!) payment graph is always a forest. Indeed, if the graph had a cycle, we could adjust all the payments along this cycle by $\pm 1$ (in a consistent orientation around the cycle, which leaves every net balance unchanged) until one of the edges becomes a payment of $0$ and disappears, contradicting minimality. Combined with the chain algorithm above, this gives an optimal way to resolve any $S \subseteq [n]$ with $\sum_{i\in S} x_i = 0$ using $|S|-1$ payments.
Clearly then, an optimal solution will partition $[n]$ into subsets $S_1,\dots,S_k$ such that $\sum_{i\in S_j} x_i = 0$ for all $1\leq j \leq k$ and resolve each subset using $|S_j|-1$ payments. In other words, the optimal solution has value $n - k$, where $[n] = S_1 \cup \dots \cup S_k$ is a partition into the maximum number $k$ of subsets each summing to $0$.
However, note that if disjoint sets $S_1$ and $S_2$ each sum to $0$, then so does $S_1 \cup S_2$. So equivalently we may find a maximum-length chain $\emptyset = S_0 \subsetneq S_1 \subsetneq \dots \subsetneq S_k = [n]$ in which every $S_j$ sums to $0$, and resolve each difference $S_{j+1} \setminus S_j$ individually using $|S_{j+1} \setminus S_j|-1$ payments. We can find such a maximum-length chain in $O(n2^n)$ time by precomputing for every subset $S \subseteq [n]$ the value $s(S) = \sum_{i\in S} x_i$ and then just using the recurrence:
$$b(S) = \begin{cases} 0 & \text{if $S = \emptyset$} \\ 1 + \max_{u\in S} b(S\setminus\{u\}) & \text{if $s(S) = 0$} \\ 0 + \max_{u\in S} b(S\setminus\{u\}) & \text{if $s(S) \neq 0$} \\ \end{cases} $$
Evaluating this recurrence over all $2^n$ subsets again takes $O(n2^n)$ time, and the answer is $n - b([n])$. As was pointed out in the other answer, we are unlikely to find a polynomial-time algorithm.
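For concreteness, here is a Python sketch of this subset DP using bitmasks (the function name is my own):

```python
def min_moves(x):
    """Minimum number of payments: n minus the maximum number of
    zero-sum subsets in a partition of [n], via an O(n * 2^n) subset DP.
    Subsets are encoded as bitmasks over range(n)."""
    n = len(x)

    # s[S] = sum of x[i] over i in S, built up one lowest bit at a time.
    s = [0] * (1 << n)
    for S in range(1, 1 << n):
        i = (S & -S).bit_length() - 1          # index of the lowest set bit
        s[S] = s[S ^ (1 << i)] + x[i]

    # b[S] = max number of zero-sum sets along a chain from the empty set to S.
    b = [0] * (1 << n)
    for S in range(1, 1 << n):
        best = 0
        T = S
        while T:                                # try removing each element u of S
            i = (T & -T).bit_length() - 1
            best = max(best, b[S ^ (1 << i)])
            T &= T - 1
        b[S] = best + (1 if s[S] == 0 else 0)

    return n - b[(1 << n) - 1]
```

For example, on $x = (1,-1,1,-1)$ this finds the two-part partition and returns $2$, whereas the naive chain algorithm would use $3$ moves.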