With an option to reduce any pair satisfying a relation, repeatedly reduce a collection.


The game:

Consider the following single player game:

  1. The battlefield is a given (finite) collection $A$ of natural numbers. Example: $\langle 1, 2, 3 \rangle$

  2. For every pair $x, y \in A$, whenever $x > y$, $x$ may be replaced with $x - y$, which I call reduction. A reduction constitutes a turn in the game.

  3. The game ends when no more turns are possible.

At any point in the game there is usually a selection of turns to choose from, so, unless the setup is trivial, there are many different strategies for playing it.

 

example 1   $ \langle 1,2,3\rangle \to \langle 1,1,2 \rangle \to \langle 1,1,1 \rangle $

example 2   $ \langle 1,2,3\rangle \to \langle 1,2,2 \rangle \to \langle 1,2,1 \rangle \to \langle 1,1,1 \rangle $
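The rules above can be sketched as a small simulator (a hypothetical Python sketch; the names `play` and `first_available` are my own):

```python
def play(battlefield, pick):
    """Play a game to completion.  `battlefield` is a list of positive
    integers; `pick(a)` returns an index pair (i, j) with a[i] > a[j].
    Returns the sequence of positions visited."""
    a = list(battlefield)
    history = [tuple(a)]
    while len(set(a)) > 1:          # unequal numbers => a turn exists
        i, j = pick(a)
        a[i] -= a[j]                # the reduction: replace x with x - y
        history.append(tuple(a))
    return history

def first_available(a):
    """One simple strategy: take the first pair (i, j) with a[i] > a[j]."""
    for i, x in enumerate(a):
        for j, y in enumerate(a):
            if x > y:
                return i, j

print(play([1, 2, 3], first_available)[-1])  # -> (1, 1, 1)
```

Passing a different `pick` function corresponds to playing a different strategy.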

 

proposition 1   A game on any battlefield must eventually end, as:

  1. Every turn makes some number strictly smaller.
  2. No number can ever be made smaller than 1.

$\square$

proposition 2   A game ends exactly when all the numbers are the same.

Otherwise some pair satisfies $x > y$, so a turn is still possible.

$\square$

corollary   We may associate an ending with an "outcome" number.

proposition 3   Order of numbers on the battlefield does not matter.

 

My questions:

  1. Does a game on any battlefield only have a single outcome, regardless of strategy?

  2. Can outcomes be predicted without actually playing?

  3. Is there a strategy that ends the game on any battlefield the fastest?

  4. Does repetition of numbers on the battlefield matter? In other words, may the outcome change if the battlefield is converted to a set?

I can see that the outcome (assuming it is unique) is bounded above by the smallest number available and, furthermore, by the remainder of dividing any two numbers on the battlefield that do not divide evenly (since integer division is a series of subtractions). For example: if there is a $1$ on the battlefield, the game necessarily ends with $1$; if $13$ and $27$ are present, the outcome is again $1$. But a remainder obtained at some point in the game may in turn be used to obtain an even smaller remainder, and I have no idea how to pinpoint the end of this sequence of ever smaller remainders.
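The two-number case of this "sequence of remainders" can be experimented with in code (a minimal Python sketch; `subtractive_gcd` is a name of my own choosing):

```python
def subtractive_gcd(x, y):
    """Reduce the pair (x, y) by repeated subtraction until both are
    equal.  Every step is a legal turn of the game, so the common final
    value is an attainable outcome for a battlefield containing x and y."""
    while x != y:
        if x > y:
            x -= y
        else:
            y -= x
    return x

print(subtractive_gcd(13, 27))  # -> 1, as claimed above
```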


BEST ANSWER

Because the gcd of $b_1, \ldots, b_n$ is just $$ \gcd(b_1, \gcd(b_2, \ldots, b_n)) $$ you can "reduce" things by reducing the first pair of numbers $b_1, b_2$ (subtracting the smaller from the larger, i.e., applying the Euclidean algorithm) until they're the same, say $c_0$, then reducing the pair $c_0, b_3$ until they're the same, say $c_1$, and so on until all numbers are identical. Because the Euclidean algorithm computes the gcd, the identity above shows that the final result, $c_n, c_n, \ldots, c_n$, has all entries equal to the $\gcd$ of the original numbers, as @RobArthan said.

So the answer to question $1$ is "yes," the only possible outcome is that all the identical numbers are in fact the $\gcd$ of the original numbers.

That implies the answer to question 2 is also "yes", since there's only one outcome, and we know what it is: it's the GCD. And that can be computed much more efficiently than repeated subtraction. For instance, you can reduce $b_1, b_2$ (where $b_1 > b_2$) to the pair $(b_1 \bmod b_2, b_2)$ whose first entry may be much smaller than $b_1 - b_2$.
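Assuming Python for illustration, the predicted outcome is then a one-line gcd fold (the name `outcome` is my own):

```python
from functools import reduce
from math import gcd

def outcome(battlefield):
    """Predicted ending value: gcd(b1, gcd(b2, ...)), computed with the
    mod-based Euclidean algorithm instead of repeated subtraction."""
    return reduce(gcd, battlefield)

print(outcome([1, 2, 3]), outcome([13, 27]), outcome([12, 18, 30]))  # -> 1 1 6
```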

The third question is a little ambiguous: for any input set, there's at least one sequence of "reduction choices" that gets to $(d,d,d,\ldots, d)$ fastest, and one optimal strategy is to choose that sequence. Because there are only finitely many sequences possible for any input, examining them all and picking a "best" one is certainly a (very inefficient) strategy.

You probably meant "is there a simple rule that picks out this strategy and describes it compactly, without trying all possibilities, something like 'always reduce the two largest numbers that remain' or 'always reduce the largest number against the smallest number'?" My guess here is that "reduce the two largest remaining numbers" is likely to produce an optimal sequence, but I don't have a quick proof.
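As a sanity check on such heuristics, one could brute-force the shortest game on tiny battlefields and compare it with the "reduce the two largest" rule (a rough sketch with names of my own; when the two largest are equal, the heuristic subtracts the largest value strictly below the maximum):

```python
def min_turns(state):
    """Exhaustive breadth-first search: fewest reductions to reach an
    all-equal position.  Feasible only for tiny battlefields."""
    start = tuple(sorted(state))
    seen = {start: 0}
    frontier = [start]
    while frontier:
        nxt = []
        for s in frontier:
            if len(set(s)) == 1:        # all numbers equal: game over
                return seen[s]
            for i in range(len(s)):
                for j in range(len(s)):
                    if s[i] > s[j]:
                        t = list(s)
                        t[i] -= s[j]    # one reduction
                        t = tuple(sorted(t))
                        if t not in seen:
                            seen[t] = seen[s] + 1
                            nxt.append(t)
        frontier = nxt

def greedy_turns(state):
    """Turns used by one reading of 'reduce the two largest': subtract
    from the maximum the largest value strictly below it."""
    a = sorted(state)
    turns = 0
    while len(set(a)) > 1:
        sub = max(v for v in a if v < a[-1])
        a[-1] -= sub
        a.sort()
        turns += 1
    return turns

print(min_turns((1, 2, 3)), greedy_turns([1, 2, 3]))  # -> 2 2
```

On the examples I tried the greedy rule matches the optimum, but that is evidence, not a proof.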


This concerns only the solitaire game. Let $g$ be the greatest common divisor of all the numbers. Note that a reduction does not change $g$, so no number can ever become smaller than $g$, and the ending position must be all $g$'s. Therefore the answers to questions $1$ and $2$ are both "yes".

I'm not sure about question $3$. My first thought is to always subtract the second-largest number from the largest one, but that's just a guess. I haven't even tried any examples.

It certainly seems more interesting as a competitive game, but that's another question.


As @RobArthan said, it's the $\gcd$ of the initial values. First, notice that the $\gcd$ is invariant under this action; in fact, this was historically a way to calculate the $\gcd$: $$\gcd(x,y,\ldots)=\gcd(x-y,y,\ldots)$$ It is sometimes known as the "continued fraction" method.
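The invariance $\gcd(x,y,\ldots)=\gcd(x-y,y,\ldots)$ is easy to spot-check numerically (a quick Python sketch):

```python
from math import gcd
from random import randint

# Spot-check gcd(x, y, z) == gcd(x - y, y, z) for x > y on random samples.
for _ in range(1000):
    x = randint(2, 10**6)
    y = randint(1, x - 1)
    z = randint(1, 10**6)
    assert gcd(gcd(x, y), z) == gcd(gcd(x - y, y), z)
print("invariance holds on all samples")  # prints if no assertion fails
```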

For the fastest play, sort your numbers each round and always subtract the second-greatest from the greatest. The slowest play is always to subtract the smallest from the greatest.

For the proof intuition, notice that the sum of the numbers must decrease to $nd$ when there are $n$ numbers with $\gcd$ equal to $d$, and each turn decreases the sum by exactly the number subtracted, so each turn should subtract as much as possible.