Suppose I have two numbers, A and B. I am allowed to add them together, but only one addition at a time, replacing one of the two numbers with the sum. So, for example:
Seed Case: (A, B)
Step 1: (A+B, B) or (A, A+B)
Step 2: (A+2B, B) or (A+B, A+2B) or (2A+B, A+B) or (A, 2A+B)
Step 3: (A+3B, B) or (A+B, 2A+3B) or (2A+3B, A+2B) or ...
... etc
I am wondering if anyone knows the formal name of this process, as I am trying to find information on its characteristics. Specifically, I am interested in the minimum distance between the seed case and a given term (that is, the number of intermediate additions that must occur), but I don't know where to start researching it.
I'll write $a$ and $b$ in lowercase. Let $X = \left[ \begin{array}{cc} 1 & 1 \\ 0 & 1 \end{array} \right]$ and $Y = \left[ \begin{array}{cc} 1 & 0 \\ 1 & 1 \end{array} \right]$. Each expression you describe here can be obtained by starting from the vector $v = \left[ \begin{array}{cc} a \\ b \end{array} \right]$ and repeatedly applying $X$ or $Y$, since
$$Xv = \left[ \begin{array}{cc} a + b \\ b \end{array} \right], Yv = \left[ \begin{array}{cc} a \\ a + b \end{array} \right].$$
So the terms in Step 2 correspond to $X^2 v, XY v, YX v, Y^2 v$, for example. In general you are looking at the monoid generated by these two matrices, and you want to compute the length of an element of this monoid (the shortest word in $X$ and $Y$ that produces it).
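As a sanity check, here is a small Python sketch (the representation and function names are my own, not standard): each number is stored as its pair of coefficients of $a$ and $b$, and we enumerate every expression reachable in exactly $k$ additions by applying $X$ and $Y$ in all possible orders.

```python
from itertools import product

# Represent each number by its coefficient vector (coeff of a, coeff of b),
# so the base pair (a, b) is ((1, 0), (0, 1)).
BASE = ((1, 0), (0, 1))

def apply_X(p):
    """X: replace the first number with the sum of the two."""
    (a1, b1), (a2, b2) = p
    return ((a1 + a2, b1 + b2), (a2, b2))

def apply_Y(p):
    """Y: replace the second number with the sum of the two."""
    (a1, b1), (a2, b2) = p
    return ((a1, b1), (a1 + a2, b1 + b2))

def step_terms(k):
    """All coefficient pairs reachable from the base in exactly k additions."""
    terms = []
    for word in product((apply_X, apply_Y), repeat=k):
        p = BASE
        for f in word:
            p = f(p)
        terms.append(p)
    return terms
```

`step_terms(2)` produces exactly the four Step 2 pairs above: $(a+2b, b)$, $(a+b, a+2b)$, $(2a+b, a+b)$, $(a, 2a+b)$.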
This is actually quite easy, for the following reason: if you start from an expression $(n_1 a + m_1 b, n_2 a + m_2 b)$ (equivalently, a matrix $\left[ \begin{array}{cc} n_1 & m_1 \\ n_2 & m_2 \end{array} \right]$) and want to work backwards to the base $(a, b)$, then, given that this is possible, there is a unique way to do it at each step, because exactly one of the two expressions can be subtracted from the other and not vice versa. For example, given $(a + b, 2a + 3b)$ in Step 3, only $a + b$ can be subtracted from $2a + 3b$ (the other subtraction would produce negative coefficients), so the unique preceding expression is $(a + b, a + 2b)$.
In other words, the sequence of $X$s and $Y$s generating a given expression is unique; this says that $X$ and $Y$ generate the free monoid on two generators.
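That backward reduction is easy to make concrete. Here is a Python sketch (same coefficient representation as above, names mine): at each step exactly one row can be subtracted from the other while keeping all coefficients nonnegative, and recording which one recovers the unique word.

```python
def recover_word(pair):
    """Undo the additions from `pair` back to the base ((1, 0), (0, 1)),
    where each row holds the coefficients (of a, of b) of one number.
    Returns the unique word in X and Y, written so that the leftmost
    matrix is the one applied last (hence the first one we undo)."""
    (a1, b1), (a2, b2) = pair
    word = []
    while ((a1, b1), (a2, b2)) != ((1, 0), (0, 1)):
        if (a1, b1) == (a2, b2):
            raise ValueError("pair is not reachable from the base")
        if a1 >= a2 and b1 >= b2:
            a1, b1 = a1 - a2, b1 - b2   # undo X: first row -= second row
            word.append("X")
        elif a2 >= a1 and b2 >= b1:
            a2, b2 = a2 - a1, b2 - b1   # undo Y: second row -= first row
            word.append("Y")
        else:
            raise ValueError("pair is not reachable from the base")
    return "".join(word)
```

For instance, $(a + b, 2a + 3b)$ is the pair `((1, 1), (2, 3))`; the reduction undoes $Y$, then $Y$, then $X$, giving the word $Y^2 X$, and indeed $Y^2 X v = (a + b, 2a + 3b)$ after three additions.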
When performing these reductions, the coefficients $n_i, m_i$ undergo a slight variation of the Euclidean algorithm, so that is a place to look for bounds on the total number of steps. Generally speaking the number of steps will be logarithmic in the coefficients, although there are edge cases: for example, $(a + nb, b)$ takes $n$ steps. If you decide that adding $n$ copies of one of the numbers to the other counts as a single step, you get essentially the Euclidean algorithm, and this simplifies counting the total number of steps.
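To illustrate the counting, here is a hedged Python sketch (names mine) on the same coefficient representation. With `batched=False` every single addition is one step; with `batched=True` a run of identical additions counts as one step, which turns the backward reduction into essentially the Euclidean algorithm with quotients.

```python
BASE = ((1, 0), (0, 1))

def count_steps(pair, batched=False):
    """Number of additions needed to build `pair` (two rows of
    (coeff of a, coeff of b)) from the base.  With batched=True,
    adding n copies of one number in a row counts as one step."""
    r1, r2 = pair
    steps = 0
    while (r1, r2) != BASE:
        if r1 == r2:
            raise ValueError("pair is not reachable from the base")
        if all(x >= y for x, y in zip(r1, r2)):
            big, small = r1, r2
        elif all(y >= x for x, y in zip(r1, r2)):
            big, small = r2, r1
        else:
            raise ValueError("pair is not reachable from the base")
        # q = how many times `small` can be subtracted from `big`
        # while keeping all coefficients nonnegative (a Euclidean quotient)
        q = min(x // y for x, y in zip(big, small) if y > 0)
        reduced = tuple(x - q * y for x, y in zip(big, small))
        steps += 1 if batched else q
        r1, r2 = (reduced, r2) if big is r1 else (r1, reduced)
    return steps
```

The edge case from the text checks out: $(a + 5b, b)$, i.e. `((1, 5), (0, 1))`, costs 5 single additions but only 1 batched step.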