Context: I'm starting another contest and I was interested in how one makes an extremely large finite number as simply as possible.
For starters, the Ackermann function is extremely simple:
$$\begin{aligned} A(0,n)&=n+1\\ A(m+1,0)&=A(m,1)\\ A(m+1,n+1)&=A(m,A(m+1,n)) \end{aligned}$$
And this manages to reach the heights of Knuth's up-arrow notation and the hyperoperations. There isn't much that's complicated about this function, and I'd say the reason it grows so fast is that it applies recursion on top of addition.
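The three clauses can be transcribed directly; a minimal Python sketch (naive recursion, so only tiny arguments are feasible):

```python
def ackermann(m, n):
    """Ackermann function, following the three recurrences above."""
    if m == 0:
        return n + 1  # A(0, n) = n + 1
    if n == 0:
        return ackermann(m - 1, 1)  # A(m+1, 0) = A(m, 1)
    return ackermann(m - 1, ackermann(m, n - 1))  # A(m+1, n+1) = A(m, A(m+1, n))
```

Even `ackermann(4, 1)` already exhausts a typical recursion limit, which is rather the point.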
Indeed, this points out the strength of recursion to produce large numbers.
Likewise, there exist other ways to produce large numbers. One such example is the lifetime of a worm.
What principles make the lifetime of a worm so large?
What other computable ways can you make a large number?
It's the same principle for worms, hydras, Goodstein sequences, and even the fast growing hierarchy: a fixed "transformation rule" is being repeatedly applied to some particular kind of expression, stopping only when the expression reaches some special form (e.g. an empty string, an empty tree, the value $0$, etc.), the output being the number of repetitions required to accomplish this (or perhaps this number plus a starting number).
In all these cases, each expression $E$ can be seen as encoding a unique ordinal, $o(E)$, with the following properties:
1. Each iteration decreases $o(E)$.
2. If $o(E)$ is a successor ordinal, then the next iteration decreases $o(E)$ only to its predecessor.
3. If $o(E)$ is a limit ordinal, then the next iteration decreases $o(E)$ to an ordinal whose Cantor normal form (CNF) nevertheless has an increased number of terms each of form $\omega^\beta$ (in fact, an arbitrarily large finite number of such terms will not prevent termination).
NB: (1) guarantees that the game will eventually end (when the encoded ordinal reaches $0$), because there can be no infinite decreasing sequence of ordinals, and (3) allows increasingly enormous jumps in the number of terms in the CNF of $o(E)$, which (2) only reduces very slowly (by removing one term of form $\omega^0$ each time) -- thus tending to require enormously many iterations before termination.
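As a concrete instance of this template, a Goodstein sequence fits in a few lines; a minimal Python sketch (function names are my own), where each step rewrites the current value in hereditary base $b$, replaces every $b$ by $b+1$, and subtracts $1$:

```python
def bump(n, b):
    """Write n in hereditary base b, then replace every b by b + 1."""
    result, e = 0, 0
    while n:
        n, d = divmod(n, b)
        if d:
            # Exponents are themselves rewritten hereditarily.
            result += d * (b + 1) ** bump(e, b)
        e += 1
    return result

def goodstein_steps(m):
    """Number of steps for the Goodstein sequence of m to reach 0."""
    b, steps = 2, 0
    while m:
        m = bump(m, b) - 1
        b += 1
        steps += 1
    return steps
```

Already $m=4$ requires astronomically many steps (far beyond $2^{100}$), so only $m\le 3$ is feasible to run.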
Example
Aside from worms, the simplest possible hydra game might be the following (see the link for the ordinal encoding scheme): Let $E$ be any ordinary balanced parenthesis expression -- e.g., (()())(()) -- and let $n$ be a positive integer. Let the transformation rule be $$<n,E>\ \to\ <n+1,r_{n+1}\{E\}>,$$ where $r_n\{E\}$ is defined by
$$r_n\{A(B)\} = \begin{cases} A, & \text{if }B\text{ is empty} \\ A\ (r_n\{B\})^n, & \text{otherwise.} \end{cases} $$ Now any $<n,E>$ eventually reduces to some $<n^*,\text{ empty}>$. In particular, it can be shown that the function computed by starting with $<n,(((...)))>$ (with $n$ nested bracket pairs) strictly dominates Goodstein's function.
Other examples: starting with $<n,((()))>$ produces $n^*$ greater than the diagonalized Ackermann function $A(n,n)$, and starting with $<1,((()()))>$ produces $n^*$ greater than Graham's number.
Here's a tiny illustration for $<n,E>=<1,(()())>$:
$$<1,(()())>\ \to\ <2,(())(())>\ \to\ <3,(())()()()>\ \to\ <4,(())()()>\ \to\ <5,(())()>\ \to\ <6,(())>\ \to\ <7,()()()()()()()>\ \to\ \cdots\ \to\ <14,\text{ empty}>,$$
so $n^*=14$.
NB: It's worth repeating that when $E$ encodes a limit ordinal, the increase in the number of terms in the CNF of $o(E)$ can be made arbitrarily large and finite; e.g., when $B$ is not empty, we could have defined $$r_n\{A(B)\} = A(r_n\{B\})^{g(n)}$$ where $g$ is a positive function -- no matter how rapidly increasing -- and the process would still eventually terminate.
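The game is easy to simulate; a minimal Python sketch (the names `r` and `hydra` are mine), where `r` splits $E$ as $A(B)$ with $(B)$ the last top-level group:

```python
def r(n, E):
    """r_n{A(B)}: drop the last group if B is empty,
    else replace it by n copies of (r_n{B})."""
    depth, i = 0, len(E) - 1
    # Scan backwards for the '(' matching the final ')'.
    while True:
        if E[i] == ')':
            depth += 1
        else:
            depth -= 1
            if depth == 0:
                break
        i -= 1
    A, B = E[:i], E[i + 1:-1]
    return A if B == '' else A + ('(' + r(n, B) + ')') * n

def hydra(n, E):
    """Iterate <n,E> -> <n+1, r_{n+1}{E}> until E is empty; return n*."""
    while E:
        n += 1
        E = r(n, E)
    return n
```

Per the examples above, running this on $<1,((()()))>$ would take more than Graham's number of iterations, so stick to small starting expressions.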
The ordinal encoding scheme for worms is described in Beklemishev's paper, The Worm Principle.
A couple of links at the googology wiki:
Kirby-Paris hydra
Beklemishev's worms