Can Anyone Explain This Property of the Collatz Conjecture ($3n + 1$ problem)


I'm writing a program that draws a tree containing the first $n$ numbers and shows how they all reduce to $1$. I noticed something interesting about the nodes outside $1-n$ that are needed to connect the numbers $1-n$ to $1$: if you draw the tree containing $1-n$, you always need approximately $1.17 \cdot n$ extra numbers (numbers not in $1-n$) to fill it in. In case that wasn't clear, here is an example. To place the numbers $1-10$ on the tree, you also need $16, 20, 40, 13, 26, 52, 17, 34, 11, 22, 14,$ and $28$ to connect $1-10$ to $1$. That's $12$ extra numbers, a ratio of exactly $1.2$. I know that's not the $1.17$ I promised, but that's because the sample size is quite small. Take a look at this spreadsheet: link. You can see that for much larger trees, the ratio is very close to $1.17$. Can anyone explain this? Has this already been answered? I feel like it can't be just a coincidence.
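For reference, here is a minimal sketch of the counting (not my full tree-drawing program; the function name is mine, and it assumes every trajectory it follows actually reaches $1$):

```python
def collatz_connectors(n):
    """Return the set of numbers outside 1..n that appear on the
    Collatz trajectories of the numbers 1..n."""
    extras = set()
    for m in range(1, n + 1):
        x = m
        while x != 1:
            x = x // 2 if x % 2 == 0 else 3 * x + 1
            if x <= n:
                # The rest of this path is traced when the loop
                # reaches m = x, so we can stop early.
                break
            extras.add(x)
    return extras

if __name__ == "__main__":
    for n in (10, 1000, 100000):
        extras = collatz_connectors(n)
        print(n, len(extras), len(extras) / n)
    # n = 10 gives 12 connectors, i.e. the ratio 1.2 from the example
```

The early break is what keeps this fast: once a trajectory drops back into $1-n$, any later excursion above $n$ lies on the trajectory of a smaller starting value and is counted there.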

Edit: If it could be proven that this quotient tends toward $1.17$ (or some nearby constant) as $n \to \infty$, that would rule out divergent trajectories: if some number in $1-n$ never reached $1$, its trajectory would contribute infinitely many connector nodes, and the quotient would be infinite rather than bounded.

Edit 2: Tested it for a tree with numbers $1 - 1,000,000$ - still right around $1.17$

Edit 3: Tested all trees with numbers $1-1000, 1-1001, \ldots, 1-10000$, and the average was $1.167687636032841$, with a standard deviation of $0.015784505982180175$

Final Edit: Tested all trees $1-10000, 1-10001, \ldots, 1-100000$. The average was $1.169015280407125$ and the standard deviation was $0.004822374427267592$