I was running a statistical simulation in which I generate uniform random integers from 1 to $N$, for values of $N$ ranging from 1 to $10^6$. I keep generating random numbers until the current number is a multiple of one of the previously generated numbers. For example:
If $N = 10$
$5, 8, 6, 7, 10$ $\to$ we stop at 10 because it is a multiple of 5. Total count is 5.
$10, 2, 4$ $\to$ we stop at 4 because it is a multiple of 2. Total count is 3.
For each $N$, I ran 1000 trials and averaged the total count. I came up with the following graph, which looks almost like a square-root curve:

Why does the average appear to converge to some constant times $\sqrt{N}$? Does anyone have a mathematical explanation for this trend?
Fitting a curve of the form $y = A\sqrt{N}$, I get $A = 0.4486$ with a correlation of $0.9970$.

Edit: I coded this in C++ here for any coders out there.