How long would it take to find all the primes between 2 and a number with about 13 million digits?


I am trying to run a program that tells me how many prime numbers there are in a range of numbers. I ran it on ranges from 10,000 up to 100,000, in steps of 10,000. How long would the program take to determine all the prime numbers between 2 and the largest known prime (which has about 13 million digits)?
for 10,000 it took 0.494s
for 20,000 it took 1.100s
for 30,000 it took 1.965s
for 40,000 it took 3.149s
for 50,000 it took 4.579s
for 60,000 it took 6.305s
for 70,000 it took 8.108s
for 80,000 it took 10.343s
for 90,000 it took 12.560s
for 100,000 it took 15.091s
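One way to quantify the growth in these timings is to fit a power law $t \approx c \cdot n^k$ by least squares on the logs of the data. A minimal Python sketch, using only the numbers listed above:

```python
import math

# Timings reported above: (upper limit of range, seconds)
timings = [
    (10_000, 0.494), (20_000, 1.100), (30_000, 1.965), (40_000, 3.149),
    (50_000, 4.579), (60_000, 6.305), (70_000, 8.108), (80_000, 10.343),
    (90_000, 12.560), (100_000, 15.091),
]

# Assume t = c * n**k and estimate k by least squares on (log n, log t).
logs = [(math.log(n), math.log(t)) for n, t in timings]
mean_x = sum(x for x, _ in logs) / len(logs)
mean_y = sum(y for _, y in logs) / len(logs)
k = sum((x - mean_x) * (y - mean_y) for x, y in logs) / \
    sum((x - mean_x) ** 2 for x, _ in logs)
print(f"growth exponent k ~= {k:.3f}")
```

The exponent comes out near 1.5, i.e. each factor of 10 in the range costs roughly a factor of $10^{1.5} \approx 30$ in time.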

Given what I have now, how would I find the answer? I understand that this will take a very long time, but I cannot answer the question by simply saying that. I have asked this question before on MathOverflow, but I was told it was the wrong place to ask. If this site is also the wrong place, could someone guide me to the right one?

Thank you in advance.

1 Answer

Since you're interested in how long it takes your implementation, rather than a generic "how long would this operation take", it probably belongs on the Programmers or Code Review site. But I won't let that stop me from playing with it.

Let's be super optimistic and assume the growth rate remains constant. We really need much more data, but let's go with what we have: each factor of 10 in the range takes about 30 times longer ($15.091 / 0.494 \approx 30.5$). So we might expect it to take $15.091 * 30^1 \approx 453$ seconds for 1M, $15.091 * 30^2 \approx 13{,}582$ seconds for 10M, etc. Under the very optimistic assumptions that the program would work at large sizes and keep the same growth rate, reaching the largest known prime (17,425,170 digits) would take about $15.091 * 30^{17,425,170-5}$ seconds = about $2 * 10^{25,739,075}$ years.
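That figure is easy to reproduce by working entirely in $\log_{10}$, so the astronomically large numbers never have to be represented directly. A sketch of the arithmetic:

```python
import math

# Extrapolate the "~30x per factor of 10" growth out to a number with
# 17,425,170 digits (the largest known prime at the time of writing).
# Everything is done in log10 to avoid astronomically large floats.
digits = 17_425_170
t_100k = 15.091                      # seconds for the 100,000 = 10^5 range
decades = digits - 5                 # factors of 10 beyond 10^5
seconds_per_year = 365.25 * 86_400

log10_seconds = math.log10(t_100k) + decades * math.log10(30)
log10_years = log10_seconds - math.log10(seconds_per_year)
print(f"about 10^{log10_years:.1f} years")
```

The result lands at roughly $10^{25{,}739{,}075}$ years, matching the estimate above.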

To answer the question in general, we'd normally say that you almost certainly don't need the exact count, only an approximation to the number of primes. We could do a quick $n/\log(n)$ estimate, or get fancy and use Riemann's R function and check it against the Dusart 2010 bounds. This takes a few minutes at most. I come up with $1.45026... * 10^{17,425,162}$ as roughly the number of primes in the range 2 to the current largest known prime.
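Even the leading term $x/\ln x$ gets the same answer to three figures here, and it too can be computed in logarithms. A sketch, taking the largest known prime to be $2^{57{,}885{,}161}-1$:

```python
import math

# Approximate pi(x), the count of primes up to x, for x = 2^57885161 - 1,
# using the leading term x / ln(x) of the Prime Number Theorem.
# Work in logarithms, since x itself has about 17.4 million digits.
p = 57_885_161                       # Mersenne exponent
log10_x = p * math.log10(2)          # log10 of 2^p (the -1 is negligible)
ln_x = p * math.log(2)

log10_count = log10_x - math.log10(ln_x)
exponent = math.floor(log10_count)
mantissa = 10 ** (log10_count - exponent)
print(f"pi(x) ~= {mantissa:.4f} * 10^{exponent}")
```

This gives about $1.45 * 10^{17,425,162}$, agreeing with the fancier R-function estimate to the precision shown.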

If you really wanted to sieve, look at something like primesieve, which is the fastest current implementation. Its growth is slightly superlinear, and it still takes about 15 minutes just to reach $10^{13}$. Using the same wildly optimistic extrapolation as before, we could take a stab at $926.83 * 12^{17,425,170-13}$ seconds = about $1 * 10^{18,804,898}$ years. So sieving isn't going to work either -- even with linear growth and massive parallelism, the number is just too big.

Even "little" ranges like $2^{64}$ can be quite daunting if you have to touch every number. The wonderful PSP2 database of Feitsma/Galway used very clever methods to cut the time required, but it still took years of computation to cover that range. Now imagine how small that range is in relation to a Mersenne prime with millions of digits.