PRP testing algorithms


INTRODUCTION: (scroll down for the actual question)

I want to check the primality of large numbers (more than $100\ 000$ digits). Moreover, I want to avoid expensive modular exponentiation (yes, in that range of numbers it is quite expensive).

One way to do this is by trial factoring the number up to a certain limit. This method pays off especially when there is a recursive way to go from one number that needs to be checked for divisibility to the next. For example, if we know that $2^{123456789}\equiv 16 \pmod{31}$, we can say that $31$ is not a factor of $2^{123456789}-1$, but it is a factor of $2^{123456789+1}-1$, since $16\cdot2 \equiv 1 \pmod {31}$. This is quite efficient, as we did not have to use modular exponentiation again for the second number.
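The recursive step above can be sketched as follows (a hypothetical helper, assuming the candidates are consecutive Mersenne numbers $2^n-1$, so stepping from $n$ to $n+1$ only requires doubling each residue):

```python
def surviving_exponents(n_start, n_count, small_primes):
    """Exponents n in [n_start, n_start + n_count) such that no prime in
    small_primes divides 2^n - 1.  One modular exponentiation per prime
    up front; after that, each step n -> n + 1 is a single doubling."""
    # residues[i] = 2^n mod small_primes[i] for the current exponent n
    residues = [pow(2, n_start, p) for p in small_primes]
    survivors = []
    for n in range(n_start, n_start + n_count):
        # p divides 2^n - 1 exactly when 2^n ≡ 1 (mod p)
        if not any(r == 1 for r in residues):
            survivors.append(n)
        # step n -> n + 1: multiply each residue by 2
        residues = [(2 * r) % p for p, r in zip(small_primes, residues)]
    return survivors
```

This is naturally parallelizable: the prime list can be split across workers, each maintaining its own residue vector.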

Typically, after trial factoring up to a small limit like $10\ 000$, only $1$ in every $16.42449$ numbers survives (i.e. has no factor below the limit). Afterwards, we need to raise the limit all the way to $100\ 000\ 000$ to get another $1$-in-$2$ reduction.
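These survival ratios are consistent with Mertens' product over the trial-division primes; a small sketch (for general integers, so the exact value may differ slightly from the quoted $16.42449$ depending on the form of the candidates):

```python
def survival_fraction(limit):
    """Heuristic density of integers with no prime factor <= limit:
    the Mertens product prod_{p <= limit} (1 - 1/p)."""
    # Sieve of Eratosthenes up to `limit`
    is_prime = bytearray([1]) * (limit + 1)
    is_prime[0:2] = b"\x00\x00"
    for p in range(2, int(limit ** 0.5) + 1):
        if is_prime[p]:
            is_prime[p * p :: p] = bytearray(len(is_prime[p * p :: p]))
    frac = 1.0
    for p in range(2, limit + 1):
        if is_prime[p]:
            frac *= 1.0 - 1.0 / p
    return frac
```

By Mertens' theorem this product behaves like $e^{-\gamma}/\ln B$, so raising the limit $B$ from $10^4$ to $10^8$ divides the survivor count by $\ln(10^8)/\ln(10^4)=2$, matching the $1$-in-$2$ reduction above.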

The (weak) Fermat PRP test, for a random base, correctly identifies a composite with probability at least $\frac{1}{2}$ (except for Carmichael numbers; this fraction can be improved upon by using other types of tests). But, the problem is, this takes too long (for me) in the desired range of the numbers.
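For reference, the baseline being rejected here is the one-line Fermat PRP test; the single `pow` call is exactly the large modular exponentiation that is too expensive at $100\ 000+$ digits:

```python
def fermat_prp(n, base=2):
    """Weak Fermat PRP test: n passes iff base^(n-1) ≡ 1 (mod n).
    One full-size modular exponentiation per candidate."""
    return pow(base, n - 1, n) == 1
```

Primes always pass; composites usually fail, but base-2 pseudoprimes such as $341$ and Carmichael numbers such as $561$ pass as well.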

Also, note that for numbers this big, better approximations can be made. If we look at the (Fermat) pseudoprimes to base $2$ (discarding any primes, these are called Poulet numbers or Sarrus numbers), an upper bound for the number of Poulet numbers less than $x$ (for sufficiently large $x$) can be found, namely $$P_2(x) < x \exp \left( - \frac{\ln x \ln \ln \ln x}{2 \ln \ln x} \right) = f(x)\,.$$ An approximation for the ratio of Poulet numbers to prime numbers below $n$ thus is $f(n)/(n/\ln n)=f(n)\ln n/n$. This tells us that if a number of at least $2\ 000$ digits is pseudoprime to base $2$, then there is approximately a $1$ in $10^{249}$ chance that the number is composite (based on the ratio of Poulet numbers to prime numbers below $10^{2000}$, which only decreases for higher numbers).
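The $1$ in $10^{249}$ figure can be reproduced directly from the bound above (a small sketch; `poulet_ratio_log10` is a hypothetical helper name):

```python
from math import log

def poulet_ratio_log10(digits):
    """log10 of the heuristic upper bound on the ratio of base-2
    pseudoprimes (Poulet numbers) to primes below x = 10**digits:
    f(x) * ln(x) / x, with f(x) = x * exp(-ln x * ln ln ln x / (2 ln ln x))."""
    ln_x = digits * log(10)
    # f(x)/x = exp(-ln x * ln ln ln x / (2 ln ln x))
    exponent = -ln_x * log(log(ln_x)) / (2 * log(ln_x))
    # multiply by ln x (i.e. add its natural log), then convert to base 10
    return (exponent + log(ln_x)) / log(10)
```

For $2\ 000$ digits this evaluates to roughly $-249$, i.e. a ratio of about $10^{-249}$.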


THE QUESTION: After trial factoring large numbers ($> 100\ 000$ digits) up to a certain limit, what is the fastest (preferably parallelizable) way to eliminate a lot of numbers that are not prime, where recursive ways to go from one number to the next are known, without using classical modular-exponentiation (Fermat) or Lucas PRP tests?

In short: trial factoring from $10\ 000$ to $100\ 000\ 000$ only eliminates about half of all non-primes, while Fermat PRP testing eliminates nearly all non-primes but takes too long. Is there a middle way, i.e. a fast algorithm eliminating more than $\frac{1}{k}$ of all non-primes?