Let's assume p1 and p2 are even numbers (p1 ≠ p2) with gcd(p1, m) = 1 and gcd(p2, m) = 1, where m is a positive integer. Prove that there exist infinitely many m such that gcd(p1 + m, p2 + m) = 1.
Let m, m^2, ..., m^k be natural numbers, where m is a randomly chosen natural number with gcd(p1, m) = 1 and gcd(p2, m) = 1. Prove that for sufficiently large k, the probability that gcd(p1 + m^n, p2 + m^n) = 1 is greater than the probability that gcd(p1 + m^n, p2 + m^n) > 1, where n = 1, 2, 3, ..., k.
Given $p_1,p_2\in \Bbb N$ with $p_1<p_2$ and given any $n\in \Bbb N,$ we can find $m>n$ such that $m+p_1,m+p_2$ are co-prime, as follows:
Let $p$ be a prime with $p>p_1+n$ and $p>p_2-p_1$ (such a prime exists, since there are infinitely many primes). Let $m=p-p_1$.
Then $m=p-p_1>(p_1+n)-p_1=n.$
Now $p_2+m>p_1+m=p,$ but the prime $p$ does not divide $p_2+m$ because $$1< \frac {p_2+m}{p}=\frac {(p_2-p_1)+p}{p}<\frac {p +p}{p}=2.$$
Therefore $\gcd(p_1+m, p_2+m)=\gcd (p,p_2+m)=1.$
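The construction above is easy to check numerically. Here is a minimal sketch (the helper names `is_prime` and `coprime_shift` are my own, not from the argument): given $p_1<p_2$ and a bound $n$, it picks the first prime $p>\max(p_1+n,\,p_2-p_1)$ and returns $m=p-p_1$.

```python
from math import gcd

def is_prime(q):
    """Trial-division primality test; adequate for small q."""
    if q < 2:
        return False
    d = 2
    while d * d <= q:
        if q % d == 0:
            return False
        d += 1
    return True

def coprime_shift(p1, p2, n):
    """Return m > n with gcd(p1 + m, p2 + m) = 1, via the prime construction."""
    # Find the first prime p with p > p1 + n and p > p2 - p1.
    p = max(p1 + n, p2 - p1) + 1
    while not is_prime(p):
        p += 1
    return p - p1  # then p1 + m = p is prime and does not divide p2 + m

m = coprime_shift(4, 10, 100)
assert m > 100 and gcd(4 + m, 10 + m) == 1
```

Since $n$ is arbitrary, repeatedly raising the bound produces infinitely many such $m$.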