weird gcd problem in number theory


Suppose $m$ is a positive integer and that $\frac{p}{q}$ is defined as follows.

$\frac{p}{q}=\frac{p}{10^{m!}}=\frac{1}{10^{1!}}+\frac{1}{10^{2!}}+\cdots +\frac{1}{10^{m!}}$

Prove that $\operatorname{gcd}(p,q)=1$.

I realize we are supposed to post what we have already attempted, but quite honestly, despite spending a decent amount of time on this question, I still don't understand it.

I did try multiplying both sides of the expression by $q$. I am reasonably certain that $p \equiv 1 \pmod 2$, since I think it ends in a $1$, so it's odd, and maybe $q \equiv 0 \pmod 2$ and $q \equiv 0 \pmod 5$. But that could also be completely the wrong train of thought, because there are infinitely many primes to mod out by, though I guess $q$ has only $2$ and $5$ as prime factors by the fundamental theorem of arithmetic...

On BEST ANSWER

The only primes dividing the denominator are $2$ and $5$. When you put the RHS over the common denominator $10^{m!}$, the $k$-th term's numerator becomes $10^{m!-k!}$, which is divisible by $10$ for every $k<m$ (since $m!-k!\ge 1$), while the last term's numerator is $10^{0}=1$. So $p\equiv 1 \pmod{10}$. In particular, neither of the two prime divisors of $q$ divides $p$, so the two numbers must be coprime.
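As a quick sanity check (not a substitute for the proof), here is a short Python sketch that computes $p$ and $q$ directly for small $m$ and confirms both observations; the helper name `p_q` is my own:

```python
from math import gcd, factorial

def p_q(m):
    # q = 10^(m!); p is the sum of the numerators 10^(m! - k!) for k = 1..m
    q = 10 ** factorial(m)
    p = sum(10 ** (factorial(m) - factorial(k)) for k in range(1, m + 1))
    return p, q

for m in range(1, 5):
    p, q = p_q(m)
    assert p % 10 == 1      # every numerator but the last is divisible by 10
    assert gcd(p, q) == 1   # hence p shares no factor of 2 or 5 with q
```

For example, $m=2$ gives $p/q = 11/100$, already in lowest terms.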