How to Calculate the Precision Required to Exactly Produce the Repeating Sequence of Digits in the Decimal Expansion of a Fraction


Say I am given a fraction $\frac{p}{q}$ and wish to express it as a decimal such that the repeating sequence of digits is displayed accurately at least once, and preferably twice, in the printed output. How would I calculate the precision required to do this?


One way of dealing with this is to consult the sequence https://oeis.org/A051626, which gives the period of the decimal representation of $1/q$. For $\frac{p}{q}$ in lowest terms, the period of $p/q$ equals the period of $1/q$, so the precision must be at least this length (plus any non-repeating prefix) to display the cycle once, and twice this length to display it twice.
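Rather than consulting the table, the period can also be computed directly: strip all factors of 2 and 5 from $q$ (these contribute only a finite non-repeating prefix), then find the multiplicative order of 10 modulo what remains. A minimal sketch, with hypothetical helper names `decimal_period` and `digits_needed` of my own choosing:

```python
from math import gcd

def decimal_period(q):
    """Length of the repeating cycle of 1/q (0 if the expansion terminates).

    Matches OEIS A051626.
    """
    # Factors of 2 and 5 only affect the non-repeating prefix.
    for f in (2, 5):
        while q % f == 0:
            q //= f
    if q == 1:
        return 0  # terminating decimal, no repeating part
    # The period is the multiplicative order of 10 modulo the reduced q.
    k, r = 1, 10 % q
    while r != 1:
        r = (r * 10) % q
        k += 1
    return k

def digits_needed(p, q, cycles=2):
    """Digits after the decimal point needed to show the non-repeating
    prefix of p/q plus `cycles` full copies of the repeating cycle."""
    q //= gcd(p, q)  # reduce the fraction; the period depends on the reduced q
    # The non-repeating prefix has length max(e2, e5), where e2 and e5 are
    # the exponents of 2 and 5 in the reduced denominator.
    prefix = 0
    for f in (2, 5):
        e, t = 0, q
        while t % f == 0:
            t //= f
            e += 1
        prefix = max(prefix, e)
    return prefix + cycles * decimal_period(q)
```

For example, $\frac{1}{7} = 0.\overline{142857}$ has period 6, so `digits_needed(1, 7)` returns 12; $\frac{1}{6} = 0.1\overline{6}$ has a one-digit prefix and period 1, so `digits_needed(1, 6)` returns 3.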