I have two numbers, $a$ and $b$, where the precision of $a$ is $n$ bits and the precision of $b$ is $m$ bits. How many bits of precision are preserved after computing $a/b$ on a typical modern computer?
I need the case where $n = 52$ and $m = (52-k)$.
Note: I need this to calculate $\lceil n/\ln(n!)\rceil$ and $\lfloor n/\ln(n!)\rfloor$. For a given $n$ I want to know how many bits of precision are necessary. The logarithm is calculated as the sum of the logarithms of the factors $1$ to $n$.
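For context, here is a small sketch of the computation described above: $\ln(n!)$ accumulated as a sum of logarithms of the factors $1$ to $n$, then $\lceil n/\ln(n!)\rceil$ and $\lfloor n/\ln(n!)\rfloor$. The comparison against `math.lgamma` (which computes $\ln\Gamma(n+1) = \ln(n!)$ directly) is my addition, just to gauge how much error the summation accumulates:

```python
import math

def log_factorial_sum(n):
    """ln(n!) as the sum of ln(k) for k = 1..n (ln(1) = 0 is skipped)."""
    return sum(math.log(k) for k in range(2, n + 1))

n = 100
s = log_factorial_sum(n)       # sum of ~n terms, each rounded to double precision
exact = math.lgamma(n + 1)     # ln(n!) via the log-gamma function, for comparison

print(math.ceil(n / s), math.floor(n / s))
print(abs(s - exact))          # accumulated rounding error of the summation
```

Each term of the sum carries at most half an ulp of rounding error, so the absolute error of the sum grows at worst linearly in $n$; for $n$ around $100$ it stays far below the final division's own rounding error.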
(This sounds like a trivial question, but I could not find anything related on Google with searches like "precision after division". Most such questions are answered with "you lose precision because the number is stored in binary", which is not what I am looking for at all.)