Is there a formal definition for 'digits per term'?


I've seen the phrase 'digits per term', mostly in connection with algorithms that compute $\pi$. I've seen it here ("Yes, Chudnovsky's formula converges at a steady 14.18 digits per term."), on numerous formulas for $\pi$ ("This gives 50 digits per term", "...converges at only one bit/term"), and in source code involving Khinchin's constant. I've fiddled with one of the algorithms, and the value appears to be $-\log(r)/\log(10) = -\log_{10} r$, where $r$ is the value from the ratio test, but I don't know why it equals that, nor whether the same relationship holds for every method.
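To show what I mean by "fiddled with one of the algorithms", here is a rough sketch of the check I did against the Chudnovsky series (the term formula below is my own transcription; constant factors that cancel in the ratio are omitted):

```python
from fractions import Fraction
from math import factorial, log10

def term(k):
    # Magnitude of the k-th term of the Chudnovsky series for 1/pi.
    # The constant factors 12 and 640320**(3/2) are omitted because
    # they cancel when taking the ratio of consecutive terms.
    num = factorial(6 * k) * (13591409 + 545140134 * k)
    den = factorial(3 * k) * factorial(k) ** 3 * 640320 ** (3 * k)
    return Fraction(num, den)

for k in (1, 10, 100):
    ratio = term(k + 1) / term(k)   # ratio-test value at index k
    print(k, -log10(float(ratio)))  # tends toward ~14.18 as k grows
```

The printed values approach 14.18, matching the quoted "14.18 digits per term", which is what led me to guess the $-\log_{10}(\text{ratio})$ connection.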

The phrase itself suggests a linearly convergent series, but can it be applied to any convergent series? It also seems to say more than merely that a series converges linearly or quadratically. For instance, given a series, could I say something like "it produces $\approx 0.7503$ digits per term" or "$0.023n^2$ correct digits are found after $n$ terms"?
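To make the fractional case concrete: if I'm right about $-\log_{10}(\text{ratio})$, then a plain geometric series $\sum r^n$, whose ratio-test value is exactly $r$, should give a non-integer digits-per-term figure. The ratio $0.17775$ below is a hypothetical value I picked just to land near my example figure of $0.7503$:

```python
from math import log10

# For a geometric series sum(r**n), the ratio of consecutive terms
# is exactly r, so the conjectured "digits per term" is -log10(r).
r = 0.17775            # hypothetical common ratio, chosen for illustration
print(-log10(r))       # about 0.75 digits per term
```

So, if the definition works this way, fractional rates like "$\approx 0.7503$ digits per term" would be perfectly meaningful.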

Just what exactly is it (if there is a formal definition), and how is it determined? Where do these numbers come from? Searching for the phrase only leads me to material about $\pi$.