Total bits of accuracy gained per iteration with Newton's Method?


For a one-dimensional nonlinear equation, how many bits of accuracy are gained per iteration by Newton's method?

I know I can apply Newton's method to find a particular root and then compare the bits of accuracy of the error at each iteration. However, is there a more general way to prove this, independent of the particular equation?
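To make the question concrete, here is a minimal sketch of the empirical comparison described above, using the assumed example $f(x) = x^2 - 2$ with root $\sqrt{2}$ (any smooth function with a simple root would do). It counts bits of accuracy as $-\log_2$ of the absolute error at each iterate:

```python
import math

def newton(f, df, x0, n):
    """Run n Newton iterations from x0, returning the list of iterates."""
    xs = [x0]
    for _ in range(n):
        x = xs[-1]
        xs.append(x - f(x) / df(x))
    return xs

# Example problem (an assumption for illustration): f(x) = x^2 - 2
root = math.sqrt(2)
xs = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0, 4)

# Bits of accuracy at each iterate: -log2 of the absolute error
bits = [-math.log2(abs(x - root)) for x in xs]
for k, b in enumerate(bits):
    print(f"iteration {k}: {b:.2f} bits")
```

Running this shows the bit count roughly doubling each step, which is the signature of quadratic convergence near a simple root; the number of iterations is kept small so the error stays above machine epsilon, where the doubling pattern breaks down.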