A function is given such that $f(x)=0$, and the equation can be rewritten in the form $x=g(x)$. We have to determine the root of $f(x)$ (the value of $x$ for which $f(x)=0$). My textbook states that the iteration method starting at $x=a$ converges to the root only when $|g'(a)|<1$ (where $a$ is an arbitrary value). In other words, when $|g'(a)|>1$ we can't determine the root, because the values produced by the iterative method diverge.
Is this true? For instance, take $$f(x)=x^3-2x+3, \qquad g(x)=(2x-3)^{\frac{1}{3}}.$$ Starting the iteration at $x=1.6$, the iterates converge to a root near $-1.893$. But $g'(1.6)\approx 1.949$, which is not less than $1$. Am I wrong, or is the textbook wrong?
What you claim the book says is certainly not true. The condition $|g'(a)|<1$ must hold not at an arbitrary point $a$, but at $a$ the root of the equation (more precisely, in a neighborhood of the root). In your example the starting point $x=1.6$ has $|g'(1.6)|>1$, but at the root itself $|g'|$ is well below $1$, which is why the iteration still converges.
What the book probably explains is that if $|g'(x)|<1$ for all $x$ (or for all $x$ in an interval that $g$ maps into itself), then you can pick the starting point of the iteration at random, and the sequence of iterates will converge to the solution.
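A quick numerical check of the example above (a minimal sketch in Python; the helper names `g` and `g_prime` are mine). Note that Python's `**` operator returns a complex number for a negative base with fractional exponent, so the real cube root has to be taken via the sign and absolute value:

```python
import math

def g(x):
    # real cube root of 2x - 3 (negative base handled via copysign)
    t = 2 * x - 3
    return math.copysign(abs(t) ** (1 / 3), t)

def g_prime_abs(x):
    # |g'(x)| = (2/3) * |2x - 3|^(-2/3)
    t = 2 * x - 3
    return (2 / 3) / abs(t) ** (2 / 3)

# fixed-point iteration x_{n+1} = g(x_n) starting at x0 = 1.6
x = 1.6
print(f"|g'(x0)| = {g_prime_abs(x):.3f}")   # about 1.949, greater than 1
for _ in range(100):
    x = g(x)

print(f"limit   = {x:.3f}")                  # close to -1.893
print(f"|g'(limit)| = {g_prime_abs(x):.3f}") # well below 1 at the root
```

The iteration converges even though $|g'|>1$ at the starting point, because what matters is the derivative near the root: there $|g'| \approx 0.19 < 1$, so the iterates are eventually pulled in.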