Let $\alpha$ be a fixed point of a contractive function $g$ defined on $[a, b] \subset \mathbb{R}$ such that $g'(\alpha) = 0$ and $g''(\alpha) \neq 0$.
My first question is how to prove that the iteration $x_{k + 1} = g\left(x_k\right)$ converges to $\alpha$. I think we can only use the fact that $g$ is contractive, but how can we show this rigorously?
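(A sketch of the standard contraction estimate, in case it helps frame the question: assuming $|g(x)-g(y)|\le L\,|x-y|$ on $[a,b]$ with $L<1$,)
$$|x_{k+1}-\alpha| = |g(x_k)-g(\alpha)| \le L\,|x_k-\alpha| \le \cdots \le L^{k+1}\,|x_0-\alpha| \xrightarrow[k\to\infty]{} 0.$$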
Then we know, by the second-order Taylor expansion of $g$ centered at $\alpha$, that the fixed-point method has order $2$. I want to heuristically derive Newton's method for finding a zero of $f$ by applying the previous property to a fixed-point function $g(x) = h(x)f(x)$ for some $h$. I'm quite confused and I don't know how to use this...
You can write $g(x)=x+g_1(x)$. If a root of $f$ is to be a fixed point of $g$, it must also be a root of $g_1$, which is easiest to arrange via $g_1\sim f$, i.e. $g_1(x)=h(x)f(x)$. Then the derivative of $g$ is $$ g'(x)=1+h'(x)f(x)+h(x)f'(x). $$ The middle term vanishes at the roots of $f$, so $g'(\alpha)=0$ requires $1+h(\alpha)f'(\alpha)=0$, which is easiest to guarantee by taking $h(x)=-f'(x)^{-1}$ everywhere. This gives $g(x)=x-f(x)/f'(x)$, i.e. Newton's method.
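A minimal numerical sketch of this construction (the choice $f(x)=x^2-2$, with root $\sqrt{2}$, is an illustrative assumption, not from the post):

```python
# Newton's method viewed as the fixed-point iteration
# g(x) = x + h(x) f(x) with h(x) = -1/f'(x), i.e. g(x) = x - f(x)/f'(x).

def f(x):
    return x * x - 2.0  # example function; root is sqrt(2)

def fprime(x):
    return 2.0 * x

def g(x):
    # h(x) = -1/f'(x)  =>  g(x) = x - f(x)/f'(x)
    return x - f(x) / fprime(x)

x = 1.0  # initial guess
for k in range(6):
    x = g(x)

print(abs(x - 2.0 ** 0.5))  # error after 6 iterations: essentially machine precision
```

Because $g'(\alpha)=0$ by construction, the error roughly squares at each step, which is the order-$2$ behavior described in the question.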
Of course, it is more intuitive to derive the Newton step as the root of the tangent line, or as the value at zero of a linear approximation of the inverse function.
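The tangent-line view can be written out in one line: setting the linearization of $f$ at $x_k$ to zero,
$$0 = f(x_k) + f'(x_k)\,(x_{k+1}-x_k) \quad\Longrightarrow\quad x_{k+1} = x_k - \frac{f(x_k)}{f'(x_k)},$$
which is the same iteration obtained above from $h(x)=-f'(x)^{-1}$.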