Prove Newton's iteration will diverge for these functions, no matter what real starting point is selected.
$f(x)=x^2+1$ and $g(x)=7x^4+3x^2+\pi$.
We know that $f(x)>0$ and $g(x)>0$ for all $x\in \mathbb{R}$, so neither polynomial has a real root, and Newton's Method has no root to converge to. Is this essentially the proof, or do I need to argue differently? I don't see any other way to do this. Any solutions or hints are greatly appreciated.
The recurrence for Newton's iteration is $$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}.$$ If $x_n$ converges to some limit $L$, then taking limits on both sides of this equation gives $$L = L - \lim_{n \rightarrow \infty} \frac{f(x_n)}{f'(x_n)},$$ in other words, $$\lim_{n \rightarrow \infty} \frac{f(x_n)}{f'(x_n)} = 0. \tag 1$$ If $f'(L) \neq 0$, then by continuity of $f$ and $f'$ you have $$\frac{f(L)}{f'(L)} = 0,$$ i.e. $f(L) = 0$. Since neither of your functions has a real zero, this case cannot happen.
So you must have $f'(L) = 0$, and what remains is to show that $(1)$ is impossible in this situation. By continuity, $f(x_n)$ converges to $f(L)$, while $f'(x_n)$ converges to $f'(L) = 0$. Since $f$ never vanishes, $f(L) \neq 0$, so the ratio $\frac{f(x_n)}{f'(x_n)}$ diverges in absolute value as $n$ goes to infinity. Hence $(1)$ cannot occur, and the sequence $x_n$ has no limit.
The same argument shows that Newton's iteration fails to converge whenever the function is continuously differentiable and never vanishes on $\mathbb{R}$.
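As a sanity check (not part of the proof), here is a small numerical sketch of the iteration for $f(x)=x^2+1$: the Newton step $x \mapsto x - \frac{x^2+1}{2x}$ bounces around without settling, and since $f(x) \ge 1$ for every real $x$, no iterate ever approaches a root. The starting point $2.0$ and the iteration count are arbitrary choices for illustration.

```python
def newton_step(x):
    """One Newton step for f(x) = x**2 + 1, with f'(x) = 2*x."""
    return x - (x * x + 1.0) / (2.0 * x)

x = 2.0
iterates = [x]
for _ in range(50):
    if x == 0.0:  # Newton step undefined here (f'(0) = 0); restart nearby
        x = 0.5
    x = newton_step(x)
    iterates.append(x)

# f never vanishes: f(x) = x^2 + 1 >= 1 for every real x, so the iterates
# can never reach (or converge to) a zero of f.
print(min(v * v + 1.0 for v in iterates))
```

Printing the iterates themselves shows them jumping erratically over the real line, in line with the non-convergence argument above.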