I am working on some exercises for the Newton Algorithm and wanted to consult you all. For part c) I would like some help.
Just based on a graphical interpretation, I would assume that $r$ just needs to be even, because for odd $r$ the function is not convex on its entire domain $x \in \mathbb{R}$.
As for the global minimum, I can show that the only stationary point occurs at $x=0$, and that the function is increasing on $[0, \infty)$ and decreasing on $(-\infty, 0]$. Is this sufficient?
I am a bit confused about how to show the last part and would appreciate some hints. The Newton algorithm has the form
$x^{k+1} = x^k - \alpha d_k$ where $d_k = \frac{\nabla f(x^k)}{\|\nabla^2 f(x^k)\|_2}$
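For concreteness, here is a minimal Python sketch of this damped update applied to $f(x) = x^r$ (the objective from the later derivation). The function name `newton_step` and the specific values of `r` and `alpha` are my own choices, not from the exercise:

```python
# Sketch of the damped Newton update for f(x) = x**r in one dimension,
# taking the descent step x - alpha * f'(x) / |f''(x)|.
# (newton_step, and the values r=4, alpha=1.0, are illustrative choices.)

def newton_step(x, r, alpha):
    """One damped Newton step for f(x) = x**r (r even, r >= 2)."""
    grad = r * x ** (r - 1)            # f'(x)
    hess = r * (r - 1) * x ** (r - 2)  # f''(x) > 0 for x != 0 when r is even
    return x - alpha * grad / abs(hess)

x = 1.0
for _ in range(5):
    x = newton_step(x, r=4, alpha=1.0)
# Each step multiplies x by 1 - alpha/(r-1) = 2/3, so x -> (2/3)**5.
```

Note the step only uses `abs(hess)`, mirroring the $\|\nabla^2 f\|_2$ normalisation in the formula above, rather than the classical Newton direction $-[\nabla^2 f]^{-1}\nabla f$ (which coincides with it in 1-D up to the sign of $f''$).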
Thanks for your help!
EDIT:
Thank you to the person who replied below. I tried the following: I computed the first and second derivatives, and by analysing the second derivative I found that only for even $r$ do we have $f''(x) \geq 0$ for all $x$.
For the stationary points, I set the derivative to zero and found the single point $x^* = 0$. Plugging this into the second derivative is inconclusive (for $r > 2$ we get $f''(0) = 0$), so instead I argued that for even $r$ the function is symmetric and its derivative is strictly positive for $x > 0$, so this point must be a global minimum.
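To double-check the convexity claim numerically, here is a small sketch (the helper `f2` and the sample points are my own, purely illustrative):

```python
# Numeric spot-check that f''(x) = r*(r-1)*x**(r-2) is nonnegative
# everywhere only when r is even. (f2 and the sample points are mine.)

def f2(x, r):
    return r * (r - 1) * x ** (r - 2)

xs = [-2.0, -0.5, 0.5, 2.0]
even_ok = all(f2(x, 4) >= 0 for x in xs)  # r = 4: 12*x**2 >= 0 everywhere
odd_ok = all(f2(x, 3) >= 0 for x in xs)   # r = 3: 6*x < 0 for x < 0
```

With `r = 4` the check passes at every sample point; with `r = 3` it fails for the negative ones, matching the sign analysis above.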
For part c), using the optimal $x^* = 0$, the Newton update became $x^{k+1} = x^k - \frac{\alpha x^k}{r-1}$, which I substituted into the inequality to get $$\left|x^k - \frac{\alpha x^k}{r-1}\right| \geq (1-\delta)|x^k|,$$ which upon rearranging gives $\delta \geq \frac{\alpha}{r-1}$.
What do you think? Is this valid/rigorous enough?

We just need $r$ to be a positive even number. In particular, we can choose $r = 2$ if you wish.
Indeed we have
$$|x^k-\alpha \cdot \frac{x^k}{r-1}| = |x^k| \left|1-\frac{\alpha}{r-1}\right|$$
We can pick $\delta = \frac{\alpha}{r-1}$, provided we pick $\alpha$ less than $r-1$, so that $\left|1-\frac{\alpha}{r-1}\right| = 1-\delta \in (0,1)$.
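You can also sanity-check this contraction numerically. In the sketch below the values `r = 4` and `alpha = 1.5` are arbitrary choices satisfying $\alpha < r-1$:

```python
# Check that one Newton step shrinks |x| by exactly the factor
# 1 - delta, with delta = alpha/(r-1), when alpha < r - 1.
# (r = 4, alpha = 1.5, x = 3.0 are illustrative values.)

r, alpha = 4, 1.5
delta = alpha / (r - 1)           # = 0.5
x = 3.0
x_next = x - alpha * x / (r - 1)  # the Newton update for f(x) = x**r
ratio = abs(x_next) / abs(x)      # contraction factor per step
```

Here `ratio` comes out to exactly `1 - delta`, so each step satisfies $|x^{k+1}| = (1-\delta)|x^k|$ as claimed.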