Question: To show numerically that the exponential grows faster than any polynomial, a student thought of solving the problem $\min \ f(x) = x^m - e^x$, where $m$ is the highest (even) power of the polynomial. He uses Newton's method with step length 1 to solve this. Moreover, he thinks that if his claim is true, then the sequence $\{x_k\}$ should go to infinity, since $f(x)$ can decrease without bound as $x$ increases. What is the mistake in his reasoning? Explain it.
Attempt: First of all, I thought of showing that for any $m>1$, $f(x)$ intersects the $x$-axis, using the IVT. Then, since the sequence $\{x_k\}$ generated by Newton's method should converge to a root of $f(x)$, it goes to some root of this function, not to infinity.
Note that this is the Newton's method algorithm we should use:

Algorithm: Given $x_0 \in \mathbb{R}$ and $\epsilon > 0$, set $k=0$. While $|f(x_k)| > \epsilon$, compute $x_{k+1} = x_k - \cfrac{f(x_k)}{f'(x_k)}$ and set $k=k+1$. Return $x_k$.
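A minimal sketch of this iteration (in Python; the choices $m=4$, $x_0=2$, and the tolerance are mine, not from the question), showing that on $f(x) = x^4 - e^x$ the iterates settle at a finite root of $f$ rather than running off to infinity:

```python
import math

def newton(f, fprime, x0, eps=1e-10, max_iter=100):
    """Root-finding Newton iteration: x_{k+1} = x_k - f(x_k)/f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) <= eps:   # stopping rule |f(x_k)| <= eps
            break
        x = x - fx / fprime(x)
    return x

# Illustrative instance: m = 4, so f(x) = x^4 - e^x.
m = 4
f = lambda x: x**m - math.exp(x)
fp = lambda x: m * x**(m - 1) - math.exp(x)

root = newton(f, fp, x0=2.0)
# The sequence converges to a finite root of f (near 1.43),
# not to infinity, consistent with the Attempt above.
```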
However, it feels like I'm missing some points in this question. I'm open to any suggestions.
EDIT: I'm not being asked to solve this! The question is given as stated. I am trying to point out the errors in the formulation of this optimization problem in a formal manner.
Actually, there is an error in his reasoning. What is claimed is: if the exponential grows faster than the polynomial, then the sequence goes towards infinity. But he uses the converse: if there is a sequence that goes towards infinity, then the exponential grows faster than the polynomial. This is not a general property. As an example: $$ \min_x \ 1/x - c \ . $$ Here (over $x>0$) any minimizing sequence satisfies $x_k \to \infty$, yet no exponential dominance is involved. Perhaps, if he had added the fact that the minimum (actually an infimum) is $-\infty$, it would have been OK, but still, there is no such result I am aware of, and he would need to prove it first.
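For completeness, the fact the student would need is standard and can be proved directly: applying L'Hôpital's rule $m$ times gives
$$ \lim_{x\to\infty} \frac{x^m}{e^x} = \lim_{x\to\infty} \frac{m\,x^{m-1}}{e^x} = \cdots = \lim_{x\to\infty} \frac{m!}{e^x} = 0 \ , $$
so $e^x$ eventually dominates $x^m$, and hence $f(x) = x^m - e^x \to -\infty$ as $x \to \infty$, i.e. the infimum of $f$ is $-\infty$. This is the statement that must be established before drawing any conclusion from the behaviour of the iterates.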