I am using Newton's method in MATLAB to find the roots of $f(x) = x^3 - 10x + 2$.
The professor has asked us to verify that the algorithm always converges to a root of $f(x)$ except for a finite number of starting points.
I've tried $\sqrt{ \frac{10}{3} }$ and $-\sqrt{ \frac{10}{3} }$, because $f'(x) = 0$ at these points (so the Newton step $ x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)} $ is undefined there). But I'm not sure, because when I start the MATLAB algorithm at these points, it still converges to a root of $f(x)$. Maybe that's because MATLAB works with floating-point values?
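For reference, here is the iteration as a short Python sketch (a stand-in for my MATLAB code; the helper names are just for illustration):

```python
# Sketch of Newton's method for f(x) = x^3 - 10x + 2 (illustrative, not the
# original MATLAB code), to see what happens when we start at sqrt(10/3).
import math

def f(x):
    return x**3 - 10*x + 2

def fprime(x):
    return 3*x**2 - 10

def newton(x0, tol=1e-12, max_steps=500):
    x = x0
    for _ in range(max_steps):
        d = fprime(x)
        if d == 0:
            return None            # exact critical point: the step is undefined
        x = x - f(x) / d
        if abs(f(x)) < tol:
            return x
    return None

# sqrt(10/3) is stored as a nearby floating-point number, so f' there is
# typically a tiny nonzero value rather than exactly 0; the first step is
# then enormous, but later iterates are pulled back toward a root.
x0 = math.sqrt(10/3)
print(fprime(x0), newton(x0))
```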
Suppose $f(x)=ax^3+bx^2+cx+d$ is a cubic with three real roots $r_1<r_2<r_3$, as happens for the specific cubic you are considering (in your case, there is a root between $-4$ and $-3$, another between $0$ and $1$, and another between $3$ and $4$). Let $N_f$ be the function associated with Newton's method applied to $f$, that is, $$ N_f(x)=x-\frac{f(x)}{f'(x)}.$$ Note that the fixed points of $N_f$ are precisely the roots of $f$.
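To make this concrete, here is a quick Python check (my own illustration, with made-up helper names) of the three sign changes locating the roots, and of the fact that a root of $f$ is a fixed point of $N_f$:

```python
# Illustrative check for the cubic in the question, f(x) = x^3 - 10x + 2.
def f(x):
    return x**3 - 10*x + 2

def fprime(x):
    return 3*x**2 - 10

def N(x):                          # the Newton map N_f
    return x - f(x) / fprime(x)

# f changes sign on each bracket, so each bracket contains a root:
for a, b in [(-4, -3), (0, 1), (3, 4)]:
    print(a, b, f(a) * f(b) < 0)   # True for all three brackets

# Bisect to the middle root, then verify it is (numerically) fixed by N_f:
a, b = 0, 1
for _ in range(60):
    m = (a + b) / 2
    if f(a) * f(m) <= 0:
        b = m
    else:
        a = m
r2 = (a + b) / 2
print(abs(N(r2) - r2) < 1e-9)      # a root of f is a fixed point of N_f
```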
Let $c_1<c_2$ be the critical points of $f$. Note that $N_f$ is defined at all $x$ except at $c_1$ and $c_2$, since $f'(c_i)=0$. Also, let $u$ be the inflection point of $f$, so $r_1<c_1<r_2<c_2<r_3$ and $c_1<u<c_2$. Note that $$ N_f'(x)=\frac{f(x)f''(x)}{(f'(x))^2}, $$ so $N_f'$ has the sign of $f(x)f''(x)$: within $(c_1,c_2)$ it vanishes only at the root $r_2$ and at the inflection point $u$, and $N_f$ is monotone on each of the subintervals into which $r_2$ and $u$ divide $(c_1,c_2)$.
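One can sanity-check the formula for $N_f'$ numerically; here is a small Python sketch (the function names are mine) comparing it against a central difference quotient for the cubic in the question:

```python
# Compare N_f'(x) = f(x) f''(x) / f'(x)^2 against a finite difference,
# for f(x) = x^3 - 10x + 2 (points chosen away from the critical points).
def f(x):  return x**3 - 10*x + 2
def f1(x): return 3*x**2 - 10
def f2(x): return 6*x
def N(x):  return x - f(x) / f1(x)

def N_prime_formula(x):
    return f(x) * f2(x) / f1(x)**2

def N_prime_numeric(x, h=1e-6):
    return (N(x + h) - N(x - h)) / (2 * h)

for x in [-3.0, -1.0, 0.5, 1.0, 3.0]:
    assert abs(N_prime_formula(x) - N_prime_numeric(x)) < 1e-4
```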
Denote by $O_x$ the orbit of the point $x$ under $N_f$, that is, the set $\{x,N_f(x),N_f(N_f(x)),\dots\}$. This set is finite in three situations: some $n$-fold iterate $N_f(\dots(N_f(x))\dots)$ is one of the $c_i$, where $N_f$ is undefined (in which case $O_x$ consists of the iterates of $x$ up to that $c_i$); some iterate is one of the $r_i$, which are fixed points; or the sequence $x,N_f(x),N_f(N_f(x)),\dots$ eventually repeats itself, which in the case of cubics happens for precisely two values of $x$, as described below. Otherwise, the set is infinite. If $f$ were more complicated than a cubic, $O_x$ could well be infinite with the corresponding sequence of iterates divergent, but in the case under consideration it always happens that if $O_x$ is infinite, then the sequence of iterates of $x$ converges to one of the $r_i$. Let's write $O_x\to r_i$ in that case.
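A sketch of how one might compute (an initial segment of) $O_x$ numerically, stopping when the iterates hit a $c_i$ or a root, is the following Python helper (illustrative only, not part of the argument):

```python
# Numerically follow the orbit O_x under N_f for f(x) = x^3 - 10x + 2,
# stopping at (numerical) critical points c_i or roots r_i.
def f(x):  return x**3 - 10*x + 2
def f1(x): return 3*x**2 - 10

def orbit(x, max_steps=200, tol=1e-12):
    pts = [x]
    for _ in range(max_steps):
        if abs(f1(x)) < 1e-15:       # landed (numerically) on c_1 or c_2
            return pts, "hit critical point"
        x = x - f(x) / f1(x)
        pts.append(x)
        if abs(f(x)) < tol:          # reached a root r_i: orbit is finite
            return pts, "converged"
    return pts, "no termination within max_steps"

pts, status = orbit(0.0)
print(status, pts[-1])               # converges to the root between 0 and 1
```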
Under the assumptions just set up, and using the notation just described, we can find an infinite increasing sequence $t_1<t_2<\dots$, an infinite decreasing sequence $s_1>s_2>\dots$, and points $t$ and $s$ such that $\lim_{n\to\infty}t_n=t$, $\lim_{n\to\infty}s_n=s$, $$ \begin{align} r_1<c_1<t_1<t_2<\dots<t<&\,r_2<s<\dots<s_2<s_1<c_2<r_3,\\ t<&\,u<s,\end{align} $$ and the following holds:
Note that Newton's method starting at $x$ diverges if and only if $x$ belongs to the infinite set $$\{c_1\}\cup\{t_i:i=1,2,\dots\}\cup\{t,s\}\cup\{s_i:i=1,2,\dots\}\cup\{c_2\}.$$ Otherwise, it converges as specified above. In particular, note that we can find values of $x$ arbitrarily close to $t$ (or to $s$) such that $O_x\to r_i$ for $i=1,2,3$. One has to be particularly careful using numerical methods when starting near either of these two points $t,s$.
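As an illustration (my own experiment, not part of the proof), one can sample a grid of starting points and observe that every one of them converges to one of the three roots, consistent with the exceptional set above being a thin set that a generic grid misses:

```python
# Run Newton's method for f(x) = x^3 - 10x + 2 from a grid over [-4, 4];
# generic starting points miss the exceptional set and converge to a root.
def f(x):  return x**3 - 10*x + 2
def f1(x): return 3*x**2 - 10

def newton_root(x, max_steps=500, tol=1e-10):
    for _ in range(max_steps):
        d = f1(x)
        if d == 0:
            return None              # landed exactly on c_1 or c_2
        x = x - f(x) / d
        if abs(f(x)) < tol:
            return x
    return None

starts = [-4 + 0.1 * k for k in range(81)]      # grid over [-4, 4]
roots = [newton_root(x) for x in starts]
print(all(r is not None for r in roots))        # every grid point converged
print(sorted({round(r, 6) for r in roots}))     # the three roots of f
```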
The dynamics of Newton's method applied to polynomials is a pretty topic. A reasonable exposition of the case of cubics can be found in