In our calculus class, we were introduced to the numerical approximation of roots by the Newton–Raphson method. The task was to compute the root of a function correct to n decimal places.
Assume that the function is nice and that our initial value does lead to convergence. Our teacher terminated the algorithm when two successive iterates had the same first n digits, and told us that the approximation was then correct up to the nth digit!
I feel that the termination step is valid if $f(x_n)$ and $f(x_{n+1})$ have different signs, but my teacher disagrees. How do I sort this out?
Furthermore, how do I find the error in the nth iteration without knowing the exact root?

To properly start Newton's method we begin by localizing the root, finding a compact interval $I$ that contains it, ideally one that contains only that root. Then we take the first approximation inside that interval. We can then say that the initial error $e_0$ is smaller than the length of the interval.
Now we can use the standard estimate of the $n$-th error,
$$e_{n+1}\leq Me_n^2,$$ where $M=\frac{\sup_I|f''(x)|}{2\inf_I|f'(x)|}$. If you start the recurrence with $e_0=|I|$ (instead of the actual, unknown initial error $\epsilon_0$) then, provided $M|I|<1$ (otherwise shrink the interval first), you can stop for sure as soon as the recurrence gives you an $e_{n+1}$ smaller than the precision you want.
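To make this concrete, here is a small sketch (my own illustration, with $f(x)=x^2-2$ on $I=[1,2]$ as the assumed example): the bound recurrence tells us in advance how many iterations certainly suffice, independently of where in $I$ we start.

```python
import math

def newton(f, fprime, x0, tol, max_iter=50):
    """Plain Newton iteration x_{n+1} = x_n - f(x_n)/f'(x_n)."""
    x = x0
    for _ in range(max_iter):
        x_new = x - f(x) / fprime(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# A priori bound: e_{n+1} <= M * e_n^2 with M = sup_I|f''| / (2 inf_I|f'|).
# For f(x) = x^2 - 2 on I = [1, 2]: sup|f''| = 2, inf|f'| = 2, so M = 1/2,
# and M*|I| = 1/2 < 1, so the recurrence contracts.
M = 0.5
e = 1.0            # e_0 = |I| = 1, a safe overestimate of the true error
steps = 0
while e >= 1e-8:   # target precision
    e = M * e * e
    steps += 1
# After `steps` iterations the error is provably below 1e-8.

root = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.5, 1e-12)
```

Running the recurrence gives the certified iteration count before ever evaluating $f$, which is exactly what makes this a valid stopping rule.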
The stopping condition that your teacher used is not correct in general.
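To see why, here is an illustrative counterexample of my own (not claimed to be the canonical one): take $f(x)=x^m$ with $m=10^6$, whose only root is $0$ but is highly multiple, so Newton converges only linearly with rate $1-1/m$.

```python
# For f(x) = x^m the Newton update x - f(x)/f'(x) simplifies analytically
# to x * (1 - 1/m); we use that form to avoid underflow in x**m.
m = 10**6
x0 = 1 / 3
x1 = x0 * (1 - 1 / m)     # one Newton step

# x0 = 0.333333333... and x1 = 0.333333000... agree in their first
# six decimal digits, so the teacher's rule would accept 0.333333 ...
same_digits = f"{x0:.6f}" == f"{x1:.6f}"

# ... yet the true root is 0, so not even the first digit is correct.
error = abs(x0 - 0.0)
```

Two consecutive iterates agreeing in $n$ digits only says the iteration moved little in that step, not that it is within $10^{-n}$ of the root.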