Consider a list of data $x_i$ and $f_i$ with constant spacing $\Delta x = h$. I want to approximate the derivatives at those points, aiming for errors of $O(h^2)$. I choose Newton forward-difference interpolation
$$ f(x) \rightarrow P(x) = f_0 + s\Delta f_0 + \frac{s(s-1)}{2!}\Delta^2 f_0 + \frac{s(s-1)(s-2)}{3!}\Delta^3 f_0 + \cdots $$
where $x = x_0 + sh \Rightarrow s = \dfrac{x-x_0}{h}$
$$ \frac{df}{dx} \approx \frac{dP}{dx} = \frac{dP}{ds}\frac{ds}{dx} = \frac{1}{h}\left(\Delta f_0 + \frac{2s-1}{2!}\Delta^2 f_0 + \frac{3s^2-6s+2}{3!}\Delta^3 f_0 + \cdots\right) $$
Aiming for errors of $O(h^2)$ means that I must omit all terms $\sim \Delta^n f_0$ with $n>2$. What troubles me is that, to calculate the derivatives at the given points, we then do not need all of the $f_i$ values: we only need up to $f_2$, since $\Delta f_0 = f_1 - f_0$ and $\Delta^2 f_0 = f_2 - 2f_1 + f_0$, so in this form of the truncated polynomial derivative no $f_i$ with $i>2$ ever appears. The only "solution" that comes to mind is to change (generalize) $0 \rightarrow i$ in all the $\Delta$-terms. Would that contradict the polynomial's definition? And how would I approximate the derivative at an $x$ that does not belong to the data set? Thanks in advance.
You are expanding $f(x)$ in a forward-difference series based at $x_0$ (the discrete analogue of a Taylor series), and this truncation should only be used for the calculation of $f'(x_0)$. To compute $f'(x_i)$, you should expand $f(x)$ about $x_i$ instead, so that you fully use the data near $x_i$.
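Concretely, at $s=0$ the truncated series gives the three-point forward formula
$$ f'(x_0) \approx \frac{1}{h}\left(\Delta f_0 - \tfrac{1}{2}\Delta^2 f_0\right) = \frac{-3f_0 + 4f_1 - f_2}{2h}, $$
and re-basing it at $x_i$ (the $0 \rightarrow i$ replacement you mention) gives $f'(x_i) \approx \dfrac{-3f_i + 4f_{i+1} - f_{i+2}}{2h}$; basing it at $x_{i-1}$ and evaluating at $s=1$ gives the central formula $\dfrac{f_{i+1} - f_{i-1}}{2h}$. All of these have $O(h^2)$ error. A minimal sketch of the node-by-node procedure in NumPy (the function name is illustrative; it assumes equally spaced samples, at least three of them):

```python
import numpy as np

def grid_derivative(f, h):
    """O(h^2) estimates of f'(x_i) at every node of an equally spaced grid.

    Interior nodes use the central formula (expansion centred near x_i);
    the two end nodes use the one-sided three-point formulas, because
    data exist on only one side there. Needs at least three samples.
    """
    f = np.asarray(f, dtype=float)
    d = np.empty_like(f)
    d[1:-1] = (f[2:] - f[:-2]) / (2.0 * h)                   # central difference
    d[0] = (-3.0 * f[0] + 4.0 * f[1] - f[2]) / (2.0 * h)     # forward formula at the left end
    d[-1] = (3.0 * f[-1] - 4.0 * f[-2] + f[-3]) / (2.0 * h)  # backward formula at the right end
    return d
```

For uniformly spaced data, `numpy.gradient(f, h, edge_order=2)` uses equivalent second-order stencils.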
To approximate the derivative at some $x$ that doesn't belong to the data set, you simply choose a few nearby $x_i$'s, construct an interpolating polynomial through them, and differentiate it at $x$.
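A small sketch of that as well (the helper name and the use of NumPy's polynomial routines are just one possible choice): pick the closest nodes, interpolate them exactly, and differentiate the interpolant at $x$.

```python
import numpy as np

def derivative_at(x, xs, fs, deg=2):
    """Approximate f'(x) for an x that need not coincide with a data point.

    Takes the deg+1 nodes nearest to x, fits the unique degree-deg polynomial
    through them (exact interpolation), and differentiates it at x.
    With deg = 2 the derivative error is O(h^2) for smooth f.
    """
    xs = np.asarray(xs, dtype=float)
    fs = np.asarray(fs, dtype=float)
    nearest = np.argsort(np.abs(xs - x))[: deg + 1]      # indices of the closest nodes
    coeffs = np.polyfit(xs[nearest], fs[nearest], deg)   # deg+1 points, degree deg -> interpolation
    return np.polyval(np.polyder(coeffs), x)             # derivative of the interpolant at x

# For example, with xs = np.linspace(0, 1, 11) and fs = np.sin(xs),
# derivative_at(0.37, xs, fs) is close to cos(0.37).
```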