Is there any error bound theory for a Derivative Estimator?


Given $f \in C^1(\Omega)$ and a finite set of samples $\{x_s\} \subset \Omega$, one wants to estimate $\frac{d}{dx} f(x)$ at the points $x_s$, by finite differences or other methods. An intuitive idea is that the best achievable quality of estimation should depend only on the given function $f$ and the samples $\{x_s\}$; in other words, there should be a bound that no method, whatever it is, can be expected to beat. Is there a theory for such a bound yet? Or, conversely, given an estimate $\widehat{\frac{d}{dx}f(x)}$, is it possible to determine whether or not it is optimal?
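For concreteness, here is a minimal sketch of the setup I have in mind (in Python/NumPy). The test function $f = \sin$, the interval $\Omega = [0, \pi]$, the uniform sample grid, and the choice of central differences are all illustrative assumptions, not part of the question; the point is just that one particular estimator produces some pointwise error, and I am asking whether there is a theory for the smallest error any estimator could achieve from the same samples.

```python
import numpy as np

# Illustrative example only: a known smooth f, so the error can be measured.
f = np.sin          # f in C^1(Omega)
df_true = np.cos    # exact derivative, for comparison

# Finite set of samples {x_s} in Omega = [0, pi]
x_s = np.linspace(0.0, np.pi, 21)
y_s = f(x_s)

# One particular estimator: central differences in the interior,
# one-sided differences at the boundary (np.gradient's behavior).
df_est = np.gradient(y_s, x_s)

# Pointwise error of this estimator -- the question is whether there is
# a theoretical lower bound on this quantity over all possible estimators.
err = np.abs(df_est - df_true(x_s))
print("max pointwise error:", err.max())
```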