For example, I have $K$ points of the form $(x_k,y_k,f(x_k,y_k))$ for $k=1, ..., K$ near $0$. The distances between the points and $0$ are not the same.
Is there an approximation for the first-order partial derivatives of $f$ (i.e. $\frac{\partial f}{\partial x}$ and $\frac{\partial f}{\partial y}$) at the origin?
I know it can be found by differentiating a polynomial best fitted to the points, but I would like to see whether there is a clean expression for it (in terms of the distances between the points and the origin, and the values $f(x_k,y_k)$).
Thanks.
Let $ z_k = (x_k, y_k) $. The gradient of $ f $ at $0$ should satisfy: \begin{equation} f (z_k) \approx f (0) + \langle \nabla f (0),z_k \rangle \end{equation} for all $ k $. This gives an overdetermined system of linear equations for the gradient. It seems like a good idea to find a least squares solution to this overdetermined system and let that be your estimate of the gradient. (Or perhaps you should minimize the $1$-norm of the residual, if you think some of the points $ z_k$ might not be close enough to the origin for the above approximation to be accurate.)
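As a sketch of the least squares approach with NumPy (the data here are made up; I also treat $f(0)$ as an extra unknown, estimated via an intercept column, since the question only gives values at the nearby points):

```python
import numpy as np

# Hypothetical data: K = 20 scattered points z_k near the origin, with
# f(x, y) = 1 + 2x - 3y + 0.5x^2, so the true gradient at 0 is (2, -3).
rng = np.random.default_rng(0)
Z = rng.uniform(-0.1, 0.1, size=(20, 2))
fvals = 1 + 2*Z[:, 0] - 3*Z[:, 1] + 0.5*Z[:, 0]**2

# Design matrix for the affine model f(z_k) ≈ f(0) + <grad, z_k>:
# a column of ones (for the unknown f(0)), then the x and y coordinates.
A = np.column_stack([np.ones(len(Z)), Z])

# Least squares solution of the overdetermined system A c ≈ fvals.
coef, *_ = np.linalg.lstsq(A, fvals, rcond=None)
f0_est, grad_est = coef[0], coef[1:]
```

The estimate is not exact because the data contain a quadratic term the affine model cannot represent, but with points this close to the origin the bias is small.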
If $ K $ is large, I bet you'd get a more accurate approximation by simultaneously estimating the gradient and the Hessian of $ f$ at $0$, based on the approximation \begin{equation} f (z_k) \approx f (0) + \langle \nabla f (0), z_k \rangle + \frac12 z_k^T Hf (0) z_k. \end{equation} Once again, this gives you an overdetermined linear system (for both the gradient and the Hessian) and you can find a least squares solution or minimize another norm of the residual.
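The second-order version can be set up the same way; exploiting the symmetry of the Hessian, the model $f(0) + \langle \nabla f(0), z\rangle + \frac12 z^T Hf(0)\, z$ in two variables has six unknowns. A sketch with invented data (exactly quadratic here, so the fit recovers everything):

```python
import numpy as np

# Hypothetical data: f(x, y) = 1 + 2x - 3y + x^2 - xy + 2y^2, so
# grad f(0) = (2, -3) and Hf(0) = [[2, -1], [-1, 4]].
rng = np.random.default_rng(1)
Z = rng.uniform(-0.2, 0.2, size=(40, 2))
x, y = Z[:, 0], Z[:, 1]
fvals = 1 + 2*x - 3*y + x**2 - x*y + 2*y**2

# Columns for the model f0 + fx*x + fy*y + (Hxx*x^2 + 2*Hxy*x*y + Hyy*y^2)/2,
# i.e. unknowns (f0, fx, fy, Hxx, Hxy, Hyy) with H symmetric.
A = np.column_stack([np.ones_like(x), x, y, x**2 / 2, x*y, y**2 / 2])
coef, *_ = np.linalg.lstsq(A, fvals, rcond=None)
f0, fx, fy, Hxx, Hxy, Hyy = coef
```

Since the synthetic $f$ is exactly quadratic, the least squares solution reproduces the true coefficients; for a general smooth $f$ the residual would instead reflect the third-order terms.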