Proving a Partial Derivative Equivalence Using Taylor Series Expansion?


I'm studying computer vision, and one of the problems in my book is to prove that $\partial f/ \partial x = f(x+1) - f(x)$

It's been a while since I've touched Taylor Series, and so I'm not sure of the approach for this general form. I've found lots of reference online to the special properties/method of calculus of the derivatives OF the Taylor Series expansion, but how does one take the Taylor Series expansion of the above in order to satisfy the proof?


I suppose the book is not asking you to prove the equality $\partial f/\partial x = f(x+1)-f(x)$ in general, which is false from a mathematical point of view. In the context of image processing, however, $f(x+1)-f(x)$ can be a reasonably good approximation of $\partial f/\partial x$. An image can be viewed as a two-dimensional function defined over a discrete domain, namely the set of pixels of the image. In that context, $f(x,y)$ is the image intensity at the pixel in row $x$ and column $y$, and $f(x+1,y)$ is the intensity of the next pixel down that column.

Now fix a column, say $y$, and compute the partial derivative along that column. Since the domain is discrete, the smallest possible step is $h = 1$ pixel, so the difference quotient from the definition of the derivative gives

$$ \partial f/\partial x \approx \frac{f(x+1,y)-f(x,y)}{x+1-x} = f(x+1,y)-f(x,y). $$

This is exactly the expression you have, except that the constant variable $y$ has been omitted.
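As a minimal sketch of this idea in NumPy (the small array below is a made-up example image, not from the book), the forward difference $f(x+1,y)-f(x,y)$ along the rows is just a shifted subtraction:

```python
import numpy as np

# Hypothetical 2-D "image": f(x, y) = intensity at row x, column y.
f = np.array([[10, 12, 15],
              [11, 14, 18],
              [13, 17, 22]], dtype=float)

# Forward difference along axis 0 (rows):
# df_dx[x, y] = f(x+1, y) - f(x, y), approximating the partial
# derivative in x with step size h = 1 pixel.
df_dx = np.diff(f, axis=0)

print(df_dx)
```

`np.diff(f, axis=0)` is equivalent to `f[1:, :] - f[:-1, :]`; note the result has one fewer row than the input, since the forward difference is undefined at the last row.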

I think the best place to ask this kind of question is the Computer Science Stack Exchange.