Two equivalent definitions of the derivative, but where does the Taylor remainder come from?


A function of a real variable $y=f(x)$ has a derivative $f'(x)$ when $$(1)\qquad\lim_{h\to 0}\frac{f(x+h)-f(x)}{h}=f'(x)$$ If, however, $x$ is a vector variable, $(1)$ makes no sense. For what does it mean to divide by the vector increment $h$? Equivalent to $(1)$ is the condition $$f(x+h)=f(x)+f'(x)h+R(h)\qquad\text{where}\qquad\lim_{h\to 0}\frac{R(h)}{|h|}=0,$$ which is easy to recast in vector terms.


The above definition is taken from Charles Pugh's Real Mathematical Analysis.

I am not sure where the Taylor remainder comes from. In particular, if I start from $(1)$, how do I make the remainder term appear? Furthermore, why do we divide by the absolute value of $h$, as opposed to just $h$?
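My working so far, in the scalar case (this is my own algebra, not from Pugh's text): simply *define* the remainder by $$R(h):=f(x+h)-f(x)-f'(x)h.$$ Then $$\frac{R(h)}{h}=\frac{f(x+h)-f(x)}{h}-f'(x),$$ so $(1)$ holds exactly when $R(h)/h\to 0$; and since $\left|\frac{R(h)}{h}\right|=\frac{|R(h)|}{|h|}$, this is the same as requiring $R(h)/|h|\to 0$. The latter form only ever divides by the *norm* of $h$, which still makes sense when $h$ is a vector.

To see the vector version concretely, here is a small NumPy check using an example function of my own choosing, $f(x)=|x|^2$ on $\mathbb{R}^3$, whose derivative at $x$ is the linear map $h\mapsto 2x\cdot h$ (so $R(h)=|h|^2$ exactly):

```python
import numpy as np

def f(x):
    # Example function f(x) = |x|^2; its derivative at x is h -> 2 x . h
    return float(np.dot(x, x))

x = np.array([1.0, -2.0, 0.5])
direction = np.array([0.3, 0.4, -0.5])

for t in [1e-1, 1e-2, 1e-3]:
    h = t * direction
    # Remainder left over after subtracting the linear approximation
    R = f(x + h) - f(x) - 2 * np.dot(x, h)
    # The ratio R(h)/|h| shrinks like |h|, confirming the limit is 0
    print(t, R / np.linalg.norm(h))
```

Running this, the printed ratios shrink by a factor of ten at each step, which is what the condition $R(h)/|h|\to 0$ predicts for this particular $f$.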