Strange form of Taylor's Theorem for linearization


I came across a form of Taylor's Theorem that I have never seen before. The author states it without proof and says that it can be found in any calculus book, but I can't find it. Could somebody please show me the proof or point me towards a reference with the proof?

The author says:

If $f : \mathbb{R}^n \rightarrow \mathbb{R}$ is continuously differentiable and $p \in \mathbb{R}^n$, then

$$f(x+p) = f(x) + \nabla f(x+tp) \cdot p$$

for some $t\in (0,1)$.


Best answer:

This follows from the mean value theorem. Define $$g: [0,1] \rightarrow \mathbb{R}: t \mapsto f(x+tp)$$ and notice that $$ g(1)-g(0)=f(x+p)-f(x).$$ Since $f$ is continuously differentiable, $g$ is continuous on $[0,1]$ and differentiable on $(0,1)$, so by the mean value theorem there exists a $\xi \in (0,1)$ such that $g(1)-g(0)= g'(\xi)\,(1-0)$. Now use the chain rule, which gives $$g'(\xi) = \nabla f(x+\xi p) \cdot p. $$
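As a numerical sanity check of this statement (not part of the original answer), here is a small sketch using the illustrative function $f(x,y) = x^2 + xy$ and particular vectors $x$ and $p$ — all of these choices are assumptions for demonstration, not from the post. It locates a $t \in (0,1)$ with $\nabla f(x+tp)\cdot p = f(x+p)-f(x)$ by bisection:

```python
# Sanity check of the mean-value form of Taylor's theorem,
# using the illustrative (assumed) function f(x, y) = x**2 + x*y.

def f(v):
    x, y = v
    return x**2 + x*y

def grad_f(v):
    # Gradient of f: (df/dx, df/dy) = (2x + y, x)
    x, y = v
    return (2*x + y, x)

def dot(a, b):
    return sum(ai*bi for ai, bi in zip(a, b))

# Example point and step direction (arbitrary choices).
x = (1.0, 2.0)
p = (0.5, -1.0)

# Left-hand side of the identity: f(x+p) - f(x).
lhs = f((x[0] + p[0], x[1] + p[1])) - f(x)

def h(t):
    # h(t) = grad f(x + t p) . p - (f(x+p) - f(x));
    # the theorem guarantees a root in (0, 1).
    pt = (x[0] + t*p[0], x[1] + t*p[1])
    return dot(grad_f(pt), p) - lhs

# Bisection on [0, 1]; h changes sign here for this example.
a, b = 0.0, 1.0
for _ in range(60):
    m = 0.5*(a + b)
    if h(a)*h(m) <= 0:
        b = m
    else:
        a = m
t = 0.5*(a + b)
print(t)  # a value in (0, 1) satisfying the identity
```

For this quadratic $f$, the map $t \mapsto \nabla f(x+tp)\cdot p$ is affine in $t$, so the bisection converges to $t = 1/2$ exactly, mirroring the familiar fact that the mean value point of a quadratic is the midpoint.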