Identity $f(x)=f(x^{*})+\int_{0}^{1}{\nabla f(x^{*}+te)^{T}e\, dt}$


This is a theorem from the book *Iterative Methods for Optimization*. Theorem 1.2.1. Let $f$ be twice continuously differentiable in a neighborhood of a line segment between points $x^{*}$ and $x = x^{*} + e \in \mathbb{R}^{n}$; then $$f(x)=f(x^{*})+\int_{0}^{1}{\nabla f(x^{*}+te)^{T}e\, dt}.$$

I can't see where this identity comes from, not even in $\mathbb{R}^{1}$. One way to find out might be to differentiate both sides of the equality, but that didn't work for me.



BEST ANSWER

Consider the line segment

$\sigma(t) = x^\ast + te \subset \Bbb R^n, \; t \in [0, 1]; \tag 1$

then

$\sigma(0) = x^\ast, \; \sigma(1) = x^\ast + 1e = x^\ast + e = x, \tag 2$

and

$\sigma'(t) = \dfrac{d\sigma(t)}{dt} = \dfrac{d(x^\ast + te)}{dt} = e; \tag 3$

we have

$f(\sigma(t)) = f(x^\ast + te), \tag 4$

we compute

$f'(\sigma(t)) = \dfrac{df(\sigma(t))}{dt} = \nabla f(\sigma(t))^T \sigma'(t) = \nabla f (x^\ast + te)^T e; \tag 5$

now, by what is essentially the fundamental theorem of calculus,

$f(x) - f(x^\ast) = f(\sigma(1)) - f(\sigma(0)) = \displaystyle \int_0^1 f'(\sigma(t)) \; dt. \tag 6$

It may be seen the fundamental theorem implies (6) by looking at its individual components in some basis of $\Bbb R^n$; I leave the details to the reader; if we now substitute (5) into (6) we find

$f(x) - f(x^\ast) = f(\sigma(1)) - f(\sigma(0)) = \displaystyle \int_0^1 \nabla f(x^\ast + te)^T e \; dt, \tag 7$

or

$f(x) = f(x^\ast) + \displaystyle \int_0^1 \nabla f(x^\ast + te)^T e \; dt, \tag 8$

the requisite result.
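The identity is easy to verify numerically. The sketch below checks it for one sample function of my own choosing, $f(x)=\sin(x_1)+x_2^2$, with arbitrarily chosen $x^*$ and $e$, approximating the line integral by the trapezoidal rule:

```python
import numpy as np

# Sample smooth function and its gradient (chosen for illustration only).
def f(x):
    return np.sin(x[0]) + x[1] ** 2

def grad_f(x):
    return np.array([np.cos(x[0]), 2.0 * x[1]])

x_star = np.array([0.3, -1.2])   # x^*
e = np.array([0.7, 0.5])         # e, so x = x^* + e
x = x_star + e

# Approximate \int_0^1 grad f(x^* + t e)^T e dt by the trapezoidal rule.
ts = np.linspace(0.0, 1.0, 1001)
integrand = np.array([grad_f(x_star + t * e) @ e for t in ts])
integral = np.sum((integrand[1:] + integrand[:-1]) / 2.0) * (ts[1] - ts[0])

lhs = f(x) - f(x_star)
print(abs(lhs - integral))  # agrees up to quadrature error
```

The two sides agree to within the $O(h^2)$ error of the trapezoidal rule, which is what (8) predicts.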


Just take the derivative of $f(x(t))=f(x^*+te)$; you get $$ \frac{df(x(t))}{dt}=\nabla f(x^*+te)\cdot e. $$ Upon integration, $$ f(x^*+e)=f(x^*)+\int_0^1\nabla f(x^*+te)\cdot e\,dt. $$ Here I assumed $f(x)\in\mathbb R$. In general, with vector-valued $f$ you get the Jacobian matrix $Df$ in place of the gradient. You can apply the same reasoning to a single component of $f$, e.g. $f_i(x)\in \mathbb R$, and then collect the results in matrix form.
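For completeness, the vector-valued statement alluded to above can be sketched as follows (here $f:\mathbb R^n\to\mathbb R^m$ and $Df$ denotes the $m\times n$ Jacobian; the component-wise reduction is exactly the scalar identity):

```latex
% Vector-valued version, obtained row by row from the scalar identity:
\[
f(x^* + e) = f(x^*) + \int_0^1 Df(x^* + te)\, e \; dt,
\]
% since each component f_i : \mathbb{R}^n \to \mathbb{R} satisfies
\[
f_i(x^* + e) = f_i(x^*) + \int_0^1 \nabla f_i(x^* + te)^T e \; dt.
\]
```

Stacking the $m$ scalar identities recovers the matrix form, since the rows of $Df$ are the transposed gradients $\nabla f_i^T$.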