Sorry for posting such a simple question, but does anybody know a derivation of Newton's decrement? A few hours of googling hasn't led to an answer (most sources just define it without derivation, see below), so I would like to borrow your knowledge.
Definition of Newton's Decrement
For instance,
- Page 486 of Professor Boyd's Convex Optimization book
- Some Lecture notes: Eq 7.1 / slide 2 / slide 3
define Newton's decrement as $$ \lambda(x) = \Big( \nabla f(x)^{T} \nabla^{2} f(x)^{-1} \nabla f(x) \Big)^{1/2} $$ This quantity is commonly used as a stopping criterion for Newton's method.
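For concreteness, the definition can be sketched numerically. This is a minimal illustration with a quadratic test function of my own choosing ($A$, $b$, and the test point are arbitrary assumptions, not from any of the references):

```python
import numpy as np

# Hypothetical convex quadratic: f(x) = 1/2 x^T A x - b^T x,
# with gradient A x - b and constant Hessian A (positive-definite).
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])

def newton_decrement(x):
    grad = A @ x - b
    hess = A
    # lambda(x) = sqrt(g^T H^{-1} g); solve the linear system
    # rather than forming the explicit inverse.
    return np.sqrt(grad @ np.linalg.solve(hess, grad))

print(newton_decrement(np.array([2.0, -1.0])))  # ≈ 2.93, far from the minimizer
print(newton_decrement(np.array([0.2, 0.4])))   # ≈ 0 at the minimizer A^{-1} b
```

Note that $\lambda(x) = 0$ exactly when $\nabla f(x) = 0$, which is why it works as a stopping criterion.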
We want to minimize $f(\mathbf{x})$. First, consider its second-order truncated Taylor series around $\mathbf{x_0}$:
$$f(\mathbf{x}) \approx \hat{f}(\mathbf{x})= f(\mathbf{x_0}) + [\mathbf{x}-\mathbf{x_0}]^T\,\nabla f(\mathbf{x_0}) + \frac{1}{2}\,[\mathbf{x}-\mathbf{x_0}]^T\,\nabla^2 f(\mathbf{x_0})\,[\mathbf{x}-\mathbf{x_0}]$$
Assuming the Hessian matrix $\nabla^2 f(\mathbf{x_0})$ is positive-definite, $\hat{f}(\mathbf{x})$ can be minimized in closed form:
$$\nabla \hat{f}(\mathbf{x^*}) = \nabla f(\mathbf{x_0}) + \nabla^2 f(\mathbf{x_0})\,[\mathbf{x^*}-\mathbf{x_0}] = \mathbf{0} \;\Rightarrow\; \mathbf{x^*} = \mathbf{x_0} - \nabla^2 f(\mathbf{x_0})^{-1}\,\nabla f(\mathbf{x_0})$$
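This step — that $\mathbf{x^*} = \mathbf{x_0} - \nabla^2 f(\mathbf{x_0})^{-1}\nabla f(\mathbf{x_0})$ makes the model's gradient vanish — can be checked numerically. A sketch, using a hypothetical smooth strictly convex test function of my own choosing:

```python
import numpy as np

# Hypothetical test function: f(x) = sum(exp(x_i)) + 0.5 * ||x||^2
def grad(x):
    return np.exp(x) + x

def hess(x):
    return np.diag(np.exp(x)) + np.eye(len(x))

x0 = np.array([1.0, -0.5])
g, H = grad(x0), hess(x0)

# Minimizer of the quadratic model, i.e. one full Newton step.
x_star = x0 - np.linalg.solve(H, g)

# The model's gradient g + H (x - x0) vanishes at x_star
# (up to floating-point error).
model_grad = g + H @ (x_star - x0)
print(model_grad)  # ~ [0, 0]
```

Note that $\mathbf{x^*}$ minimizes the *model* $\hat{f}$, not $f$ itself; for non-quadratic $f$, $\nabla f(\mathbf{x^*})$ is generally nonzero and the step is iterated.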
Substituting $\mathbf{x^*}$ back into $\hat{f}$ gives the minimum value:
$$\hat{f}(\mathbf{x^*})= f(\mathbf{x_0}) - [\nabla^2 f(\mathbf{x_0})^{-1}\nabla f(\mathbf{x_0})]^T\nabla f(\mathbf{x_0}) + \frac{1}{2}[\nabla^2 f(\mathbf{x_0})^{-1}\nabla f(\mathbf{x_0})]^T\nabla^2 f(\mathbf{x_0})[ \nabla^2 f(\mathbf{x_0})^{-1}\nabla f(\mathbf{x_0})]$$
So the decrease predicted by the quadratic model $\hat{f}(\mathbf{x})$ is:
$$f(\mathbf{x_0})-\hat{f}(\mathbf{x^*}) = \boxed{\frac{1}{2}\,\nabla f(\mathbf{x_0})^T\,\nabla^2 f(\mathbf{x_0})^{-1}\,\nabla f(\mathbf{x_0})} = \frac{1}{2}\,\lambda(\mathbf{x_0})^2$$
which is exactly half the squared Newton decrement defined above: $\lambda(\mathbf{x})^2/2$ estimates how much $f$ can still be reduced, which is why $\lambda$ serves as a stopping criterion.
(We used the fact that the Hessian, and hence its inverse, is symmetric, so $[\nabla^2 f(\mathbf{x_0})^{-1}\nabla f(\mathbf{x_0})]^T = \nabla f(\mathbf{x_0})^T\,\nabla^2 f(\mathbf{x_0})^{-1}$; this holds for twice continuously differentiable functions by Schwarz's theorem.)
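The identity $f(\mathbf{x_0})-\hat{f}(\mathbf{x^*}) = \tfrac{1}{2}\,\nabla f(\mathbf{x_0})^T\,\nabla^2 f(\mathbf{x_0})^{-1}\,\nabla f(\mathbf{x_0})$ can be verified numerically. For a quadratic $f$ the model is exact ($\hat{f} = f$), so the predicted and actual decreases coincide; the $A$, $b$, and starting point below are arbitrary assumptions for illustration:

```python
import numpy as np

# Hypothetical convex quadratic: f(x) = 1/2 x^T A x - b^T x.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])

def f(x):
    return 0.5 * x @ A @ x - b @ x

x0 = np.array([2.0, -1.0])
grad = A @ x0 - b                      # gradient at x0
x_star = x0 - np.linalg.solve(A, grad)  # Newton step = minimizer of the model
decrement_sq = grad @ np.linalg.solve(A, grad)  # lambda(x0)^2 = g^T H^{-1} g

# For a quadratic, the model is exact: f(x0) - f(x_star) == lambda^2 / 2.
print(f(x0) - f(x_star), 0.5 * decrement_sq)  # 4.3 4.3
```

For non-quadratic $f$, the two sides agree only to second order, but $\tfrac{1}{2}\lambda(\mathbf{x})^2$ remains a useful estimate of the remaining suboptimality near the minimum.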