Differentiation along a line search and its relationship with the gradient


I have a continuously differentiable function $f:\mathbb{R}^n\rightarrow \mathbb{R}$. During the line-search step of an optimization algorithm I need to differentiate $f(x+\alpha d)$ with respect to $\alpha$, where $\alpha \in \mathbb{R}_+$ and $d \in \mathbb{R}^n$. In other words, let $g:\mathbb{R}\rightarrow \mathbb{R}$ be the function $g(\alpha)=f(x+\alpha d)$; I need $g'(\alpha)$.

Is the following always true: $g'(\alpha)=\nabla f(x+\alpha d)^Td$?

Accepted answer:

Yes. You need the chain rule: for $\alpha \in \mathbb{R}$ let $h(\alpha)=x+\alpha d$, so that $g(\alpha)=f(h(\alpha))$. Since $f$ and $h$ are differentiable, $g$ is differentiable, and since $h'(\alpha)=d$,

$$g'(\alpha)=f'(h(\alpha))h'(\alpha)=\nabla f(x+\alpha d)^Td.$$
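To see the identity concretely, here is a minimal NumPy sketch comparing the chain-rule expression with a central finite difference of $g$; the quadratic $f$, the matrix `A`, and the points `x`, `d` are illustrative choices, not taken from the question.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)        # symmetric positive definite (illustrative)
x = rng.standard_normal(n)
d = rng.standard_normal(n)

def f(z):
    # Smooth test function: f(z) = 0.5 * z^T A z.
    return 0.5 * z @ A @ z

def grad_f(z):
    # Analytic gradient of f: A z (A is symmetric).
    return A @ z

alpha, eps = 0.3, 1e-6
g = lambda a: f(x + a * d)                            # g(alpha) = f(x + alpha d)
fd = (g(alpha + eps) - g(alpha - eps)) / (2 * eps)    # central difference for g'(alpha)
exact = grad_f(x + alpha * d) @ d                     # chain-rule expression above

print(fd, exact)   # the two values agree closely
```

Since $g$ is quadratic here, the central difference is exact up to rounding, so the two printed values match to high precision.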

Another answer:

Yes. It follows from the chain rule; since $f$ is differentiable, all of its partial derivatives exist, and we obtain \begin{equation} g'(\alpha) = \frac{df(\boldsymbol{x}+\alpha \boldsymbol{d})}{d\alpha} = \sum_{i=1}^n \frac{\partial f}{\partial x_i}(\boldsymbol{x}+\alpha\boldsymbol{d}) \frac{d(x_i + \alpha d_i)}{d\alpha} = \sum_{i=1}^n \frac{\partial f}{\partial x_i}(\boldsymbol{x}+\alpha\boldsymbol{d}) d_i = \nabla f(\boldsymbol{x} + \alpha \boldsymbol{d}) \cdot \boldsymbol{d}. \end{equation} The right-hand side is the directional derivative of $f$ in the direction $\boldsymbol{d}$ at the point $\boldsymbol{x}+\alpha \boldsymbol{d}$, which makes sense since $\boldsymbol{d}$ is precisely the direction of your line search.
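As a usage note, the case $\alpha = 0$ gives $g'(0)=\nabla f(\boldsymbol{x})^T\boldsymbol{d}$, which is exactly the slope that appears in the Armijo sufficient-decrease test of a backtracking line search. Below is a minimal sketch of that use; the quadratic $f$, the constants `c` and `rho`, and the steepest-descent choice of $\boldsymbol{d}$ are all illustrative assumptions, not from the answers.

```python
import numpy as np

def backtracking(f, grad_f, x, d, alpha0=1.0, c=1e-4, rho=0.5):
    """Shrink alpha until f(x + alpha d) <= f(x) + c * alpha * g'(0)."""
    slope = grad_f(x) @ d              # g'(0): directional derivative along d
    assert slope < 0, "d must be a descent direction"
    alpha = alpha0
    while f(x + alpha * d) > f(x) + c * alpha * slope:
        alpha *= rho                   # backtrack until sufficient decrease
    return alpha

# Illustrative quadratic and steepest-descent direction d = -grad f(x).
A = np.array([[3.0, 0.5], [0.5, 1.0]])
f = lambda z: 0.5 * z @ A @ z
grad_f = lambda z: A @ z
x = np.array([1.0, 2.0])
d = -grad_f(x)

alpha = backtracking(f, grad_f, x, d)
print(alpha, f(x + alpha * d) < f(x))  # accepted step strictly decreases f
```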