I have a continuously differentiable function $f:\mathbb{R}^n\rightarrow \mathbb{R}$. Now during line search part of optimization I need to differentiate $f(x+\alpha d)$ wrt $\alpha$, where $\alpha \in \mathbb{R}_+$ and $d \in \mathbb{R}^n$. In other words, let $g:\mathbb{R}\rightarrow \mathbb{R}$ be a function such that $g(\alpha)=f(x+\alpha d)$ that needs to be differentiated.
Is the following always true: $g'(\alpha)=\nabla f(x+\alpha d)^Td$?
Yes. You need the chain rule: for $\alpha \in \mathbb{R}$ let $h(\alpha)=x+\alpha d$, so that $g(\alpha)=f(h(\alpha))$. Since $f$ is differentiable and $h$ is affine (hence differentiable with $h'(\alpha)=d$), $g$ is differentiable and
$$g'(\alpha)=f'(h(\alpha))h'(\alpha)=\nabla f(x+\alpha d)^Td.$$
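The identity is easy to sanity-check numerically. The sketch below (a hypothetical example using a quadratic $f(x)=x^TAx$, chosen because its gradient $\nabla f(x)=2Ax$ is known in closed form) compares the chain-rule formula against a central finite-difference approximation of $g'(\alpha)$:

```python
import numpy as np

# Hypothetical test function: f(x) = x^T A x with A symmetric,
# so that the gradient nabla f(x) = 2 A x is available exactly.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
A = (A + A.T) / 2  # symmetrize

def f(x):
    return x @ A @ x

def grad_f(x):
    return 2 * A @ x

x = rng.standard_normal(3)      # base point
d = rng.standard_normal(3)      # search direction
alpha = 0.7                     # step length

# Chain-rule formula: g'(alpha) = nabla f(x + alpha d)^T d
analytic = grad_f(x + alpha * d) @ d

# Central finite difference of g(alpha) = f(x + alpha d)
eps = 1e-6
numeric = (f(x + (alpha + eps) * d) - f(x + (alpha - eps) * d)) / (2 * eps)

print(abs(analytic - numeric))
```

The two values agree up to finite-difference error, which is what the chain rule predicts for any differentiable $f$.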