Differentiating a vector valued function


If I have a function $y(x)=f(a+x(b-a))$ where $a, b$ are constant vectors, and $y: \mathbb{R} \rightarrow \mathbb{R}$, what would $\frac{dy}{dx}$ be in terms of $f$? I know the chain rule would be involved, but then I would obtain a vector $b-a$ times a scalar $f'(a+x(b-a))$, which isn't a real number. Would partial derivatives be involved? In particular, I am trying to obtain a Taylor series for $y$ around $x=0$, and so would need to evaluate the derivative of $y$ at $x=0$. Many thanks!

$y(x) = f(g(x))$ where $g(x) = a + x(b - a)$. By the chain rule, $y'(x) = f'(g(x)) g'(x)$.

Note that $g'(x) = b - a$ (a column vector) for all $x$. Because $f:\mathbb R^n \to \mathbb R$, $f'(u)$ is not a scalar but a $1 \times n$ row vector (the gradient, transposed) for any $u \in \mathbb R^n$. Hence the product $f'(g(x)) g'(x)$ makes sense: it is the scalar $\nabla f(g(x)) \cdot (b-a)$, the directional derivative of $f$ along $b-a$. In particular, for your Taylor series, $y'(0) = \nabla f(a) \cdot (b-a)$.
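A quick numerical sketch may help. Here I pick a concrete $f$ (my own choice, not from the question), namely $f(u) = u \cdot u$ with gradient $2u$, and check the chain-rule formula against a finite difference at $x=0$:

```python
import numpy as np

# Hypothetical concrete choice of f: f(u) = u . u, with gradient 2u.
def f(u):
    return float(u @ u)

def grad_f(u):
    # The 1 x n Jacobian (row vector) of f at u
    return 2.0 * u

a = np.array([1.0, 2.0])
b = np.array([4.0, 0.0])

def y(x):
    # y(x) = f(g(x)) with g(x) = a + x(b - a)
    return f(a + x * (b - a))

def dy_dx(x):
    # Chain rule: y'(x) = f'(g(x)) g'(x) = grad_f(g(x)) . (b - a)
    return float(grad_f(a + x * (b - a)) @ (b - a))

# Compare against a central finite difference at x = 0
h = 1e-6
numeric = (y(h) - y(-h)) / (2 * h)
print(dy_dx(0.0), numeric)
```

For these vectors, $\nabla f(a)\cdot(b-a) = 2a\cdot(b-a) = 2(3) + 4(-2) = -2$, and the finite difference agrees to within floating-point error.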