Obtaining the gradient of a vector-valued function


I have read that obtaining the gradient of a vector-valued function $f:\mathbb{R}^n \to \mathbb{R}^m$ is the same as obtaining the Jacobian of this function.

Nevertheless, this function has only one argument (the vector $\mathbf{x} \in \mathbb{R}^n$).

How can I take the gradient of a function $F(\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_n)$ with respect to some $\mathbf{x}_i$?


There are 2 best solutions below


The 'gradient' of something usually means taking all of its partial derivatives. Therefore,

"taking the gradient of a function $F(x_1,x_2,\ldots,x_n)$ with respect to some $x_i$"

is not really a thing. However, you are right that all partial derivatives of a vector-valued function, arranged in an $m\times n$ matrix, are usually referred to as the Jacobian.
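As a sketch of this, here is a minimal finite-difference approximation of the $m\times n$ Jacobian: one column per input coordinate. The function names and the example map $f:\mathbb{R}^2\to\mathbb{R}^3$ are hypothetical illustrations, not from the original post.

```python
import numpy as np

def numerical_jacobian(f, x, eps=1e-6):
    """Approximate the m x n Jacobian of f: R^n -> R^m at x
    by central differences: column j holds the partials w.r.t. x_j."""
    x = np.asarray(x, dtype=float)
    m = np.atleast_1d(f(x)).shape[0]
    n = x.shape[0]
    J = np.empty((m, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = eps
        J[:, j] = (f(x + e) - f(x - e)) / (2 * eps)
    return J

# Example: f(x, y) = (x*y, x + y, sin(x)) maps R^2 -> R^3,
# so its Jacobian is a 3 x 2 matrix.
f = lambda v: np.array([v[0] * v[1], v[0] + v[1], np.sin(v[0])])
J = numerical_jacobian(f, np.array([1.0, 2.0]))
print(J.shape)  # (3, 2): m rows (outputs), n columns (inputs)
```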


A gradient of something with $n$ input indices and $m$ output indices will have $n+m$ indices. In this sense it behaves like a tensor outer product. For example: a $1$-tensor in (say, a $3\times 1$ vector) and a $1$-tensor out (a $3\times 1$ vector).

Since $1+1 = 2$, your output becomes a $2$-tensor (= "matrix"); this is common for 3D vector-field functions.
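To illustrate the $1+1=2$ index count, here is a hypothetical 3D vector field $F:\mathbb{R}^3\to\mathbb{R}^3$ together with its hand-derived Jacobian; the field itself is my own example, not from the answer.

```python
import numpy as np

# Hypothetical 3D vector field F: R^3 -> R^3.
# One input index + one output index => its "gradient" is a
# 2-tensor, i.e. a 3 x 3 matrix.
def F(v):
    x, y, z = v
    return np.array([x**2 * y, y * z, z * x])

def jacobian_F(v):
    """Hand-derived Jacobian of F: row i is the gradient of output i."""
    x, y, z = v
    return np.array([[2 * x * y, x**2, 0.0],
                     [0.0,       z,    y  ],
                     [z,         0.0,  x  ]])

p = np.array([1.0, 2.0, 3.0])
J = jacobian_F(p)
print(J.shape)  # (3, 3) -- a 2-tensor
```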