I know that the backpropagation algorithm is used to compute the gradient of the error function (a scalar function) of a neural network with respect to its weights.
In a paper I read that the backward pass of a neural network could be used to compute the Jacobian matrix of a function $F:\mathbb{R}^N\rightarrow\mathbb{R}^M$ (in particular it says that it is possible to compute $J_F^T\cdot v$ "propagating the vector v backward through $F$").
Unfortunately, I don't properly understand how to do that.
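My current guess, on a toy example, is the following: if $F$ is a small network $F(x) = W_2 \tanh(W_1 x)$, then "propagating $v$ backward" would mean applying the transpose of each layer's local Jacobian in reverse order. Here is a minimal NumPy sketch of that idea (the weights `W1`, `W2` and the function `vjp` are my own invented illustration, not anything from the paper):

```python
import numpy as np

# Toy example: F(x) = W2 @ tanh(W1 @ x), so F maps R^N -> R^M.
# W1 and W2 are arbitrary weights, just for illustration.
rng = np.random.default_rng(0)
N, H, M = 4, 5, 3
W1 = rng.standard_normal((H, N))
W2 = rng.standard_normal((M, H))

def forward(x):
    h = W1 @ x          # pre-activation of the hidden layer
    a = np.tanh(h)      # hidden activation
    y = W2 @ a          # output, a vector in R^M
    return y, a

def vjp(x, v):
    """Propagate v (in R^M) backward through F: returns J_F(x)^T @ v."""
    _, a = forward(x)
    g = W2.T @ v            # backward through the output linear layer
    g = g * (1.0 - a**2)    # backward through tanh (elementwise derivative)
    return W1.T @ g         # backward through the first linear layer

x = rng.standard_normal(N)
v = rng.standard_normal(M)

# Sanity check against the explicit Jacobian J = W2 @ diag(tanh'(W1 x)) @ W1
_, a = forward(x)
J = W2 @ np.diag(1.0 - a**2) @ W1
print(np.allclose(vjp(x, v), J.T @ v))  # should print True
```

On this toy case the backward pass does reproduce $J_F^T \cdot v$, but I'm not sure whether this is what the paper actually means, or how it generalizes to an arbitrary network.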