Formulating partial derivative in terms of matrices


https://www.tensorflow.org/api_docs/python/tf/gradients

I am using the function tensorflow.gradients, which takes tensors $y$ and $x$ and returns the partial derivative of $y$ with respect to $x$. When multiple $x$'s are given, it returns, for each $x$, the sum of the partial derivatives $\partial y/\partial x$ over all components of $y$.

But I have only a matrix $P$ of shape (time_n, freq_n). The first dimension holds temporal information and the second dimension holds frequency information. If I want to calculate the partial derivative $\partial P/\partial t$ (with respect to the first dimension, time), what should my $y$ and $x$ be?

I think $y = P$, but what should $x$ be, and what dimension should it have?
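One interpretation worth noting (an assumption, since the question leaves it open): if $P$ holds sampled data rather than a symbolic function of a time tensor $t$, then tf.gradients has nothing to differentiate through, because the graph contains no dependency of $P$ on $t$. A common stand-in is a finite difference along the time axis, e.g. with NumPy's np.gradient. The shapes and the toy data below are made up for illustration:

```python
import numpy as np

# Hypothetical sampled data: P has shape (time_n, freq_n).
time_n, freq_n = 5, 3
t = np.linspace(0.0, 1.0, time_n)        # time stamps (assumed uniformly spaced)
P = np.outer(t**2, np.ones(freq_n))      # toy data: each column is t^2

# Finite-difference approximation of dP/dt along axis 0 (time).
dP_dt = np.gradient(P, t, axis=0)

print(dP_dt.shape)  # same shape as P: (5, 3)
```

For this toy data the interior rows recover $2t$ exactly, since central differences are exact for quadratics; the boundary rows use one-sided differences and are only approximate.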


Perhaps what you mean is that you have an $n$-dimensional vector of times $t$ and a matrix $P$ such that $tP = \nu$ calculates an $n$-dimensional vector of frequencies $\nu$, and you would like to calculate the gradient of the function $f: t \mapsto tP$ with respect to $t$.

In this case, the gradient is just $P$ itself; in general, the derivative of a linear map is the map itself. If this is unfamiliar, check that $f_i = \sum_j t_jP_{ji}$, so that $\nabla f_i = \left(\dfrac{\partial f_i}{\partial t_j}\right)_j = (P_{ji})_j$, the $i$-th column of $P$. The full gradient is then the list of all of these, $(\nabla f_i)_i = ((P_{ji})_j)_i = P$.
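This claim is easy to check numerically. The sketch below (shapes $n=4$, $m=3$ are arbitrary choices, not from the question) builds the Jacobian of $f(t) = tP$ column by column with central differences; with the row-index convention $J_{ij} = \partial f_i/\partial t_j = P_{ji}$, the result is $P^\top$, i.e. $P$ up to transposition:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 3                       # arbitrary dimensions for the sketch
P = rng.standard_normal((n, m))
t = rng.standard_normal(n)

def f(t):
    # The linear map from the answer: f_i = sum_j t_j * P_{ji}.
    return t @ P

# Numerical Jacobian J[i, j] ~ df_i/dt_j via central differences.
eps = 1e-6
J = np.empty((m, n))
for j in range(n):
    e = np.zeros(n)
    e[j] = eps
    J[:, j] = (f(t + e) - f(t - e)) / (2 * eps)

# For a linear map, the Jacobian equals the matrix itself (transposed here
# only because of the row/column indexing convention chosen above).
print(np.allclose(J, P.T))  # True
```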