Let $P \in \mathbb{R}^{N\times N}$ be an orthogonal matrix and $f: \mathbb{R}^{N \times N} \to \mathbb{R}^{N \times N}$ be given by $f(M) := P^T M P$. I am reading about random matrix theory and an exercise is to calculate the Jacobian matrix of $f$ and its Jacobian determinant.
Question: How to calculate the Jacobian of a matrix-valued function? How is it defined? Somehow the notation here confuses me. I suspect that $Jf = P^T P$ and thus $\det(Jf) = 1\cdot 1 = 1$.
I always find it helpful to use the Fréchet derivative approach for these questions. Simply perturb the input by a small matrix $H$ and you get $$f(M+H)=P^T(M+H)P=P^TMP+P^THP=f(M)+f'(M)(H),$$ where $$f'(M)=(H\mapsto P^THP)$$ is a linear, bounded map. It is easy to see that this must be the derivative, since the remainder term is $0$, which trivially satisfies the condition $$\lim_{H\to 0}\frac{\|f(M+H)-f(M)-f'(M)(H)\|}{\|H\|}=0.$$
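A quick numerical sketch of this (using a hypothetical orthogonal $P$ built from a QR decomposition): since $f$ is linear, the remainder $f(M+H)-f(M)-P^THP$ is exactly zero, not merely $o(\|H\|)$.

```python
import numpy as np

# Hypothetical setup: an orthogonal P from the QR factorization of a random matrix.
rng = np.random.default_rng(0)
N = 4
P, _ = np.linalg.qr(rng.standard_normal((N, N)))  # P is orthogonal: P.T @ P = I

def f(M):
    return P.T @ M @ P

M = rng.standard_normal((N, N))
H = rng.standard_normal((N, N))

# Remainder of the Fréchet expansion: f(M+H) - f(M) - f'(M)(H).
# For a linear f it vanishes identically (up to floating-point rounding).
remainder = f(M + H) - f(M) - P.T @ H @ P
print(np.linalg.norm(remainder))  # on the order of machine epsilon
```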
And as you probably know, you can turn any linear map on a finite-dimensional vector space into a matrix by evaluating it on a basis and decomposing the results with respect to another basis.
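As a sketch of that recipe (the variable names are mine, not from the exercise): evaluate $H\mapsto P^THP$ on the standard basis matrices $E_{ij}$ and stack the vectorized results as columns. Under the column-major vec convention, the identity $\operatorname{vec}(AXB)=(B^T\otimes A)\operatorname{vec}(X)$ gives the Jacobian as the Kronecker product $P^T\otimes P^T$, whose determinant is $\det(P)^{2N}=1$ since $\det(P)=\pm 1$.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 3
P, _ = np.linalg.qr(rng.standard_normal((N, N)))  # orthogonal P

# Build the N^2 x N^2 Jacobian by evaluating H -> P^T H P on the basis E_ij.
# Column-major vec puts the basis matrix E_ij at column index i + N*j.
J = np.zeros((N * N, N * N))
for j in range(N):
    for i in range(N):
        E = np.zeros((N, N))
        E[i, j] = 1.0
        J[:, i + N * j] = (P.T @ E @ P).ravel(order="F")  # column-major vec

# Same matrix via vec(AXB) = (B^T kron A) vec(X) with A = P^T, B = P.
assert np.allclose(J, np.kron(P.T, P.T))

# det(P^T kron P^T) = det(P)^(2N) = 1, as det(P) = ±1 for orthogonal P.
print(np.linalg.det(J))
```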
Is that answer enough?
(EDIT: Also notice that since $f$ is linear, its derivative at any point $M$ is again $f$. This is similar to how, for linear real- or vector-valued functions, the derivative is the scalar/matrix of the linear transformation they represent.)