Jacobian and determinant of an orthogonal transformation


Let $P \in \mathbb{R}^{N\times N}$ be an orthogonal matrix and $f: \mathbb{R}^{N \times N} \to \mathbb{R}^{N \times N}$ be given by $f(M) := P^T M P$. I am reading about random matrix theory and an exercise is to calculate the Jacobian matrix of $f$ and its Jacobian determinant.

Question: How to calculate the Jacobian of a matrix-valued function? How is it defined? Somehow the notation here confuses me. I suspect that $Jf = P^T P$ and thus $\det(Jf) = 1\cdot 1 = 1$.


2 Answers


I find it always helpful to use the Fréchet derivative approach for these questions. Simply perturb the input by a small matrix $H$ and you get $$f(M+H)=P^T(M+H)P=P^TMP+P^THP=f(M)+f'(M)(H),$$ where $$f'(M)=(H\mapsto P^THP)$$ is a bounded linear map. It is easy to see that this must be the derivative, since the remainder term is $0$, which trivially fulfills the condition $$\lim_{H\to 0}\frac{\|f(M+H)-f(M)-f'(M)(H)\|}{\|H\|}=0.$$
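A quick numeric sanity check of the argument above (an illustrative sketch; the variable names and the choice of random $P$, $M$, $H$ are my own): because $f$ is linear, the remainder $f(M+H)-f(M)-f'(M)(H)$ vanishes identically, not just to first order.

```python
import numpy as np

N = 3
rng = np.random.default_rng(1)

# Random orthogonal P via QR decomposition.
P, _ = np.linalg.qr(rng.standard_normal((N, N)))

f = lambda M: P.T @ M @ P
df = lambda M, H: P.T @ H @ P  # Frechet derivative; note it does not depend on M

M = rng.standard_normal((N, N))
H = 1e-6 * rng.standard_normal((N, N))  # small perturbation

remainder = f(M + H) - f(M) - df(M, H)
print(np.linalg.norm(remainder))  # zero up to floating-point rounding
```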

And as you probably know, you can turn any linear map on a finite-dimensional vector space into a matrix by evaluating it on a basis and then decomposing the results with respect to another basis.
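That recipe can be carried out directly (a sketch; the dimension, seed, and helper names are my own choices): evaluate $f$ on the standard basis matrices $E_{ij}$, stack the vectorized results as columns of an $N^2\times N^2$ matrix, and compare with the closed form $\operatorname{vec}(P^TMP)=(P^T\otimes P^T)\operatorname{vec}(M)$. Since $\det(A\otimes B)=\det(A)^N\det(B)^N$ and $\det(P)=\pm 1$, the Jacobian determinant is $\det(P)^{2N}=1$.

```python
import numpy as np

N = 4
rng = np.random.default_rng(0)

# Random orthogonal P via QR decomposition.
P, _ = np.linalg.qr(rng.standard_normal((N, N)))

f = lambda M: P.T @ M @ P

# Column (j*N + i) of Jf is vec(f(E_ij)), using the column-major vec
# that matches the identity vec(A X B) = (B^T kron A) vec(X).
Jf = np.zeros((N * N, N * N))
for j in range(N):
    for i in range(N):
        E = np.zeros((N, N))
        E[i, j] = 1.0
        Jf[:, j * N + i] = f(E).flatten(order="F")  # column-major vec

# Closed form: vec(P^T M P) = (P^T kron P^T) vec(M).
assert np.allclose(Jf, np.kron(P.T, P.T))

print(np.linalg.det(Jf))  # det(P)^{2N} = 1, up to rounding
```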

Is that answer enough?

(EDIT: Also notice that since $f$ is linear, its derivative at any point $M$ is again $f$. This is similar to how, for linear real- or vector-valued functions, the derivative is the scalar/matrix of the linear transformation they represent.)


The Jacobian matrix of a transformation is the matrix that represents its best linear approximation. In this case $f(M)$ is already linear in $M$, so $f$ is its own best linear approximation. Note, however, that in general no single $N\times N$ matrix $F$ satisfies $FM=P^TMP$ for all $M$ (and $F=P^TMPM^{-1}$ would depend on $M$, which a Jacobian cannot). The Jacobian instead acts on the $N^2$ entries of $M$: stacking the columns of $M$ into $\operatorname{vec}(M)$ and using the identity $\operatorname{vec}(AXB)=(B^T\otimes A)\operatorname{vec}(X)$ gives

$\operatorname{vec}(P^TMP)=(P^T\otimes P^T)\operatorname{vec}(M),$

so $Jf=P^T\otimes P^T$ and $\det(Jf)=\det(P)^N\det(P)^N=\det(P)^{2N}=1$.