How to derive the Jacobian of an operator


I have an operator defined by $\displaystyle{H(P)=P^{2}=PP}$, and I want to formulate the superoperator of this operator; in this case, we let the superoperator be the operator derivative. According to the literature, the "operator derivative" is simply the Jacobian of the function (of operators), where one treats the input and output operators as vectors by expanding the space of operators in some basis.

I have an example for the cubed operator, $\displaystyle{H(P)=P^{3}=PPP}$, which under the derivative operator yields:

${\displaystyle {\frac {\Delta H}{\Delta P}}[X]=XP^{2}+PXP+P^{2}X}$

The terms look related to the derivative of $P^3$ with respect to $P$ (each contains two factors of $P$), but this is just a guess; I am not sure.

How did they get this?
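As a numerical sanity check of the quoted formula, one can compare a finite-difference directional derivative of $H(P)=P^3$ against $XP^2+PXP+P^2X$. A minimal sketch with random NumPy matrices (the size, seed, and step are arbitrary choices for illustration):

```python
import numpy as np

# Random matrices standing in for the operators P and X.
rng = np.random.default_rng(0)
n = 4
P = rng.standard_normal((n, n))
X = rng.standard_normal((n, n))

H = lambda A: A @ A @ A  # H(P) = P^3

# Directional derivative of H at P in the direction X, by finite differences.
h = 1e-6
fd = (H(P + h * X) - H(P)) / h

# The claimed formula: XP^2 + PXP + P^2X.
exact = X @ P @ P + P @ X @ P + P @ P @ X

print(np.allclose(fd, exact, atol=1e-4))  # agreement up to O(h) error
```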


Accepted answer (by whpowell96):

The easiest way to obtain results like this is to compute $\frac{d}{d\epsilon}H(P + \epsilon V)\big|_{\epsilon=0}$, which is the directional derivative of $H$ at $P$ in the direction $V$, and then manipulate that expression into a linear map applied to $V$; that linear map, valid for every direction $V$, is the Jacobian at $P$. The first step can be shortcut by simply expanding $H(P+V)$ and keeping only the terms linear in $V$.

For this problem, we have $$ H(P+V) = (P+V)^3 = (P+V)(P+V)(P+V) = P^3 + VP^2 + PVP + P^2V + \mathcal{O}(V^2). $$ The linear term is the directional derivative of $H$ at $P$ in the direction of $V$, which is given by $VP^2 + PVP + P^2V$ and matches what you wanted. Sometimes it is possible to manipulate this equation into an explicit linear map acting on $V$, but with matrix-valued functions sometimes you just have to leave it like this.
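The claim that everything beyond the linear term is $\mathcal{O}(V^2)$ can be checked numerically: scaling the direction by $t$, the remainder $H(P+tV) - H(P) - t\,(VP^2+PVP+P^2V)$ should shrink like $t^2$. A sketch (random matrices and step sizes chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
P = rng.standard_normal((n, n))
V = rng.standard_normal((n, n))

H = lambda A: A @ A @ A
lin = V @ P @ P + P @ V @ P + P @ P @ V  # the linear term

# Remainder after subtracting the linear term should be O(t^2):
for t in (1e-1, 1e-2, 1e-3):
    rem = H(P + t * V) - H(P) - t * lin
    print(t, np.linalg.norm(rem))  # norm shrinks roughly 100x per step
```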

Second answer:

In my experience the approach of whpowell96's answer is the most efficient way to do these calculations.

However, it is sometimes helpful to know how to do these calculations by acting on the components of your operators in a given basis. Note that I will assume a finite-dimensional vector space, since such componentwise calculations are not always possible in infinite dimensions.

Let $e_1, \dots, e_n$ be an orthonormal basis for our vector space. Then we can express our operators $H$ and $P$ in matrix form according to the following,

$$ H_{ij} = \langle e_i, H e_j\rangle \qquad P_{ij} = \langle e_i, P e_j\rangle$$

Our function $H(P)=P^3$ can be written in component form as the following,

$$ H_{ij} = \sum_k \sum_l P_{ik} P_{kl} P_{lj}$$

Thinking of each component, $P_{ab}$, as an independent variable we differentiate $H$ with respect to $P$ by evaluating the partial derivative $\partial H_{ij} / \partial P_{ab}$.

$$ \frac{\partial H_{ij}}{\partial P_{ab}} = \sum_k \sum_l \frac{\partial }{\partial P_{ab}} \left( P_{ik} P_{kl} P_{lj}\right)$$

$$ = \sum_k \sum_l \left( \frac{\partial P_{ik}}{\partial P_{ab}} P_{kl} P_{lj} + P_{ik} \frac{\partial P_{kl}}{\partial P_{ab}} P_{lj} + P_{ik} P_{kl} \frac{\partial P_{lj}}{\partial P_{ab}} \right)$$

$$ = \sum_k \sum_l \left( \delta_{ia} \delta_{kb} P_{kl} P_{lj} + P_{ik} \delta_{ka}\delta_{lb} P_{lj} + P_{ik} P_{kl} \delta_{la}\delta_{jb} \right)$$

$$ = \sum_k \sum_l\delta_{ia} \delta_{kb} P_{kl} P_{lj} + \sum_k \sum_lP_{ik} \delta_{ka}\delta_{lb} P_{lj} + \sum_k \sum_lP_{ik} P_{kl} \delta_{la}\delta_{jb} $$

$$ = \delta_{ia} \sum_l P_{bl} P_{lj} + P_{ia} P_{bj} + \delta_{jb} \sum_k P_{ik} P_{ka} $$

$$ \boxed{\frac{\partial H_{ij}}{\partial P_{ab}} = \delta_{ia} (P^2)_{bj} + P_{ia} P_{bj} + \delta_{jb} (P^2)_{ia} } $$
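The boxed formula can be verified against numerical partial derivatives: perturb one entry $P_{ab}$ at a time and compare the finite-difference change in $H_{ij}$ with the four-index tensor built from the formula. A sketch, using `einsum` to assemble the tensor (matrix size and step are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
P = rng.standard_normal((n, n))
P2 = P @ P
I = np.eye(n)

# Boxed formula as a 4-index tensor J[i, j, a, b]:
#   delta_ia (P^2)_bj + P_ia P_bj + delta_jb (P^2)_ia
J = (np.einsum('ia,bj->ijab', I, P2)
     + np.einsum('ia,bj->ijab', P, P)
     + np.einsum('jb,ia->ijab', I, P2))

# Compare each slice J[:, :, a, b] with a finite-difference partial derivative.
h = 1e-6
for a in range(n):
    for b in range(n):
        E = np.zeros((n, n)); E[a, b] = h
        num = ((P + E) @ (P + E) @ (P + E) - P @ P @ P) / h
        assert np.allclose(J[:, :, a, b], num, atol=1e-3)
print("boxed formula matches finite differences")
```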

Now note what happens when we apply this operator to $X_{ab}$

$$ \frac{\partial H_{ij}}{\partial P_{ab}} \left(X_{ab} \right) = \sum_a \sum_b \frac{\partial H_{ij}}{\partial P_{ab}} X_{ab} $$

$$= \sum_a \sum_b \delta_{ia} (P^2)_{bj} X_{ab} + \sum_a\sum_b P_{ia} P_{bj} X_{ab} + \sum_a \sum_b \delta_{jb} (P^2)_{ia} X_{ab} $$

$$= \sum_b X_{ib} (P^2)_{bj} + \sum_a\sum_b P_{ia} X_{ab} P_{bj} + \sum_a (P^2)_{ia} X_{aj} $$

$$= (XP^2)_{ij} + (PXP)_{ij} + (P^2X)_{ij} $$
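The contraction above can also be carried out with `einsum`, and flattening the four-index tensor shows the superoperator as an $n^2 \times n^2$ matrix acting on the vectorized $X$, which is exactly the "treat operators as vectors" picture from the question. A sketch with random matrices (sizes and seed illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
P = rng.standard_normal((n, n))
X = rng.standard_normal((n, n))
P2 = P @ P
I = np.eye(n)

# Jacobian tensor J[i, j, a, b] from the boxed formula.
J = (np.einsum('ia,bj->ijab', I, P2)
     + np.einsum('ia,bj->ijab', P, P)
     + np.einsum('jb,ia->ijab', I, P2))

# Contract with X_ab: should reproduce XP^2 + PXP + P^2X.
lhs = np.einsum('ijab,ab->ij', J, X)
rhs = X @ P2 + P @ X @ P + P2 @ X
print(np.allclose(lhs, rhs))

# Flattening gives the superoperator as an n^2-by-n^2 matrix on vec(X)
# (row-major vectorization, matching NumPy's reshape order).
S = J.reshape(n * n, n * n)
print(np.allclose((S @ X.reshape(-1)).reshape(n, n), rhs))
```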