The Taylor expansion of the exponential of a multi-parameter group, and the derivative


Starting from $U(t)=e^{-itH}$, I can Taylor expand around an infinitesimal transformation as follows: $U(\delta t)\approx 1-i\delta tH$.

Now, say I have a function as follows:

$$ f(x+\delta x) = ( 1-i\delta x H )f(x) $$

I can rewrite the equation as follows:

$$ \frac{f(x+\delta x)-f(x)}{\delta x}=-iH f(x) $$

But the left-hand side is simply the derivative of $f(x)$ with respect to $x$, in the limit $\delta x \to 0$. So I recover the definition of the derivative.
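This limit is easy to check numerically. Here is a minimal sketch; the particular values of $H$ and $f$ are illustrative choices (not from the post), with $f(x)=e^{-ixH}$ so that $f'=-iHf$ holds exactly:

```python
# Numerical check that (f(x+dx) - f(x))/dx -> -iH f(x) as dx -> 0.
import cmath

H = 2.0                               # an arbitrary real "Hamiltonian" number
f = lambda x: cmath.exp(-1j * x * H)  # f solves f' = -iH f

x, dx = 0.7, 1e-6
lhs = (f(x + dx) - f(x)) / dx         # finite-difference quotient
rhs = -1j * H * f(x)                  # the claimed derivative -iH f(x)
print(abs(lhs - rhs))                 # small, of order dx
```

The residual shrinks linearly with $\delta x$, as expected for a first-order Taylor truncation.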


My question is what to do in the multi-variable case, with vectors thrown in. Consider instead the following exponential:

$$ G(x,y,t)=e^{- x \pmatrix{1&0 \\0&-1} - y \pmatrix{0&1\\1&0} - t\pmatrix{0&-1\\1&0} } $$

We can write this using the Pauli matrices, with $i$ standing for the matrix $\pmatrix{0&-1\\1&0}$ (a $2\times 2$ representation of the imaginary unit):

$$ G(x,y,t)=e^{- \sigma_3 x - \sigma_1 y - i t} $$

Its Taylor expansion is as follows:

$$ G(\delta x,\delta y,\delta t) \approx 1 - \sigma_3 \delta x - \sigma_1 \delta y - i \delta t $$

But unlike the previous case, it is not clear to me how I can recover the definition of some multi-variable derivative from this endeavour:

$$ h(x+\delta x,y + \delta y, t+ \delta t)-h(x,y,t) \approx \left( -\sigma_3\, \delta x -\sigma_1\,\delta y - i\, \delta t \right) h(x,y,t) $$

Is that basically a gradient-ish?
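As a sanity check, the first-order expansion of $G$ above can be verified numerically. A sketch, assuming `scipy` is available for the matrix exponential:

```python
# Check G(dx, dy, dt) ~ I - sigma_3 dx - sigma_1 dy - J dt to first order.
import numpy as np
from scipy.linalg import expm

s3 = np.array([[1., 0.], [0., -1.]])   # sigma_3
s1 = np.array([[0., 1.], [1., 0.]])    # sigma_1
J  = np.array([[0., -1.], [1., 0.]])   # the 2x2 stand-in for "i"

def G(x, y, t):
    return expm(-x * s3 - y * s1 - t * J)

dx, dy, dt = 1e-4, 2e-4, -1e-4
approx = np.eye(2) - s3 * dx - s1 * dy - J * dt
err = np.abs(G(dx, dy, dt) - approx).max()
print(err)                             # of order delta^2
```

The error is quadratic in the step sizes, confirming that the linear term is exactly the one in the expansion.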


$ \def\l{\left} \def\r{\right} \def\p{\partial} \def\g#1#2{\frac{\p #1}{\p #2}} \def\gg#1#2{\frac{d #1}{d #2}} \def\m#1{ \left[\begin{array}{c}#1\end{array}\right] } $Define block-matrix analogs of the ${\mathbb R}^{2}$ basis vectors $$ E_1 = \l(e_1\otimes I_n\r) = \m{I_n\\0_n} \qquad E_2 = \l(e_2\otimes I_n\r) = \m{0_n\\I_n} $$ Now suppose that we are given a matrix $Y\in{\mathbb R}^{n\times n}$ and an analytic function $f(z)$.
Further suppose that $Y$ is a function of a scalar parameter, $Y=Y(\alpha)$; then $F=f(Y)=F(\alpha)$.
Let's use the following subscript notation to denote the derivative with respect to $\alpha$ $$Y_\alpha=\gg{Y}{\alpha} \qquad F_\alpha=\gg{F}{\alpha}$$

Then, using block-triangular matrices, one can write $$ X^{(\alpha)} = \m{Y&Y_\alpha\\0&Y} \qquad G^{(\alpha)} = \m{F&F_\alpha\\0&F} \;=\; f\l(X^{(\alpha)}\r) $$

One can use the $\{E_k\}$ matrices to write algebraic formulas, but they're not as concise $$\eqalign{ X^{(\alpha)} &= E_1YE_1^T+E_2YE_2^T+E_1Y_\alpha E_2^T \\ G^{(\alpha)} &= E_1FE_1^T+E_2FE_2^T+E_1F_\alpha E_2^T \\ }$$

A better use for these matrices is to extract the derivative $$ F_\alpha = E_1^TG^{(\alpha)}E_2 $$

If $Y$ depends on several scalar variables, then it has several partial derivatives $$ Y = Y(\alpha,\beta,\theta) \qquad Y_\alpha=\g{Y}{\alpha} \qquad Y_\beta=\g{Y}{\beta} \qquad Y_\theta=\g{Y}{\theta} $$

We can write the partial derivatives of $F=f(Y)$ with respect to these parameters as $$\eqalign{ &X^{(\alpha)}=\m{Y&Y_\alpha\\0&Y} \qquad &X^{(\beta)}=\m{Y&Y_\beta\\0&Y} \qquad &X^{(\theta)}=\m{Y&Y_\theta\\0&Y} \\ &F_\alpha=E_1^Tf\!\l(X^{(\alpha)}\r)E_2 \qquad &F_\beta=E_1^Tf\!\l(X^{(\beta)}\r)E_2 \qquad &F_\theta=E_1^Tf\!\l(X^{(\theta)}\r)E_2 \\ }$$ and its total differential (aka first-order Taylor expansion) as $$ dF = F_\alpha\,d\alpha +F_\beta\,d\beta +F_\theta\,d\theta $$
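The block-triangular trick is straightforward to demonstrate numerically. A sketch, taking $f=\exp$ via `scipy.linalg.expm` and a hypothetical one-parameter family $Y(\alpha)=A+\alpha B$ (random $A$, $B$, chosen only for illustration):

```python
# Derivative of F(alpha) = expm(Y(alpha)) via the block-triangular trick:
# expm([[Y, Ya], [0, Y]]) has dF/dalpha in its top-right block.
import numpy as np
from scipy.linalg import expm

n = 3
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

Y  = lambda a: A + a * B        # Y(alpha)
Ya = B                          # dY/dalpha

a = 0.4
X = np.block([[Y(a), Ya],
              [np.zeros((n, n)), Y(a)]])
F_a = expm(X)[:n, n:]           # E1^T f(X) E2, i.e. the top-right block

# Compare with a central finite difference of F(alpha) = expm(Y(alpha)).
h = 1e-5
fd = (expm(Y(a + h)) - expm(Y(a - h))) / (2 * h)
print(np.abs(F_a - fd).max())   # small
```

Note that the block evaluation gives the derivative to full working precision in one call to $f$, whereas the finite difference is limited by the step size $h$.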

The above derivation assumes that you have access to a robust algorithm for calculating $f(X)$.
The alternative is to use a BCH expansion, but I find such calculations to be quite awkward.

That's the general case, but for the Pauli matrices in your example the situation is very different. These matrices satisfy special algebraic relations (the square of any such linear combination is a multiple of the identity), so the whole exponential series collapses to a simple linear combination of the identity and the Pauli matrices.
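For the exponent in the question, $M=-x\sigma_3-y\sigma_1-it$, one finds $M^2=(x^2+y^2-t^2)I$, so with $r=\sqrt{x^2+y^2-t^2}$ the series collapses to $e^M=\cosh(r)\,I+\frac{\sinh r}{r}M$ (which turns into $\cos$/$\operatorname{sinc}$ when $r^2<0$). A sketch checking this against the general-purpose matrix exponential, at arbitrarily chosen parameter values:

```python
# Closed form for exp(M) when M^2 = r^2 I, checked against scipy's expm.
import numpy as np
from scipy.linalg import expm

s3 = np.array([[1., 0.], [0., -1.]])   # sigma_3
s1 = np.array([[0., 1.], [1., 0.]])    # sigma_1
J  = np.array([[0., -1.], [1., 0.]])   # the 2x2 "i" (= -i sigma_2)

x, y, t = 0.3, 0.2, 0.1                # illustrative values
M = -x * s3 - y * s1 - t * J           # satisfies M @ M = (x^2+y^2-t^2) I

# Complex sqrt handles r^2 < 0 automatically (cosh/sinh become cos/sinc).
r = np.sqrt(complex(x**2 + y**2 - t**2))
closed = (np.cosh(r) * np.eye(2) + (np.sinh(r) / r) * M).real
print(np.abs(closed - expm(M)).max())  # agreement to machine precision
```

This is the "simple linear combination" the answer refers to: one coefficient on the identity and one shared coefficient on the traceless part.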