What does this dynamical system represent?


I know systems like $$\frac{dx}{dt}=Sx$$ where $S$ is a symmetric matrix admit solutions that dilate along the eigendirections of $S$.

And systems like $$\frac{dx}{dt}=Ax$$ where $A$ is a skew-symmetric matrix admit solutions that rotate and keep some quantity, say $\|x\|^2$, invariant along the trajectory.

I want to know: is there any system of the form $$\frac{dx}{dt}=[S,A]x=(SA-AS)x?$$ What does this kind of system represent? Does it have any physical correspondence or characteristic behaviour?

Best answer:

I take it from the overall context of the question that we're working over the real field $\Bbb R$, since it is in the real case that terms like symmetric and skew-symmetric are customarily used. Furthermore, let us assume that the dimension of the underlying vector space upon which $A$ and $S$ operate is $N$, so that $x \in \Bbb R^N$ and $S$, $A$ are real $N \times N$ matrices. These things being said:

Assuming that $S$ is symmetric,

$S^T = S, \tag{1}$

and $A$ is skew-symmetric,

$A^T = -A, \tag{2}$

we have that $[S, A]$ itself is in fact symmetric:

$[S, A]^T = (SA - AS)^T = (SA)^T - (AS)^T = A^TS^T - S^TA^T = -AS + SA = [S, A], \tag{3}$

where use has been made of (1) and (2). Thus the system

$\frac{dx}{dt} = [S, A]x \tag{4}$

falls into the first general category the OP introduced, viz.

$\frac{dx}{dt} = Sx, \tag{5}$

$S$ satisfying (1). As such, the eigenvalues $\lambda_i$ of $[S, A]$ are all real, and there is an orthogonal eigenbasis $e_i$ of $[S, A]$:

$[S, A]e_i = \lambda_i e_i. \tag{6}$

Under these circumstances we may affirm that the general solution of (4) is given by

$x(t) = \sum_{i=1}^N x_i(t_0)e^{\lambda_i(t - t_0)}e_i, \tag{7}$

where

$x(t_0) = \sum_{i=1}^N x_i(t_0)e_i \tag{8}$

is the expansion of $x(t_0)$ in terms of the eigenvectors $e_i$. Here $t_0$ is the initial time. The system (4) will typically be unstable, unless all the $\lambda_i \le 0$, a case we shall address in the following.
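The claims above can be checked numerically. Here is a minimal pure-Python sketch with a concrete, entirely illustrative choice of $S$ and $A$ (not taken from the question): the commutator $C = [S, A]$ comes out symmetric, as (3) predicts, and its eigenvalues are real.

```python
# Minimal numeric sketch (2x2, pure Python): for a symmetric S and a
# skew-symmetric A, the commutator C = SA - AS is itself symmetric,
# so dx/dt = Cx has real eigenvalues. S and A are illustrative choices.

def matmul(P, Q):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(P[i][k] * Q[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(P):
    return [[P[j][i] for j in range(2)] for i in range(2)]

def commutator(P, Q):
    PQ, QP = matmul(P, Q), matmul(Q, P)
    return [[PQ[i][j] - QP[i][j] for j in range(2)] for i in range(2)]

S = [[1, 0], [0, -1]]    # symmetric: S^T = S
A = [[0, 1], [-1, 0]]    # skew-symmetric: A^T = -A

C = commutator(S, A)
print(C)                  # [[0, 2], [2, 0]] -- symmetric, as eq. (3) predicts
print(C == transpose(C))  # True

# Eigenvalues of a 2x2 matrix from the characteristic polynomial:
# lambda = tr/2 +/- sqrt((tr/2)^2 - det); here tr = 0, det = -4.
tr = C[0][0] + C[1][1]
det = C[0][0] * C[1][1] - C[0][1] * C[1][0]
disc = (tr / 2) ** 2 - det
lam = [tr / 2 + disc ** 0.5, tr / 2 - disc ** 0.5]
print(lam)                # [2.0, -2.0] -- real, summing to zero
```

Note that the eigenvalues already sum to zero here, foreshadowing the trace argument below.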

We can in fact say more: it is well known that the trace of any Lie bracket vanishes; this may easily be seen by direct calculation, since

$\text{Tr} (AS) = \sum_{i=1}^{N} \sum_{j=1}^{N} A_{ij} S_{ji} = \sum_{j=1}^{N} \sum_{i=1}^{N} S_{ji} A_{ij} = \text{Tr} (SA), \tag{9}$

whence

$\text{Tr} ([S, A]) = \text{Tr} (SA - AS) = \text{Tr} (SA) - \text{Tr} (AS) = 0. \tag{10}$

Now since

$0 = \text{Tr} ([S, A]) = \sum_{i=1}^{N} \lambda_i, \tag{11}$

we may conclude that, except in the special case $\lambda_i = 0$ for all $i$, some of the eigenvalues will be positive and some will be negative; the origin will be a generalized saddle; the trajectories will for the most part take the form of generalized hyperbolas, heading inwards for a while but then turning outwards and forever moving away from $0$, becoming unbounded as $t \to \infty$. Of course, if all the $\lambda_i = 0$, then $[S, A]$, being symmetric, must itself vanish; $S$ and $A$ commute: $SA = AS$.
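The saddle behaviour can be illustrated with the same hypothetical $2 \times 2$ example as before: $C = [S, A] = \begin{pmatrix} 0 & 2 \\ 2 & 0 \end{pmatrix}$ has eigenpairs $\lambda = +2$ with $e_1 = (1,1)/\sqrt 2$ and $\lambda = -2$ with $e_2 = (1,-1)/\sqrt 2$. Per formula (7), a trajectory starting mostly along the stable direction $e_2$ first heads inward, then turns and escapes along $e_1$:

```python
# Sketch of the generalized-saddle picture for C = [[0, 2], [2, 0]]
# (an illustrative example, not from the answer): exact solution (7)
# with eigenvalues +2 and -2 and orthonormal eigenvectors e1, e2.
import math

def x(t, c1=0.01, c2=1.0):
    """Exact solution x(t) = c1 e^{2t} e1 + c2 e^{-2t} e2 (t0 = 0)."""
    a = c1 * math.exp(2 * t) / math.sqrt(2)   # component along e1 = (1,1)/sqrt(2)
    b = c2 * math.exp(-2 * t) / math.sqrt(2)  # component along e2 = (1,-1)/sqrt(2)
    return (a + b, a - b)

def norm(v):
    return math.hypot(v[0], v[1])

# Sample |x(t)| on t in [0, 3]: the e2 component decays first, then
# the e1 component takes over and the trajectory escapes to infinity.
norms = [norm(x(t / 10)) for t in range(0, 31)]
print(min(norms) < norms[0])   # True: the inward leg of the hyperbola
print(norms[-1] > norms[0])    # True: eventual unbounded growth
```

The minimum of $|x(t)|$ occurs where the two exponential terms balance, which is exactly the turning point of the generalized hyperbola described above.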

I have no idea which physical systems can be modelled on this paradigm, though it would not surprise me if some are.

Beyond this, I cannot say much more about such equations at present.

In closing, I think it is worth observing that, for $N \times N$ real matrices $P$ and $Q$:

$P^T = P, \, Q^T = Q \Rightarrow [P, Q]^T = -[P, Q]; \tag{12}$

$P^T = -P, \, Q^T = -Q \Rightarrow [P, Q]^T = -[P, Q]; \tag{13}$

$P^T= P, \, Q^T =-Q \Rightarrow [P, Q]^T = [P, Q], \tag{14}$

all of which follow easily from the definitions.
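These three identities are also easy to confirm numerically; the following pure-Python sketch checks (12)-(14) on concrete $2 \times 2$ matrices of my own choosing:

```python
# Numeric check of the transpose identities (12)-(14) for the
# commutator [P, Q] = PQ - QP, on illustrative 2x2 matrices.

def matmul(P, Q):
    return [[sum(P[i][k] * Q[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(P):
    return [[P[j][i] for j in range(2)] for i in range(2)]

def neg(P):
    return [[-P[i][j] for j in range(2)] for i in range(2)]

def commutator(P, Q):
    PQ, QP = matmul(P, Q), matmul(Q, P)
    return [[PQ[i][j] - QP[i][j] for j in range(2)] for i in range(2)]

sym1 = [[1, 2], [2, 3]]    # symmetric
sym2 = [[0, 5], [5, -1]]   # symmetric
skew = [[0, 4], [-4, 0]]   # skew-symmetric; note all 2x2 skews commute,
skew2 = [[0, -7], [7, 0]]  # so (13) holds trivially here: [P, Q] = 0

# (12): symmetric with symmetric brackets to a skew-symmetric
print(transpose(commutator(sym1, sym2)) == neg(commutator(sym1, sym2)))   # True
# (13): skew-symmetric with skew-symmetric brackets to a skew-symmetric
print(transpose(commutator(skew, skew2)) == neg(commutator(skew, skew2))) # True
# (14): symmetric with skew-symmetric brackets to a symmetric
print(transpose(commutator(sym1, skew)) == commutator(sym1, skew))        # True
```

(For a non-trivial check of (13) one would need $N \ge 3$, since $2 \times 2$ skew-symmetric matrices all commute.)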

Hope this helps. Cheers, and as always,

Fiat Lux!!!