Perturbation expansion of matrix Riccati equation


I have a Riccati equation of the following form, where $\Theta$ (and every other non-calligraphic symbol) is a $2\times 2$ matrix: $$\dot \Theta(t) =\alpha \mathcal R (\Theta(t)) + \mathcal L (\Theta(t))$$ where $$\mathcal L (\Theta)=\Theta W^{\dagger }+W\Theta+F $$ and $$\mathcal R (\Theta)=\Theta X\Theta+ \Theta W_x^{\dagger }+W_x\Theta +F_x$$

This technically has a closed-form solution: one can reduce the algebraic version to an eigenvalue–eigenvector problem for the associated Hamiltonian matrix, and from there convert the algebraic solution into a Lyapunov equation, which itself has a closed-form solution. However, the resulting expression is too complicated to be useful (it involves the roots of a fourth-order polynomial), so it is hard to extract any physical insight from it. Fortunately, I don't need the full solution; I only need the solution for small $\alpha$ (more precisely, the first few terms of a perturbation expansion in $\alpha$). We can assume that a steady-state solution exists, and that the case $\alpha=0$ has a simple closed-form solution.
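To make the $\alpha = 0$ case concrete: setting $\alpha = 0$ reduces the equation to $\dot\Theta = \Theta W^\dagger + W\Theta + F$, whose steady state solves the continuous Lyapunov equation $W\Theta + \Theta W^\dagger + F = 0$. A minimal sketch, using hypothetical $2\times 2$ matrices (the specific $W$ and $F$ below are made up for illustration; $W$ is chosen stable so a steady state exists):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical example data: W must be Hurwitz (all eigenvalues in the
# left half-plane) for the alpha = 0 equation to have a steady state.
W = np.array([[-1.0, 0.3],
              [0.0, -2.0]])
F = np.array([[1.0, 0.2],
              [0.2, 0.5]])

# Steady state of  dTheta/dt = Theta W^dag + W Theta + F = 0,
# i.e. the continuous Lyapunov equation  W X + X W^dag = -F.
Theta0_ss = solve_continuous_lyapunov(W, -F)

# The residual should vanish (up to floating-point error).
residual = W @ Theta0_ss + Theta0_ss @ W.conj().T + F
print(np.linalg.norm(residual))
```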

I am also interested in cases where $\Theta$ is larger, for which there probably is no closed-form solution to the non-perturbative problem.

I noticed that there are some perturbative solutions to the algebraic problem. Is there one for the time-dependent equation? And is having the solution for the case $\alpha=0$ enough to solve the general case perturbatively?

Accepted answer:

Let $\Theta(\alpha, t) = \sum_{j=0}^\infty \alpha^j \Theta_j(t)$. Note that
$$\mathcal L(\Theta) = \sum_{j=0}^\infty \alpha^j (\Theta_j W^\dagger + W \Theta_j) + F $$
$$ \mathcal R(\Theta) = \sum_{j=0}^\infty \alpha^j \sum_{k=0}^j \Theta_k X \Theta_{j-k} + \sum_{j=0}^\infty \alpha^j (\Theta_j W_x^\dagger + W_x \Theta_j) + F_x $$

You get the differential equation for $\Theta_j$ by expanding the original differential equation and taking the coefficient of $\alpha^j$. Thus for $j = 0$ we have just
$$ \dot{\Theta}_0 = \mathcal L(\Theta_0) = \Theta_0 W^\dagger + W \Theta_0 + F $$
while for $j = 1$ it's
$$ \dot{\Theta}_1 = \Theta_1 W^\dagger + W \Theta_1 + \Theta_0 X \Theta_0 + \Theta_0 W_x^\dagger + W_x \Theta_0 + F_x$$

In general the differential equation for $\Theta_j$ will be of the form
$$ \dot{\Theta}_j = \Theta_j W^\dagger + W \Theta_j + (\text{a quadratic polynomial in}\ \Theta_0, \ldots, \Theta_{j-1})$$

You didn't mention initial conditions, but presumably you want the initial condition for $\Theta(\alpha,t)$ to be the same as for $\Theta_0$, so $\Theta_j(0) = 0$ for $j \ge 1$.
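The scheme above can be checked numerically: integrate the coupled linear ODEs for $\Theta_0$ and $\Theta_1$ (with $\Theta_1(0) = 0$, per the initial-condition convention just described) and compare $\Theta_0 + \alpha\Theta_1$ against a direct integration of the full Riccati equation. A sketch using hypothetical matrices (all specific values below are made up for illustration; $W$ is chosen stable):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical 2x2 data, not from the question; W is chosen Hurwitz.
W  = np.array([[-1.0, 0.3], [0.0, -2.0]])
F  = np.array([[1.0, 0.2], [0.2, 0.5]])
X  = np.array([[0.1, 0.0], [0.0, 0.2]])
Wx = np.array([[0.2, 0.1], [0.0, 0.3]])
Fx = np.array([[0.3, 0.0], [0.0, 0.1]])

def L(T):
    return T @ W.conj().T + W @ T + F

def R(T):
    return T @ X @ T + T @ Wx.conj().T + Wx @ T + Fx

def full_rhs(t, y, alpha):
    # Full Riccati equation, matrices flattened to length-4 vectors.
    T = y.reshape(2, 2)
    return (alpha * R(T) + L(T)).ravel()

def pert_rhs(t, y):
    # Coupled equations for Theta_0 (order alpha^0) and Theta_1 (order alpha^1).
    T0, T1 = y[:4].reshape(2, 2), y[4:].reshape(2, 2)
    dT0 = L(T0)
    dT1 = (T1 @ W.conj().T + W @ T1
           + T0 @ X @ T0 + T0 @ Wx.conj().T + Wx @ T0 + Fx)
    return np.concatenate([dT0.ravel(), dT1.ravel()])

alpha, tf = 0.05, 3.0
# Theta(0) = 0, hence Theta_0(0) = 0 and Theta_1(0) = 0.
full = solve_ivp(full_rhs, (0, tf), np.zeros(4), args=(alpha,),
                 rtol=1e-10, atol=1e-12)
pert = solve_ivp(pert_rhs, (0, tf), np.zeros(8), rtol=1e-10, atol=1e-12)

Theta_full = full.y[:, -1].reshape(2, 2)
Theta_pert = pert.y[:4, -1].reshape(2, 2) + alpha * pert.y[4:, -1].reshape(2, 2)
err = np.linalg.norm(Theta_full - Theta_pert)
print(err)
```

Since the first neglected term is $\alpha^2 \Theta_2$, halving $\alpha$ should roughly quarter `err`, which is a quick way to confirm the expansion order.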