Stationary solution of a multidimensional Fokker-Planck equation/PDE


Does anyone have a clever method for finding the solution to this PDE? $$0=\frac{\partial}{\partial x}\left((Ax+C(x-y))p(x,y)\right)+\frac{\partial}{\partial y}\left((By+C(y-x))p(x,y)\right)+D\left[\frac{\partial^2}{\partial x^2}+\frac{\partial^2}{\partial y^2}+2E\frac{\partial^2}{\partial x\partial y}\right]p(x,y)$$ for $p(x,y)$ given that $A,B,C,D,E$ are all positive constants with $0<E<1$. It is trivial for $E=0$ but I don't know how to deal with the cross derivative.

Best answer

This is a linear Fokker-Planck equation, so the stationary solution is a Gaussian. Its only parameters, the means (which are zero in this case) and the covariances, can be found by brute force, but the following method (see, for example, van Kampen, page 212) is somewhat easier.

We will change the notation a bit for convenience: $$ \sum_{i,j=1}^2 \left[ \frac{ \partial }{ \partial x_i } (A_{ij} x_j P) + B_{ij} \frac{\partial^2}{\partial x_i \partial x_j} P \right] = 0, \qquad (1) $$ where, in terms of the original constants, $$ \mathbf A = \begin{pmatrix} A + C & -C \\ -C & B + C \end{pmatrix}, \qquad \mathbf B = D \begin{pmatrix} 1 & E \\ E & 1 \end{pmatrix}. $$

Let us first compute the Fourier transform (characteristic function) $$ G(k_1, k_2) = \int_{-\infty}^\infty \int_{-\infty}^\infty P(x_1, x_2) \, \exp\left(\sum_{i=1}^2 i k_i x_i\right) \, dx_1 \, dx_2. $$ Multiplying (1) by $\exp\left(\sum_{i=1}^2 i k_i x_i\right)$ and integrating by parts, we get $$ -\sum_{i,j=1}^2 \left[ k_i A_{ij} \frac{ \partial G } { \partial k_j } + k_i \, k_j \, B_{ij} \, G \right] = 0. $$ At this point, we can impose that the characteristic function is Gaussian, $$ \log G(k_1, k_2) = -\frac{1}{2} \sum_{i,j=1}^2 k_i \Xi_{ij} k_j, $$ where $\Xi_{ij} = \operatorname{cov}(x_i, x_j)$ is the symmetric covariance matrix. (That $\log G$ is a quadratic form can also be readily shown by the method of characteristics.) Substituting gives $$ \sum_{i,j,l=1}^2 k_i \, A_{ij} \, \Xi_{jl} \, k_l = \sum_{i,l} k_i \, B_{il} \, k_l. $$ Since this equation holds for all values of $k_1$ and $k_2$, and $B_{ij}$ is symmetric, we must have $$ \sum_{j = 1}^2 \left( A_{ij} \Xi_{jl} + \Xi_{ij} A_{lj} \right) = 2 \, B_{il}, $$ or in matrix form $$ \mathbf{ A \Xi} + \mathbf{\Xi A}^T = 2 \, \mathbf B. \qquad (2) $$ This is the Lyapunov equation, and its solution in integral form is $$ \mathbf \Xi = 2 \int_0^\infty \exp(-\mathbf A \, t) \, \mathbf B \, \exp(-\mathbf A^T \, t) \, dt. \qquad (3) $$ In our case this integral converges because both eigenvalues of $\mathbf A$ have positive real parts ($\operatorname{tr}\mathbf A = A + B + 2 C > 0$ and $\operatorname{det}\mathbf A = AB+BC+CA > 0$). We can verify (3) directly: $$ \begin{aligned} \mathbf {A \Xi + \Xi A}^T &= -2\int_0^\infty \left[ \frac{d}{dt} \left[\exp(-\mathbf A t) \right]\, \mathbf B \, \exp(-\mathbf A^T t) + \exp(-\mathbf A t) \, \mathbf B \, \frac{d}{dt} \exp(-\mathbf A^T t) \right] \, dt \\ &= -2\exp(-\mathbf A t) \, \mathbf B \, \exp(-\mathbf A^T t) \Bigg|_0^\infty = 2 \, \mathbf B. \end{aligned} $$
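As a concrete sanity check, equation (2) can be solved numerically and compared against the integral form (3). The sketch below uses hypothetical example values $A=1$, $B=2$, $C=0.5$, $D=1$, $E=0.3$ (any positive constants with $0<E<1$ would do) and `scipy.linalg.solve_continuous_lyapunov` as the Lyapunov solver; it is an illustration, not part of the argument.

```python
import numpy as np
from scipy.linalg import expm, solve_continuous_lyapunov

# Hypothetical example values for the positive constants in the question.
A, B, C, D, E = 1.0, 2.0, 0.5, 1.0, 0.3

# Drift and diffusion matrices of equation (1), read off from the original PDE.
Amat = np.array([[A + C, -C],
                 [-C,    B + C]])
Bmat = D * np.array([[1.0, E],
                     [E,   1.0]])

# Solve the Lyapunov equation (2): A Xi + Xi A^T = 2 B.
Xi = solve_continuous_lyapunov(Amat, 2.0 * Bmat)

# Cross-check against the integral form (3) with a trapezoid rule;
# [0, 40] is far beyond the slowest relaxation time of A.
ts = np.linspace(0.0, 40.0, 5001)
vals = np.array([expm(-Amat * t) @ Bmat @ expm(-Amat.T * t) for t in ts])
h = ts[1] - ts[0]
Xi_int = 2.0 * h * (vals[1:-1].sum(axis=0) + 0.5 * (vals[0] + vals[-1]))

print(np.round(Xi, 4))
print(np.round(Xi_int, 4))  # should agree with Xi to quadrature accuracy
```

The two matrices agree, and $\mathbf\Xi$ comes out symmetric and positive definite, as a covariance matrix must.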

In summary, the distribution is a two-dimensional Gaussian, $$ P(\mathbf x) = \frac{1}{2 \pi \sqrt{\operatorname{det} \mathbf \Xi}} \exp\left( - \frac{1}{2} \mathbf x^T \mathbf \Xi^{-1} \mathbf x \right), $$ with the covariance matrix $\mathbf \Xi$ given by (3).
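One can also substitute this Gaussian back into the original PDE and check that the left-hand side vanishes. The sketch below (again with hypothetical constants $A=1$, $B=2$, $C=0.5$, $D=1$, $E=0.3$) solves (2) as a $4\times 4$ linear system via the vec/Kronecker identity and evaluates the PDE by central finite differences:

```python
import numpy as np

# Hypothetical example values for the constants in the question.
A, B, C, D, E = 1.0, 2.0, 0.5, 1.0, 0.3
Amat = np.array([[A + C, -C], [-C, B + C]])
Bmat = D * np.array([[1.0, E], [E, 1.0]])

# Solve (2) as a linear system: vec(A Xi + Xi A^T) = 2 vec(B) (row-major vec).
K = np.kron(Amat, np.eye(2)) + np.kron(np.eye(2), Amat)
Xi = np.linalg.solve(K, 2.0 * Bmat.flatten()).reshape(2, 2)
Xi_inv = np.linalg.inv(Xi)

def p(x, y):
    """Stationary Gaussian density with covariance Xi."""
    v = np.array([x, y])
    norm = 1.0 / (2.0 * np.pi * np.sqrt(np.linalg.det(Xi)))
    return norm * np.exp(-0.5 * v @ Xi_inv @ v)

def residual(x, y, h=1e-4):
    """Left-hand side of the original PDE, by central finite differences."""
    fx = lambda u, v: (A * u + C * (u - v)) * p(u, v)   # x-component of drift flux
    fy = lambda u, v: (B * v + C * (v - u)) * p(u, v)   # y-component of drift flux
    d_fx = (fx(x + h, y) - fx(x - h, y)) / (2 * h)
    d_fy = (fy(x, y + h) - fy(x, y - h)) / (2 * h)
    pxx = (p(x + h, y) - 2 * p(x, y) + p(x - h, y)) / h**2
    pyy = (p(x, y + h) - 2 * p(x, y) + p(x, y - h)) / h**2
    pxy = (p(x + h, y + h) - p(x + h, y - h)
           - p(x - h, y + h) + p(x - h, y - h)) / (4 * h**2)
    return d_fx + d_fy + D * (pxx + pyy + 2 * E * pxy)

for pt in [(0.0, 0.0), (0.5, -0.3), (1.0, 1.0)]:
    print(pt, residual(*pt))  # each residual ~ 0 up to finite-difference error
```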


Edit. There is a simpler solution. Since the solution is a Gaussian centered at zero, we only need to determine the covariance matrix, which can be done as follows.

Multiplying (1) by $x_r x_s$, integrating over $dx_1 \, dx_2$, and integrating by parts (boundary terms vanish at infinity), we get $$ \int_{-\infty}^\infty\int_{-\infty}^\infty \left[ -\sum_{j=1}^2 \left( x_s A_{rj}x_j + x_r A_{sj} x_j \right) +(B_{rs} + B_{sr}) \right]\, P \, dx_1 \, dx_2 = 0. $$ Since $\mathbf B$ is symmetric, this means $$ \sum_{j=1}^2 \left[ A_{rj} \operatorname{cov}(x_j, x_s) + \operatorname{cov}(x_r, x_j) A_{sj} \right] = 2 B_{rs}, $$ which is just the component form of (2). Then the solution (3) follows.
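Since (1) is the Fokker-Planck equation of the linear SDE $d\mathbf x = -\mathbf A \mathbf x \, dt + \boldsymbol\sigma \, d\mathbf W$ with $\boldsymbol\sigma \boldsymbol\sigma^T = 2\mathbf B$, the covariance can also be checked by direct simulation. The sketch below (Euler-Maruyama, same hypothetical constants $A=1$, $B=2$, $C=0.5$, $D=1$, $E=0.3$ as before) compares the empirical stationary covariance of an ensemble of paths with the Lyapunov solution:

```python
import numpy as np

# Hypothetical example values for the constants in the question.
A, B, C, D, E = 1.0, 2.0, 0.5, 1.0, 0.3
Amat = np.array([[A + C, -C], [-C, B + C]])
Bmat = D * np.array([[1.0, E], [E, 1.0]])

# Reference covariance from the Lyapunov equation (2), via the vec trick.
K = np.kron(Amat, np.eye(2)) + np.kron(np.eye(2), Amat)
Xi = np.linalg.solve(K, 2.0 * Bmat.flatten()).reshape(2, 2)

# Euler-Maruyama for dX = -A X dt + sigma dW, with sigma sigma^T = 2 B.
rng = np.random.default_rng(0)
sigma = np.linalg.cholesky(2.0 * Bmat)
n_paths, dt, n_steps = 20_000, 0.005, 2000   # T = 10 >> relaxation time of A
X = np.zeros((n_paths, 2))
for _ in range(n_steps):
    noise = rng.standard_normal((n_paths, 2)) @ sigma.T
    X += -X @ Amat.T * dt + np.sqrt(dt) * noise

emp_cov = np.cov(X, rowvar=False)
print(np.round(emp_cov, 3))
print(np.round(Xi, 3))  # should agree up to sampling and discretization error
```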