Does the joint distribution of the solution of a linear SDE with its delayed self have a PDF? How to compute their mutual entropy?


Let $Z(t)$ be the solution of a linear SDE $$ dZ_t = AZ_t\,dt + \sigma\, dB_t $$ with $E(Z_t)=0$. Then $Z_t$ is Gaussian with probability density function $$ p(Z_t)= \frac{1}{(2 \pi)^{n/2}\sqrt{\det \Sigma(t)}}\exp\left(-\tfrac{1}{2} Z^T(t)\Sigma(t)^{-1} Z(t)\right), $$ where $\Sigma(t)$ is the covariance matrix (it can be obtained from $\sigma$ and the spectrum of $A$). Thermodynamic quantities of this object, such as entropy, can be computed directly from $\Sigma$.
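As a concrete numerical illustration (the matrices $A$ and $\sigma$ below are made-up stable examples, not from the question): the stationary covariance solves the Lyapunov equation $A\Sigma + \Sigma A^T + \sigma\sigma^T = 0$, and the Gaussian differential entropy is $\tfrac{1}{2}\log\!\left((2\pi e)^n \det\Sigma\right)$.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Made-up 2-D stable drift matrix and noise amplitude (illustrative only).
A = np.array([[-1.0, 0.5],
              [-0.5, -1.0]])
sigma = np.diag([0.3, 0.2])

# Stationary covariance: solve A @ Sigma + Sigma @ A.T = -sigma @ sigma.T
Sigma = solve_continuous_lyapunov(A, -sigma @ sigma.T)

# Differential entropy of an n-dimensional Gaussian:
#   H = (1/2) * log((2 pi e)^n * det(Sigma))
n = Sigma.shape[0]
entropy = 0.5 * np.log((2 * np.pi * np.e) ** n * np.linalg.det(Sigma))
print(Sigma)
print(entropy)
```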

Now one can also compute the delayed covariance matrix $C(\tau)=\langle Z(t+\tau)Z(t)^T\rangle$ as $e^{A \tau}\Sigma(t)$. I am wondering at this point whether the probability density function of the joint variable $(Z(t+\tau),Z(t))$ exists, and whether this joint variable is Gaussian ($Z(t+\tau)$ and $Z(t)$ are obviously not independent, but stacked together they are also the solution of a linear SDE of twice the dimension...). The next question is then: how can one compute the mutual entropy of $Z(t+\tau)$ and $Z(t)$ based only on $\Sigma(t)$ and $C(\tau)$? Some people seem to be doing this assuming that $(Z(t+\tau),Z(t))$ has a PDF, and sometimes even assuming that it is Gaussian with cross-covariance $C(\tau)$. Is it possible to show this formally?
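For reference, if one accepts joint Gaussianity, the mutual entropy (mutual information) follows from $\Sigma$ and $C(\tau)$ alone via determinants. A sketch with a made-up stable 2-D system (values chosen for illustration), assuming stationarity so both marginals have covariance $\Sigma$:

```python
import numpy as np
from scipy.linalg import expm, solve_continuous_lyapunov

# Made-up 2-D stable system (illustrative values).
A = np.array([[-1.0, 0.5],
              [-0.5, -1.0]])
sigma = np.diag([0.3, 0.2])
Sigma = solve_continuous_lyapunov(A, -sigma @ sigma.T)  # stationary covariance
tau = 0.7
C = expm(A * tau) @ Sigma                               # delayed covariance C(tau)

# Joint covariance of (Z(t), Z(t+tau)), assuming stationarity:
#   [[ Sigma, C^T ],
#    [ C,     Sigma ]]
joint = np.block([[Sigma, C.T],
                  [C, Sigma]])

# Mutual information of two jointly Gaussian vectors:
#   I = (1/2) * log( det(Sigma)^2 / det(joint) )
I = 0.5 * np.log(np.linalg.det(Sigma) ** 2 / np.linalg.det(joint))
print(I)
```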

On BEST ANSWER

You can write the random variable $\mathbf{Z}(t)$ as
$$ \begin{align*} \mathbf{Z}(t) = e^{\mathbf{A}(t-t_0)}\mathbf{z}_{t_0} + \int_{t_0}^{t}e^{\mathbf{A}(t-s)}\Gamma(s)\,ds, \end{align*} $$
where $\Gamma(s)$ is the underlying white-noise process (with $\sigma$ absorbed into it). The same is true of $\mathbf{Z}(t+\tau)$, and indeed we have
$$ \begin{align*} \mathbf{Z}(t+\tau) &= e^{\mathbf{A}(t+\tau - t_0) }\mathbf{z}_{t_0}+\int_{t_0}^{t+\tau}e^{\mathbf{A}(t+\tau-s)}\Gamma(s)\,ds \\ &= e^{\mathbf{A}\tau }\mathbf{Z}(t) +\int_{t}^{t+\tau} e^{\mathbf{A}(t + \tau-s)}\Gamma(s)\,ds. \end{align*} $$
So we see that $\mathbf{Z}(t)$ is given by the linear transformation acting on functions
\begin{align*} f(t) \mapsto e^{\mathbf{A}(t-t_0)}\mathbf{z}_{t_0} + \int_{t_0}^{t}e^{\mathbf{A}(t-s)}f(s)\,ds, \end{align*}
and since $\mathbf{Z}(t)$ and $\mathbf{Z}(t+\tau)$ are both linear transformations of a common Gaussian process, $\Gamma(s)$, it follows that they themselves have a joint multivariate Gaussian distribution.

Now all that remains is to calculate the parameters of this distribution. The mean and variance are straightforward, and, using that the noise on $(t, t+\tau)$ has mean zero and is independent of $\mathbf{Z}(t)$, we also have
\begin{align*} \mathbb{E}\left[ \mathbf{Z}(t+\tau) \mathbf{Z}(t)^{T} \right] &= \mathbb{E}\left[ e^{\mathbf{A}\tau} \mathbf{Z}(t) \mathbf{Z}(t)^T \right] + \mathbb{E}\left[ \int_{t}^{t+\tau}e^{\mathbf{A}(t+\tau - s)} \Gamma(s)\, ds \cdot \mathbf{Z}(t)^T\right] \\ &= e^{\mathbf{A}\tau} \mathbb{E}\left[ \mathbf{Z}(t)\mathbf{Z}(t)^T \right] + \underbrace{ \mathbb{E} \left[\int_{t}^{t+\tau} e^{\mathbf{A}(t+\tau - s)}\Gamma(s)\,ds \right]}_{=0} \cdot \mathbb{E}\left[ \mathbf{Z}(t)^T \right] \\ &= e^{\mathbf{A}\tau}\mathbb{E}\left[ \mathbf{Z}(t)\mathbf{Z}(t)^T \right]. \end{align*}
Now you can use these results to calculate the mutual entropy, or any other functional you desire.
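The identity $\mathbb{E}[\mathbf{Z}(t+\tau)\mathbf{Z}(t)^T] = e^{\mathbf{A}\tau}\Sigma$ can be sanity-checked by Monte Carlo. A scalar sketch with made-up parameters ($a<0$, Euler–Maruyama discretization, paths started in the stationary law):

```python
import numpy as np

# Scalar OU process dZ = a Z dt + s dB, with illustrative values a < 0.
rng = np.random.default_rng(0)
a, s = -1.0, 0.5
dt, tau = 1e-3, 0.5
n_paths = 200_000
n_steps = int(tau / dt)

# Start all paths in the stationary law N(0, Sigma), Sigma = -s^2 / (2a).
Sigma = -s**2 / (2 * a)
Z = rng.normal(0.0, np.sqrt(Sigma), n_paths)
Zt = Z.copy()                      # snapshot at time t
for _ in range(n_steps):           # evolve by tau with Euler-Maruyama
    Z = Z + a * Z * dt + s * np.sqrt(dt) * rng.normal(size=n_paths)

# Empirical lagged covariance vs the closed form e^{a tau} * Sigma.
C_hat = np.mean(Z * Zt)
C_exact = np.exp(a * tau) * Sigma
print(C_hat, C_exact)
```

With 200k paths the sampling error is a few times $10^{-4}$, well below the size of $C(\tau)$ itself.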


To expand on the comments: defining the random variable $$ \mathbf{V} = \int_{t}^{t+\tau} e^{\mathbf{A}(t+\tau - s)}\Gamma(s)\, ds, $$ we have $$ \begin{bmatrix} \mathbf{Z}(t) \\ \mathbf{Z}(t+\tau) \end{bmatrix} = \begin{bmatrix} \mathbf{I} & \mathbf{0} \\ e^{\mathbf{A}\tau} & \mathbf{I} \end{bmatrix} \begin{bmatrix} \mathbf{Z}(t) \\ \mathbf{V} \end{bmatrix}. $$ Now it remains to convince yourself that $\mathbf{Z}(t)$ and $\mathbf{V}$ are themselves jointly Gaussian, which I will leave to you. Maybe it will help to declutter things slightly and consider just the univariate case, with initial condition $z_0 = 0$. Writing the finite approximation $Z_n(t)$ to $Z(t)$ as $$ Z_n(t) = \sum_{t_k < t} e^{a (t - t_k )} \Delta B_{t_k}, $$ where $\Delta B_{t_k} = B(t_{k+1}) - B( t_k )$ is the Brownian-motion increment process, then I hope it seems at least plausible that $$ \begin{align*} Z_n(t+\tau) &= \sum_{t_k < t + \tau} e^{a (t + \tau -t_k) } \Delta B_{t_k} \\ &= e^{a \tau} \left( \sum_{t_i < t } e^{a (t - t_i )}\Delta B_{t_i} + \sum_{t < t_j < t + \tau} e^{a (t - t_j) }\Delta B_{t_j} \right) \\ &= e^{a \tau} \left( Z_{n}(t) + \sum_{t < t_j < t + \tau} e^{a (t - t_j) }\Delta B_{t_j} \right) \end{align*} $$ is suggestive of $Z(t+\tau)$ and $Z(t)$ being jointly Gaussian.
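The block linear map can also be verified numerically: since $\mathbf{V}$ is independent of $\mathbf{Z}(t)$ and (in stationarity) $\operatorname{Cov}(\mathbf{V}) = \Sigma - e^{\mathbf{A}\tau}\Sigma e^{\mathbf{A}^T\tau}$, pushing the covariance of $(\mathbf{Z}(t), \mathbf{V})$ through the block matrix should reproduce the joint covariance with off-diagonal block $e^{\mathbf{A}\tau}\Sigma$. A sketch with a made-up 2-D stable system:

```python
import numpy as np
from scipy.linalg import expm, solve_continuous_lyapunov

# Made-up 2-D stable system (illustrative values).
A = np.array([[-1.0, 0.5],
              [-0.5, -1.0]])
sigma = np.diag([0.3, 0.2])
Sigma = solve_continuous_lyapunov(A, -sigma @ sigma.T)
tau = 0.7
E = expm(A * tau)
n = A.shape[0]

# V is independent of Z(t); in stationarity Cov(V) = Sigma - E Sigma E^T,
# so that Var(Z(t+tau)) = E Sigma E^T + Cov(V) = Sigma.
Cov_V = Sigma - E @ Sigma @ E.T

# Push Cov of (Z(t), V) through the block matrix [[I, 0], [e^{A tau}, I]].
M = np.block([[np.eye(n), np.zeros((n, n))],
              [E, np.eye(n)]])
Cov_in = np.block([[Sigma, np.zeros((n, n))],
                   [np.zeros((n, n)), Cov_V]])
joint = M @ Cov_in @ M.T

# Expected blocks: top-left Sigma, bottom-left E Sigma, bottom-right Sigma.
print(np.allclose(joint[:n, :n], Sigma),
      np.allclose(joint[n:, :n], E @ Sigma),
      np.allclose(joint[n:, n:], Sigma))
```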