Characterization of a time-reversible Gauss-Markov process.


First of all, I'm from an engineering background, so the underlying mathematics is the hardest part of this exercise for me.

Consider a stochastic process $x:\Omega \times T \to \mathbb{R}^n$ on $T = \mathbb{Z}$ which is Gaussian, stationary, and has a mean value function which is identically zero.

  1. Prove that if the covariance function admits a representation of the form $W(t) = (BD_{+-}B^{-1})^t W(0)$, where $D_{+-} = \begin{pmatrix}I_{n_1} & 0\\0 & -I_{n_2} \end{pmatrix}$, $n_1, n_2 \in \mathbb{N}$, $n_1+n_2 = n$, and $W(0), B \in \mathbb{R}^{n\times n}$ are both nonsingular matrices, then the process is both a Markov process and a time-reversible process.
  2. Prove that, conversely, if the process is Markov and time reversible, then the covariance function admits a representation as described in part 1.
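To get a feel for the representation in part 1, here is a small numerical sketch (with an arbitrarily chosen $B$ and $W(0)$, not taken from the lecture notes) that constructs $P = BD_{+-}B^{-1}$ and the resulting covariance sequence $W(t) = P^t W(0)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n1, n2 = 2, 1
n = n1 + n2

# Hypothetical choices; any nonsingular B and W(0) illustrate the form.
B = rng.standard_normal((n, n)) + n * np.eye(n)  # diagonally shifted -> nonsingular
D = np.diag([1.0] * n1 + [-1.0] * n2)            # the signature matrix D_{+-}
W0 = B @ B.T + n * np.eye(n)                     # symmetric positive definite W(0)

P = B @ D @ np.linalg.inv(B)

def W(t):
    """Covariance W(t) = P^t W(0); since P^2 = I, only the parity of t matters."""
    return W0 if t % 2 == 0 else P @ W0

# D^2 = I implies P^2 = B D^2 B^{-1} = I, so W alternates between W(0) and P W(0).
assert np.allclose(P @ P, np.eye(n))
```

Note that this only illustrates the algebraic form of the representation, not that every such $W$ is a valid covariance function of a stationary Gaussian process.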

I'm still stuck on the first exercise. From different propositions in the textbook/lecture notes (can be found here; careful, it's a document of 500+ pages), I've boiled it down to the following, but I think I'm missing some feeling for the linear algebra.

Since the Gaussian process is stationary, we can use the property derived from proposition C.3.2 as stated on page 398, which says that the covariance function $W$ is \textit{para-symmetric}: $W(t) = W(-t)^T$ for all $t \in T$. Furthermore, by proposition C.3.4 we can conclude that the process as given is time reversible, where the time-reversibility condition implies $W(t) = W(-t)$ for all $t \in T$. This could also be deduced from the fact that $x$ is a Gaussian process whose covariance function is symmetric ($W(t) = W(t)^T$); combining this with the para-symmetry above automatically gives $W(t) = W(-t)^T = W(-t)$, as deduced from proposition C.3.4.

How does the fact that the covariance has the form $(BD_{+-}B^{-1})^t W(0)$ imply that the process is Markov (or, as in the second question, how does the Markov property imply the representation $(BD_{+-}B^{-1})^t W(0)$)?


I think the following proposition (C.3.6) from the book might be useful, but I can't see how:

Let $x : \Omega \times T \to \mathbb{R}^n$ be a Gaussian process with $T = \mathbb{N}$, $x(t) \in G(0, Q_x(t))$, and covariance function $W : T \times T \to \mathbb{R}^{n\times n}$. Assume that $Q_x(t) > 0$ for all $t \in T$. The following statements are equivalent:

  • The process $x$ is a Markov process.
  • The covariance function $W$ satisfies $W(t,s) = W(t,u)W(u,u)^{-1}W(u,s)$ for all $s,u,t \in T$ such that $s<u<t$.
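The factorization in the second bullet can be checked numerically on a process that is known to be Gauss-Markov. A minimal sketch, assuming a scalar stationary AR(1) process with covariance $W(t,s) = a^{|t-s|}q$ (my own illustrative example, not from the lecture notes):

```python
import numpy as np

# Scalar stationary AR(1) (Gauss-Markov) example: W(t, s) = a^{|t-s|} * q.
a, q = 0.7, 2.0

def W(t, s):
    return np.array([[a ** abs(t - s) * q]])

# Proposition C.3.6's factorization, checked for one triple s < u < t:
# W(t,u) W(u,u)^{-1} W(u,s) = a^{t-u} q * q^{-1} * a^{u-s} q = a^{t-s} q = W(t,s)
s, u, t = 1, 3, 6
lhs = W(t, s)
rhs = W(t, u) @ np.linalg.inv(W(u, u)) @ W(u, s)
assert np.allclose(lhs, rhs)
```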

The proofs of all propositions are in the lecture notes, but I have omitted them here for brevity.

*EDIT: In the comments I have found something new.*

On BEST ANSWER

My final answer to the question:

Since the Gaussian process is stationary, we can use the property derived from proposition C.3.2 as stated on page 398, which says that the covariance function $W$ is \textit{para-symmetric}: $W(t) = W(-t)^T$ for all $t \in T$. Furthermore, by proposition C.3.4 we can conclude that the process as given is time reversible, where the time-reversibility condition implies $W(t) = W(-t)$ for all $t \in T$. This could also be deduced from the fact that $x$ is a Gaussian process whose covariance function is symmetric ($W(t) = W(t)^T$); combining this with the para-symmetry above automatically gives $W(t) = W(-t)^T = W(-t)$, as deduced from proposition C.3.4.

Given the properties above, we can deduce that $W(t)$ has some special structure, from which we will show the process is Markov. Write $P = BD_{+-}B^{-1}$.

If a matrix has the form $A = U \tilde{D} U^{-1}$, where $\tilde{D}$ is a signature matrix (of the form $\tilde{D} = \left(\begin{smallmatrix} \pm 1 & 0 &\dotsb & 0\\0 & \pm 1 & \dotsb & 0\\ \vdots & \vdots & \ddots & \vdots\\ 0 & 0 & \dotsb & \pm 1\end{smallmatrix}\right)$), then $\tilde{D}^2 = I$ and hence $A^2 = U\tilde{D}^2U^{-1} = I$: $A$ is involutory. Consequently $A^m = I$ for even powers $m$ and $A^p = A$ for odd powers $p$. In particular our matrix $P$ is involutory, which can also be shown directly in our case:
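The involution property of similarity transforms of a signature matrix is easy to verify numerically. A sketch with an arbitrarily chosen nonsingular $U$ (my own example, not from the lecture notes):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Any A = U D U^{-1} with D a signature matrix squares to the identity.
U = rng.standard_normal((n, n)) + n * np.eye(n)  # diagonally shifted -> nonsingular
D = np.diag([1.0, 1.0, -1.0, -1.0])              # signature matrix
A = U @ D @ np.linalg.inv(U)

assert np.allclose(A @ A, np.eye(n))                          # A^2 = I (involutory)
assert np.allclose(np.linalg.matrix_power(A, 7), A)           # odd powers equal A
assert np.allclose(np.linalg.matrix_power(A, 6), np.eye(n))   # even powers equal I
```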

\begin{equation} \label{eq:Mar1} \begin{array}{lclc} W(t) & = & W(-t) & \Leftrightarrow \\ P^t W(0) & = & P^{-t} W(0) & \text{for } t=1 \\ P W(0) & = & P^{-1} W(0) & \text{with } W(0) \text{ nonsingular} \\ P W(0)W(0)^{-1} & = & P^{-1} W(0)W(0)^{-1} & \Leftrightarrow \\ P & = & P^{-1} & \end{array} \end{equation}

This in turn implies:
\begin{equation} \label{eq:w2} W(2) = P^2 W(0) = P P W(0) = P^{-1} P W(0) = W(0) \end{equation}

For any even $t = m = 2n$, $n \in \mathbb{N}$, the covariance can therefore be split up as:
\begin{equation} W(m) = P^m W(0) = P^{2n} W(0) = \left(P^{-1}P\right)^n W(0) = W(0) \end{equation}

And for any odd $t = p = 2n+1$, $n \in \mathbb{N}$, it can be written as:
\begin{equation} W(p) = P^p W(0) = P P^{2n} W(0) = P \left(P^{-1}P\right)^n W(0) = P W(0) \end{equation}

Now we use the characterization of a Gauss-Markov process in proposition C.3.6. Since the process is stationary, $W(t,s)$ depends only on the difference $t-s$, so we may write $W(t,s) = W(\tilde{t}_{t,s})$ with $\tilde{t}_{t,s} = t-s$, and likewise $W(u,u) = W(0)$, $W(u,s) = W(\tilde{t}_{u,s})$ with $\tilde{t}_{u,s} = u-s$, and $W(t,u) = W(\tilde{t}_{t,u})$ with $\tilde{t}_{t,u} = t-u$. This gives the expression in equation (\ref{eq:pw}), which might be a bit cumbersome to read but verifies the required factorization:
\begin{multline} \label{eq:pw} W(t,s) = W(\tilde{t}_{t,s}) = W(t-s) = W(\tilde{t}_{t,u})W(0)^{-1}W(\tilde{t}_{u,s}) = \\ W(t-u)W(0)^{-1}W(u-s) = P^{t-u}W(0)W(0)^{-1}P^{u-s}W(0) = \\ P^{t-u}P^{u-s}W(0) = P^{t-s}W(0) \end{multline}

In short, $W(t,s) = W(t-s) = P^{t-s}W(0)$ satisfies the factorization $W(t,s) = W(t,u)W(u,u)^{-1}W(u,s)$ of proposition C.3.6, so the process is Markov. Combining all the properties shows that the process given in the question is a stationary, time-reversible Gauss-Markov process.
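The whole argument can be sanity-checked numerically: build $W(t,s) = P^{t-s}W(0)$ from an arbitrary nonsingular $B$ and $W(0)$ (illustrative choices of my own) and verify the Markov factorization of proposition C.3.6 for several triples $s < u < t$:

```python
import numpy as np

rng = np.random.default_rng(2)
n1, n2 = 2, 2
n = n1 + n2

B = rng.standard_normal((n, n)) + n * np.eye(n)  # nonsingular B
D = np.diag([1.0] * n1 + [-1.0] * n2)            # signature matrix D_{+-}
W0 = B @ B.T + n * np.eye(n)                     # nonsingular W(0)
P = B @ D @ np.linalg.inv(B)                     # involutory: P^{-1} = P

def W(t, s):
    # W(t, s) = P^{t-s} W(0); since P^2 = I, only the parity of t-s matters.
    return W0 if (t - s) % 2 == 0 else P @ W0

# Markov factorization W(t,s) = W(t,u) W(u,u)^{-1} W(u,s) for s < u < t:
for (s, u, t) in [(0, 1, 2), (0, 2, 5), (1, 4, 9)]:
    lhs = W(t, s)
    rhs = W(t, u) @ np.linalg.inv(W(u, u)) @ W(u, s)
    assert np.allclose(lhs, rhs)
```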