Moments of Markov chains.


I am working with a discrete-time Markov chain $X_0,X_1,\dots$ with state space $\{-1,1\}$ and transition matrix \begin{align} P = \begin{bmatrix} 1-\alpha & \alpha \\ \beta & 1-\beta \end{bmatrix}, \end{align} where $\alpha, \beta \in (0,1)$. This Markov chain has stationary distribution $\left(\pi_{-1}, \pi_{1}\right) = \left( \frac{\beta}{\alpha+\beta}, \frac{\alpha}{\alpha + \beta}\right)$. I am looking for a simple way to calculate moments of the form \begin{align} \mathbb{E} \left[X_i X_j X_k X_l \right] \end{align} under the assumption of stationarity, where $i < j < k < l$. I am aware that these moments can be calculated by conditioning in turn on $X_i$, $X_j$, $X_k$ and $X_l$, but this requires summing 16 terms, each a product of a stationary probability and three multi-step transition probabilities. I was hoping there is a more elegant way to do it.
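For concreteness, the brute-force conditioning computation described above can be sketched as follows (the parameter values $\alpha = 0.3$, $\beta = 0.6$ and the function name `moment` are my own illustrative choices, not from the question):

```python
import numpy as np

# Hypothetical example parameters; any alpha, beta in (0, 1) work.
alpha, beta = 0.3, 0.6

# Transition matrix P over the state space {-1, 1},
# with row/column 0 corresponding to state -1 and row/column 1 to state +1.
P = np.array([[1 - alpha, alpha],
              [beta,      1 - beta]])
states = np.array([-1.0, 1.0])

# Stationary distribution (pi_{-1}, pi_{1}) = (beta, alpha) / (alpha + beta).
pi = np.array([beta, alpha]) / (alpha + beta)

def moment(i, j, k, l):
    """E[X_i X_j X_k X_l] under stationarity, by summing the 16 terms:
    each is a stationary probability times three multi-step transition
    probabilities, times the product of the four state values."""
    assert i < j < k < l
    Pji = np.linalg.matrix_power(P, j - i)  # (j - i)-step transitions
    Pkj = np.linalg.matrix_power(P, k - j)  # (k - j)-step transitions
    Plk = np.linalg.matrix_power(P, l - k)  # (l - k)-step transitions
    total = 0.0
    for a in range(2):
        for b in range(2):
            for c in range(2):
                for d in range(2):
                    total += (pi[a] * Pji[a, b] * Pkj[b, c] * Plk[c, d]
                              * states[a] * states[b] * states[c] * states[d])
    return total
```

Note that the same sum can be written compactly as the matrix product $\pi^\top D P^{j-i} D P^{k-j} D P^{l-k} D \mathbf{1}$ with $D = \operatorname{diag}(-1, 1)$, which hints at one possible route to a closed form via the eigendecomposition of $P$.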