Is the given sequence a Markov chain or not? If yes, find its transition probabilities.


Let $\xi_n$, $n \in \mathbb{Z}_+$, be a sequence of i.i.d. random variables on $\mathbb{R}$ with common density $p(x)$. Consider the sequence $$ \eta_n := \sum\limits_{k=1}^n\left(a\xi_k + b\xi_{k + 2}\right), \quad a, b\in\mathbb{R}. $$

  1. Is it a Markov chain?
  2. If yes, find its transition probabilities.

While trying to solve this problem, I first noticed that $$ \eta_n = \eta_{n - 1} + a\xi_n + b\xi_{n + 2} $$ As far as I understand, to prove the statement I have to show that the Markov property is satisfied (or show the opposite): $$ \mathbb{P}\left(\eta_n = x_n\mid\eta_{n - 1} = x_{n - 1}, \ldots, \eta_1 = x_1\right) = \mathbb{P}\left(\eta_n = x_n\mid\eta_{n - 1} = x_{n - 1}\right) $$ And here I am stuck and need help. The only idea was to rewrite the left-hand side as $$ \mathbb{P}\left(a\xi_n + b\xi_{n + 2} = x_n - x_{n - 1}\mid \eta_{n - 1} = x_{n - 1}, \ldots, \eta_1 = x_1\right) $$ and somehow use the independence of the $\xi_i$. However, I'm confused by these large sums of random variables and I'm not sure this is the correct way to go.

If we assume that the given sequence is a Markov chain, we have to find its transition probabilities. I guess that I can use the fact that $$ p_{c\xi}(x) = \frac{1}{|c|}p_\xi\left(\frac{x}{c}\right), \quad c\in \mathbb{R}\setminus\{0\} $$ and find the transition probabilities as the value of the density of the sum $a\xi_n + b\xi_{n + 2}$ at the corresponding point: \begin{align} \mathbb{P}\left(\eta_n = j\mid \eta_{n - 1} = i\right) &= \mathbb{P}\left(a\xi_n + b\xi_{n+2} = j - i\right) = p_{a\xi_n + b\xi_{n + 2}}(j - i) \\ &= \int\limits_{-\infty}^\infty\!p_{a\xi_n}(j - i - y)\,p_{b\xi_{n+2}}(y)\,dy = \frac{1}{|ab|}\int\limits_{-\infty}^\infty\!p\left(\frac{j - i - y}{a}\right)p\left(\frac{y}{b}\right)\,dy \end{align} Am I right about this?
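As a numerical sanity check of the convolution formula above (not part of the original question), here is a sketch in Python with NumPy, taking $p$ to be the standard normal density so that the exact answer is known in closed form: $a\xi + b\xi'$ is then $N(0, a^2 + b^2)$.

```python
import numpy as np

def p(x):
    # Standard normal density, used as the common density p(x) of the xi's
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

def density_sum(z, a, b):
    # (1/|ab|) * integral of p((z - y)/a) * p(y/b) dy, via a Riemann sum
    y = np.linspace(-20.0, 20.0, 8001)
    dy = y[1] - y[0]
    integrand = p((z - y) / a) * p(y / b)
    return np.sum(integrand) * dy / abs(a * b)

a, b, z = 2.0, -3.0, 1.5
# Exact density of a*xi + b*xi' ~ N(0, a^2 + b^2) at z
s = np.sqrt(a**2 + b**2)
exact = p(z / s) / s
print(density_sum(z, a, b), exact)  # the two values agree closely
```

The check only validates the convolution identity for the density of $a\xi_n + b\xi_{n+2}$; whether that density gives transition probabilities depends on the Markov question itself.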

Accepted answer:

Intuitively, the $\xi_{k+2}$ term makes $\eta_n$ related to $\eta_{n-2}$ in a way that one cannot deduce just by observing $\eta_{n-1}$. Here is an example exploiting this fact to show that $\eta_n$ is not a Markov chain:

  • $a = b = 1$

  • The $\xi_k$ are i.i.d. coin flips taking the values $0$ and $1$ with equal probability $p = 1/2$.

    • I know the OP says there is a density $p_\xi(x)$, so just pick something highly bimodal, e.g. a 50-50 mixture of $U(0,0.01)$ and $U(1, 1.01)$; the rest of the argument still goes through after minor modifications.

First we condition on $\eta_2 = 3$:

  • Since $\eta_2 = \xi_1 + \xi_2 + \xi_3 + \xi_4 = 3$, we know $3$ out of $4$ coin-flips give a $1$, so $P(\xi_3 = 1 \mid \eta_2 = 3) = {3 \over 4}$.

  • Now $\eta_3 = \eta_2 + \xi_3 + \xi_5 = 3 + \xi_3 + \xi_5$, so:

$$P(\eta_3 = 3 \mid \eta_2 = 3) = P(\xi_3 = 0 \mid \eta_2 = 3)P(\xi_5 = 0) = {1 \over 4} \cdot {1 \over 2} = {1\over 8}$$

Next, we condition on the same $\eta_2 = 3$, but also on $\eta_1 = 2$:

  • $\eta_1 = \xi_1 + \xi_3 = 2 \Rightarrow \xi_1 = \xi_3 = 1$.

  • So $\eta_3 = \eta_2 + \xi_3 + \xi_5 = 3 + 1 + \xi_5 \neq 3$, i.e.:

$$P(\eta_3 = 3 \mid \eta_2 = 3, \eta_1 = 2) = 0 \ \ \ \neq \ \ \ P(\eta_3 = 3 \mid \eta_2 = 3) = {1 \over 8}$$

This directly proves $\eta_n$ is not a Markov chain.
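The counterexample above is easy to verify by simulation. A Monte Carlo sketch in Python with NumPy (taking $a = b = 1$ and fair $0/1$ coin flips, as in the argument):

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 200_000
xi = rng.integers(0, 2, size=(n_trials, 5))  # columns: xi_1, ..., xi_5

eta1 = xi[:, 0] + xi[:, 2]          # eta_1 = xi_1 + xi_3
eta2 = eta1 + xi[:, 1] + xi[:, 3]   # eta_2 = eta_1 + xi_2 + xi_4
eta3 = eta2 + xi[:, 2] + xi[:, 4]   # eta_3 = eta_2 + xi_3 + xi_5

cond = eta2 == 3
p_markov = np.mean(eta3[cond] == 3)             # estimate of P(eta_3 = 3 | eta_2 = 3), ~ 1/8
cond_full = cond & (eta1 == 2)
p_full = np.mean(eta3[cond_full] == 3)          # P(eta_3 = 3 | eta_2 = 3, eta_1 = 2), exactly 0
print(p_markov, p_full)
```

The second estimate is exactly $0$ on every run, since $\eta_1 = 2$ forces $\xi_3 = 1$ and hence $\eta_3 \geq 4$, while the first concentrates around $1/8$.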

Aside: In fact, I think even if we allow "remembering" a finite number $m$ of $\eta$'s, i.e. $\theta_n = (\eta_n, \eta_{n+1}, \dots, \eta_{n+m-1})$, the $\theta_n$'s still do not form a Markov chain because there is still information in $\theta_{n-2}$ that is not available in $\theta_{n-1}$. Similarly if we just take the even (or odd) indices $\phi_m = \eta_{2m}$ the subsequence $\phi$ also does not form a Markov chain.


For general $a,b,\xi_k$, it might be easier to show that the conditional expectations are different. Let me take the liberty of assuming that $E[\xi_k] = \gamma$ exists (is finite). Then $\eta_3 = \eta_2 + a\xi_3 + b\xi_5$ implies:

  • $E[\eta_3 \mid \eta_2 = x] = x + b\gamma + aE[\xi_3 \mid \eta_2 = x]$

  • $E[\eta_3 \mid \eta_2 = x, \eta_1 = y] = x + b\gamma + aE[\xi_3 \mid \eta_2 = x, \eta_1 = y] = x + b\gamma + aE[\xi_3 \mid \eta_1 = y]$

    • because $(\eta_2 = x, \eta_1 = y) \equiv (a\xi_2 + b\xi_4 = x-y, \eta_1 = y)$ and $\xi_3$ is independent of $\xi_2, \xi_4$.

So (assuming $a \neq 0$) the two conditional expectations are different (proving $\eta_n$ is not a Markov chain) iff $E[\xi_3 \mid \eta_2 = x] \neq E[\xi_3 \mid \eta_1 = y]$. I'm not sure how to show that in general, but for a specific $p_{\xi}(x)$ it should be easy to pick one example where they are unequal.

(In particular if $a=b$ then due to the i.i.d. property of the $\xi_k$'s we have $E[\xi_3 \mid \eta_2 = x] = {x\over 4a}$ while $E[\xi_3 \mid \eta_1 = y] = {y \over 2a}$.)
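These closed forms can also be checked by simulation. A Monte Carlo sketch (not in the original answer) for $a = b = 1$ with fair $0/1$ coin flips, verifying $E[\xi_3 \mid \eta_2 = x] = x/4$ and $E[\xi_3 \mid \eta_1 = y] = y/2$ at $x = 3$, $y = 1$:

```python
import numpy as np

rng = np.random.default_rng(1)
xi = rng.integers(0, 2, size=(200_000, 4))  # columns: xi_1, xi_2, xi_3, xi_4
eta1 = xi[:, 0] + xi[:, 2]                  # eta_1 = xi_1 + xi_3
eta2 = eta1 + xi[:, 1] + xi[:, 3]           # eta_2 = eta_1 + xi_2 + xi_4

e_given_eta2 = np.mean(xi[eta2 == 3, 2])    # E[xi_3 | eta_2 = 3], expect ~ 3/4
e_given_eta1 = np.mean(xi[eta1 == 1, 2])    # E[xi_3 | eta_1 = 1], expect ~ 1/2
print(e_given_eta2, e_given_eta1)
```

Since $3/4 \neq 1/2$, this exhibits concrete values $x, y$ at which the two conditional expectations disagree, confirming the non-Markov conclusion for this choice of $a, b, p_\xi$.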