Writing $\mathbb{P}(X_n - X_{n-1} = 1 \mid X_{n-1} = x)$ as $\mathbb{P}(X_n - x = 1)$?


Suppose we have a Markov chain $\{X_n; n \geq 0\}$ taking values in the state space $S_X = \{0, 1, 2\}$.

I am unsure about which of the following is correct:

\begin{align} \mathbb{P}(X_n - X_{n-1} = 1 \mid X_{n-1} = x) &= \mathbb{P}(X_n - x = 1 \mid X_{n-1} = x) \tag{1} \\ \mathbb{P}(X_n - X_{n-1} = 1 \mid X_{n-1} = x) &= \mathbb{P}(X_n - x = 1) \tag{2} \end{align}

Both seem plausible to me. I suspect $(1)$ is correct, but I cannot justify why; on the other hand, intuitively it seems that once $X_{n-1}$ is replaced by $x$, the condition $X_{n-1} = x$ becomes redundant, which would support $(2)$.

Which equality is correct and how can we give a convincing justification?


Best answer:

\begin{align} \Pr(X_n - X_{n-1} = 1 \mid X_{n-1} = x) &= \Pr(X_n - x = 1 \mid X_{n-1} = x) \tag{1} \\ \Pr(X_n - X_{n-1} = 1 \mid X_{n-1} = x) &= \Pr(X_n - x = 1) \tag{2} \end{align}

That the second one cannot be right is seen by an example: Suppose the transitions $0\to1\to2\to0$ each have probability $0.99$ and the transitions $0\to2\to1\to0$ each have probability $0.01.$

Assuming we start with the stationary distribution, which assigns probability $1/3$ to each of $0,1,2,$ then we have \begin{align} & \Pr( X_n-X_{n-1}=1\mid X_{n-1}=0)=0.99 = \Pr(X_n-0=1\mid X_{n-1}=0) \\[6pt] \text{and } & \Pr( X_n-X_{n-1}=1\mid X_{n-1}=0)=0.99 \ne 1/3 = \Pr(X_n-0=1). \end{align}
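This counterexample can also be checked numerically. Below is a minimal Python sketch (not from the original answer) that encodes the transition probabilities above and computes both sides exactly under the stationary (uniform) distribution; the dictionary layout is my own choice.

```python
# Check of the counterexample: conditional vs. unconditional probability.
# States 0, 1, 2; the "forward" transitions 0->1->2->0 each have probability
# 0.99, the "backward" transitions 0->2->1->0 each have probability 0.01.

P = {0: {1: 0.99, 2: 0.01},
     1: {2: 0.99, 0: 0.01},
     2: {0: 0.99, 1: 0.01}}

pi = {0: 1/3, 1: 1/3, 2: 1/3}  # stationary (uniform) distribution

# Left side: P(X_n - X_{n-1} = 1 | X_{n-1} = 0) = p(0, 1)
conditional = P[0].get(1, 0.0)

# Right side of (2): P(X_n = 1) = sum_i pi(i) * p(i, 1)
marginal = sum(pi[i] * P[i].get(1, 0.0) for i in (0, 1, 2))

print(conditional)  # 0.99
print(marginal)     # approximately 1/3
```

The two numbers disagree, so $(2)$ cannot hold in general.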

Another answer:

The first choice is correct.

Applying the formula for conditional probability $$ P(A{\,{\large{\mid}}\,} B)=\frac{P(A\land B)}{P(B)} $$ we get \begin{align*} & P\Bigl(\bigl(X_n-X_{n-1}=1\bigr) {\,{\large{\mid}}\,} \bigl(X_{n-1}=x\bigr)\Bigr) \\[6pt] =\;& \frac{P\Bigl(\bigl(X_n-X_{n-1}=1\bigr) \land \bigl(X_{n-1}=x\bigr)\Bigr)}{P(X_{n-1}=x)} \\[6pt] =\;& \frac{P\Bigl(\bigl(X_n-x=1\bigr) \land \bigl(X_{n-1}=x\bigr)\Bigr)}{P(X_{n-1}=x)} \\[6pt] =\;& P\Bigl(\bigl(X_n-x=1\bigr) {\,{\large{\mid}}\,} \bigl(X_{n-1}=x\bigr)\Bigr) \end{align*} which validates the first choice. (The middle step is justified because, on the event $X_{n-1}=x$, the events $X_n-X_{n-1}=1$ and $X_n-x=1$ coincide.)
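The substitution step can also be verified by brute-force enumeration. The Python sketch below uses a hypothetical two-step chain (uniform start, uniform step to a different state; my own choice for illustration), builds the joint distribution of $(X_0, X_1)$, and computes both conditional probabilities directly from the definition.

```python
from fractions import Fraction as F
from itertools import product

# Hypothetical example chain: X0 uniform on {0,1,2}; from each state,
# step to each of the other two states with probability 1/2.
p0 = {0: F(1, 3), 1: F(1, 3), 2: F(1, 3)}
p = {(0, 1): F(1, 2), (0, 2): F(1, 2),
     (1, 0): F(1, 2), (1, 2): F(1, 2),
     (2, 0): F(1, 2), (2, 1): F(1, 2)}

# Joint distribution P(X0 = i, X1 = j)
joint = {(i, j): p0[i] * p.get((i, j), F(0))
         for i, j in product(range(3), repeat=2)}

x = 0
pB = sum(pr for (i, j), pr in joint.items() if i == x)  # P(X0 = x)

# P(X1 - X0 = 1 | X0 = x)  vs.  P(X1 - x = 1 | X0 = x)
lhs = sum(pr for (i, j), pr in joint.items() if i == x and j - i == 1) / pB
rhs = sum(pr for (i, j), pr in joint.items() if i == x and j - x == 1) / pB

print(lhs == rhs)  # True: on {X0 = x}, the two events coincide
```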

As an example to show that the second choice can fail, consider the Markov chain with transition matrix $$ \begin{array}{c|c|c|c|} {\vphantom{x_{X_1}}}&0&1&2\\ \hline 0&{\vphantom{\dfrac{x}{X}}}{\large{\frac{1}{2}}}&{\large{\frac{1}{4}}}&{\large{\frac{1}{4}}}\\ \hline 1&{\vphantom{\dfrac{x}{X}}}{\large{\frac{1}{4}}}&{\large{\frac{1}{2}}}&{\large{\frac{1}{4}}}\\ \hline 2&{\vphantom{\dfrac{x}{X}}}{\large{\frac{1}{4}}}&{\large{\frac{1}{4}}}&{\large{\frac{1}{2}}}\\ \hline \end{array} $$ where for $i,j\in \{0,1,2\}$, the entry $p(i,j)$ in row $i$, column $j$ is the probability of a $1$-step transition from state $i$ to state $j$.

Now suppose $X_0$ takes values $0,1,2$ with equal likelihood.

Then for $x=0$ we get \begin{align*} & P\Bigl(\bigl(X_1-X_0=1\bigr){\,{\large{\mid}}\,}\bigl(X_0=0\bigr)\Bigr) \\[4pt] =\;& P\Bigl(\bigl(X_1=1\bigr){\,{\large{\mid}}\,}\bigl(X_0=0\bigr)\Bigr) \\[4pt] =\;& p(0,1) \\[4pt] =\;& {\small{\frac{1}{4}}} \end{align*} whereas \begin{align*} &P\bigl(X_1-x=1\bigr) \\[4pt] =\;& P\bigl(X_1=1\bigr) \\[4pt] =\;& P\bigl(X_0=0\bigr)P\Bigl(\bigl(X_1=1\bigr){\,{\large{\mid}}\,}\bigl(X_0=0\bigr)\Bigr) \\[0pt] &+ P\bigl(X_0=1\bigr)P\Bigl(\bigl(X_1=1\bigr){\,{\large{\mid}}\,}\bigl(X_0=1\bigr)\Bigr) \\[0pt] &+ P\bigl(X_0=2\bigr)P\Bigl(\bigl(X_1=1\bigr){\,{\large{\mid}}\,}\bigl(X_0=2\bigr)\Bigr) \\[4pt] =\;& {\small{\frac{1}{3}}}{\,\cdot\,}p(0,1) + {\small{\frac{1}{3}}}{\,\cdot\,}p(1,1) + {\small{\frac{1}{3}}}{\,\cdot\,}p(2,1) \\[4pt] =\;& \Bigl({\small{\frac{1}{3}}}\Bigr) \Bigl(p(0,1)+p(1,1)+p(2,1)\Bigr) \\[4pt] =\;& \Bigl({\small{\frac{1}{3}}}\Bigr) \Bigl({\small{\frac{1}{4}}}+{\small{\frac{1}{2}}}+{\small{\frac{1}{4}}}\Bigr) \\[4pt] =\;& {\small{\frac{1}{3}}} \end{align*} Since $\frac{1}{4} \ne \frac{1}{3}$, the second choice fails for this chain.
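For a sanity check, the computation above can be reproduced exactly in a few lines of Python (the matrix and initial distribution are taken from the example above; the variable names are my own).

```python
from fractions import Fraction as F

# Transition matrix from the example: p(i, i) = 1/2, p(i, j) = 1/4 for i != j.
P = [[F(1, 2), F(1, 4), F(1, 4)],
     [F(1, 4), F(1, 2), F(1, 4)],
     [F(1, 4), F(1, 4), F(1, 2)]]

pi0 = [F(1, 3)] * 3  # X_0 uniform on {0, 1, 2}

# P(X_1 - X_0 = 1 | X_0 = 0) = p(0, 1)
conditional = P[0][1]

# P(X_1 = 1) = sum_i P(X_0 = i) * p(i, 1)  (law of total probability)
marginal = sum(pi0[i] * P[i][1] for i in range(3))

print(conditional, marginal)  # 1/4 1/3
```

Exact rational arithmetic via `fractions.Fraction` avoids any floating-point rounding, so the mismatch $\frac{1}{4} \ne \frac{1}{3}$ is not a numerical artifact.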