Given a Markov chain $X$, how can I prove the following property using the Markov property?
$$P(X_{n+1} = s | X_{n_1} = x_{n_1}, X_{n_2} = x_{n_2}, \ldots, X_{n_k} = x_{n_k}) = P(X_{n+1} = s | X_{n_k} = x_{n_k}), \quad \forall\, 0 \le n_1 < n_2 < \cdots < n_k \le n$$
I can't figure out how to prove it. Any hints?
Proof of property of Markov chain
466 Views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
Just pad the gaps between $X_{n_j}$ and $X_{n_{j + 1}}$, $j = 0, \ldots, k$. Formally, denote the omitted random vector $(X_{n_j + 1}, \ldots, X_{n_{j + 1} - 1})$ by $Y_{n_j}$, $j = 0, \ldots, k$ (with the conventions $n_0 = 0$ and $n_{k + 1} = n$).
By the law of total probability and the Markov property:
\begin{align}
& P[X_{n + 1} = s | X_{n_k} = x_{n_k}, \ldots, X_{n_1} = x_{n_1}] \\
= & \frac{P[X_{n + 1} = s, X_{n_k} = x_{n_k}, \ldots, X_{n_1} = x_{n_1}]}{P[X_{n_k} = x_{n_k}, \ldots, X_{n_1} = x_{n_1}]} \\
= & \frac{\sum_{y_{n_0}, \ldots, y_{n_k}}P[X_{n + 1} = s, Y_{n_k} = y_{n_k}, X_{n_k} = x_{n_k}, \ldots, Y_{n_1} = y_{n_1}, X_{n_1} = x_{n_1}, Y_{n_0} = y_{n_0}]}{\sum_{y_{n_0}, \ldots, y_{n_k}}P[Y_{n_k} = y_{n_k}, X_{n_k} = x_{n_k}, \ldots, Y_{n_1} = y_{n_1}, X_{n_1} = x_{n_1}, Y_{n_0} = y_{n_0}]} \\
= & \frac{\sum_{y_{n_0}, \ldots, y_{n_k}}P[X_{n + 1} = s, Y_{n_k} = y_{n_k}| X_{n_k} = x_{n_k}, \ldots, Y_{n_1} = y_{n_1}, X_{n_1} = x_{n_1}, Y_{n_0} = y_{n_0}]P[X_{n_k} = x_{n_k}, \ldots, Y_{n_1} = y_{n_1}, X_{n_1} = x_{n_1}, Y_{n_0} = y_{n_0}]}{\sum_{y_{n_0}, \ldots, y_{n_k}}P[Y_{n_k} = y_{n_k} | X_{n_k} = x_{n_k}, \ldots, Y_{n_1} = y_{n_1}, X_{n_1} = x_{n_1}, Y_{n_0} = y_{n_0}]P[X_{n_k} = x_{n_k}, \ldots, Y_{n_1} = y_{n_1}, X_{n_1} = x_{n_1}, Y_{n_0} = y_{n_0}]} \\
= & \frac{\sum_{y_{n_0}, \ldots, y_{n_k}}P[X_{n + 1} = s, Y_{n_k} = y_{n_k}| X_{n_k} = x_{n_k}]P[X_{n_k} = x_{n_k}, \ldots, Y_{n_1} = y_{n_1}, X_{n_1} = x_{n_1}, Y_{n_0} = y_{n_0}]}{\sum_{y_{n_0}, \ldots, y_{n_k}}P[Y_{n_k} = y_{n_k} | X_{n_k} = x_{n_k}]P[X_{n_k} = x_{n_k}, \ldots, Y_{n_1} = y_{n_1}, X_{n_1} = x_{n_1}, Y_{n_0} = y_{n_0}]} \quad \text{(Markov property)} \\
= & \frac{\sum_{y_{n_0}, \ldots, y_{n_{k - 1}}}P[X_{n + 1} = s| X_{n_k} = x_{n_k}]P[X_{n_k} = x_{n_k}, \ldots, Y_{n_1} = y_{n_1}, X_{n_1} = x_{n_1}, Y_{n_0} = y_{n_0}]}{\sum_{y_{n_0}, \ldots, y_{n_{k - 1}}}P[X_{n_k} = x_{n_k}, \ldots, Y_{n_1} = y_{n_1}, X_{n_1} = x_{n_1}, Y_{n_0} = y_{n_0}]} \quad \text{(sum over $y_{n_k}$ first)} \\
= & P[X_{n + 1} = s | X_{n_k} = x_{n_k}].
\end{align}
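As a quick numerical sanity check (not part of the proof), the identity can be verified by exact enumeration of path probabilities on a small chain. The 2-state transition matrix `P`, initial distribution `mu`, and the conditioning times below are made-up choices for illustration:

```python
# Exact check of P[X_{n+1}=s | X_{n_1}, X_{n_k}] = P[X_{n+1}=s | X_{n_k}]
# on a hypothetical 2-state chain (mu and P are arbitrary).
from itertools import product

mu = [0.3, 0.7]                      # initial distribution of X_0
P = [[0.9, 0.1], [0.4, 0.6]]         # one-step transition matrix

def path_prob(path):
    """Probability of the full trajectory (x_0, ..., x_m)."""
    p = mu[path[0]]
    for a, b in zip(path, path[1:]):
        p *= P[a][b]
    return p

def joint(constraints, length):
    """P[X_t = x_t for each (t, x_t) in constraints], summing over all paths."""
    return sum(path_prob(path)
               for path in product(range(2), repeat=length)
               if all(path[t] == x for t, x in constraints))

n, s = 4, 0
cond = [(1, 0), (3, 1)]              # condition on X_1 = 0 and X_{n_k} = X_3 = 1
# left-hand side: P[X_5 = s | X_1 = 0, X_3 = 1]
lhs = joint(cond + [(n + 1, s)], n + 2) / joint(cond, n + 2)
# right-hand side: P[X_5 = s | X_3 = 1]
rhs = joint([(3, 1), (n + 1, s)], n + 2) / joint([(3, 1)], n + 2)
print(abs(lhs - rhs) < 1e-12)        # the two conditionals agree
```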
To address your concern, we can prove a more general statement: for any positive integers $m, k$, $$P[X_{m + k} = x_{m + k}, \ldots, X_{m + 1} = x_{m + 1} | X_m = x_m, \ldots, X_0 = x_0] = P[X_{m + k} = x_{m + k}, \ldots, X_{m + 1} = x_{m + 1} | X_m = x_m]. \tag{1}$$
By the multiplication rule and the Markov property, the left-hand side of $(1)$ equals $$P[X_{m + k} = x_{m + k} | X_{m + k - 1} = x_{m + k - 1}] \cdots P[X_{m + 1} = x_{m + 1} | X_m = x_m].$$ Since the right-hand side of $(1)$ likewise factors, by the multiplication rule, into conditionals of the form on the right of $(2)$ below, to prove $(1)$ it suffices to prove $$ P[X_{m + j} = x_{m + j} | X_{m + j - 1} = x_{m + j - 1}] = P[X_{m + j} = x_{m + j} | X_{m + j - 1} = x_{m + j - 1}, \ldots, X_m = x_m] \tag{2}$$ for $j = 1, \ldots, k$.
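The factorization of the left-hand side of $(1)$ into one-step transition probabilities can also be checked by exact enumeration. Again the 2-state chain, the horizon $m, k$, and the fixed path below are arbitrary illustrative choices:

```python
# Check that P[X_{m+k},...,X_{m+1} | X_m,...,X_0] equals the product of
# one-step transitions, on a hypothetical 2-state chain (mu, P are made up).
from itertools import product

mu = [0.5, 0.5]
P = [[0.8, 0.2], [0.3, 0.7]]

def path_prob(path):
    """Probability of the full trajectory (x_0, ..., x_m)."""
    p = mu[path[0]]
    for a, b in zip(path, path[1:]):
        p *= P[a][b]
    return p

m, k = 2, 2
past = (1, 0, 1)                     # x_0, x_1, x_2 (so x_m = 1)
future = (0, 0)                      # x_3, x_4

# P[X_4 = 0, X_3 = 0 | X_2 = 1, X_1 = 0, X_0 = 1]
lhs = path_prob(past + future) / sum(
    path_prob(past + tail) for tail in product(range(2), repeat=k))
# product of one-step transition probabilities x_2 -> x_3 -> x_4
prod = P[1][0] * P[0][0]
print(abs(lhs - prod) < 1e-12)
```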
To prove $(2)$, let $Z$ denote the vector $(X_0, X_1, \ldots, X_{m - 1})$. Again by the law of total probability, the right-hand side of $(2)$ equals:
\begin{align}
& P[X_{m + j} = x_{m + j} | X_{m + j - 1} = x_{m + j - 1}, \ldots, X_m = x_m] \\
= & \sum_{z} P[X_{m + j} = x_{m + j}, Z = z | X_{m + j - 1} = x_{m + j - 1}, \ldots, X_m = x_m] \\
= & \sum_{z} P[Z = z | X_{m + j - 1} = x_{m + j - 1}, \ldots, X_m = x_m]P[X_{m + j} = x_{m + j} | X_{m + j - 1} = x_{m + j - 1}, \ldots, Z = z] \\
= & \sum_{z} P[Z = z | X_{m + j - 1} = x_{m + j - 1}, \ldots, X_m = x_m]P[X_{m + j} = x_{m + j} | X_{m + j - 1} = x_{m + j - 1}] \quad \text{(Markov property)} \\
= & P[X_{m + j} = x_{m + j} | X_{m + j - 1} = x_{m + j - 1}] \sum_z P[Z = z | X_{m + j - 1} = x_{m + j - 1}, \ldots, X_m = x_m] \\
= & P[X_{m + j} = x_{m + j} | X_{m + j - 1} = x_{m + j - 1}].
\end{align}
This completes the proof.
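Property $(2)$ itself admits the same kind of numerical sanity check: conditioning on extra past coordinates does not change the one-step conditional. The chain, the time indices, and the fixed history below are made-up illustrative choices:

```python
# Check (2): P[X_{m+j} | X_{m+j-1}, ..., X_0] = P[X_{m+j} | X_{m+j-1}]
# on a hypothetical 2-state chain (mu and P are arbitrary).
from itertools import product

mu = [0.2, 0.8]
P = [[0.6, 0.4], [0.5, 0.5]]

def path_prob(path):
    """Probability of the full trajectory (x_0, ..., x_m)."""
    p = mu[path[0]]
    for a, b in zip(path, path[1:]):
        p *= P[a][b]
    return p

def joint(constraints, length):
    """P[X_t = x_t for each (t, x_t) in constraints], summing over all paths."""
    return sum(path_prob(path)
               for path in product(range(2), repeat=length)
               if all(path[t] == x for t, x in constraints))

m, j = 2, 1
hist = [(0, 1), (1, 1), (2, 0)]      # X_0 = 1, X_1 = 1, X_{m+j-1} = X_2 = 0
target = (m + j, 1)                  # X_3 = 1
length = m + j + 1
# right-hand side of (2): condition on the whole history up to time m+j-1
rhs = joint(hist + [target], length) / joint(hist, length)
print(abs(rhs - P[0][1]) < 1e-12)    # equals the one-step probability P[0][1]
```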