Need help with a basic exercise about Markov chains


Suppose $\left\{ X_{n}\right\} _{n=1}^{\infty}$ is a Markov chain taking real values. Are the following Markov chains? $$Y_{n}=\sum_{i=1}^{n}X_{i},\qquad Z_{n}=\left(X_{n},X_{n-1}\right)$$ Edit 1: I removed some of the nonsense I wrote previously, as it was quite wrong, as pointed out by Lost1's answer.

Edit 2: I have a follow-up question. Given that $Y_{n}$ is not necessarily a Markov chain, what about $Z_{n}=\left(Y_{n-1},Y_{n}\right)$? My hopefully new-found intuition tells me this should be a Markov chain, since a question about pairs of $Y$ values is equivalent to a question about $X$ values. I tried to formalize it, but I'm not sure whether my reasoning is correct: $$\begin{aligned} &\mathbb{P}\left(Z_{n}=\left(y_{n-1},y_{n}\right)\mid Z_{n-1}=\left(y_{n-2},y_{n-1}\right),Z_{n-2}=\left(y_{n-3},y_{n-2}\right),\dots,Z_{2}=\left(y_{1},y_{2}\right)\right)\\ &\quad=\mathbb{P}\left(Y_{n-1}=y_{n-1},Y_{n}=y_{n}\mid Y_{n-1}=y_{n-1},Y_{n-2}=y_{n-2},\dots,Y_{1}=y_{1}\right)\\ &\quad=\mathbb{P}\left(X_{n}=y_{n}-y_{n-1}\mid X_{n-1}=y_{n-1}-y_{n-2},\dots,X_{2}=y_{2}-y_{1},X_{1}=y_{1}\right)\\ &\quad=\mathbb{P}\left(X_{n}=y_{n}-y_{n-1}\mid X_{n-1}=y_{n-1}-y_{n-2}\right)\\ &\quad\overset{\dagger}{=}\mathbb{P}\left(Y_{n-1}=y_{n-1},Y_{n}=y_{n}\mid Y_{n-1}=y_{n-1},Y_{n-2}=y_{n-2}\right)\\ &\quad=\mathbb{P}\left(Z_{n}=\left(y_{n-1},y_{n}\right)\mid Z_{n-1}=\left(y_{n-2},y_{n-1}\right)\right) \end{aligned}$$ The first two equalities hold because the events in question are equivalent, and the same goes for the last two (this is the part I'm unsure of; in particular, I'm not convinced of the equality marked $\dagger$).

I'd appreciate it if someone could tell me whether what I wrote is correct.

Accepted answer

$Y_n$ is not a Markov Chain, $Z_n$ is a Markov Chain.

Your proof is clearly wrong: you can easily find an example where the Markov chain is not a random walk.

Take, for example, the Markov chain with two states $1\leftrightarrow 2$, where $p_{12}=p_{21}=0.3$ and $p_{11}=p_{22}=0.7$. Basically, if you only know the sum up to time $n$ (assuming $Y_n\geq 2$), you don't know which state you are in, so you have no idea what the transition probability is.
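To see this numerically, here is a quick Monte Carlo sketch (my addition, not part of the original answer) of that two-state chain: conditioning on the same $Y_2$ but different $Y_1$ gives different conditional probabilities for $Y_3$, so $Y_n$ fails the Markov property.

```python
import random

# A Monte Carlo sketch (illustration only) of the two-state chain above:
# states {1, 2}, p12 = p21 = 0.3, p11 = p22 = 0.7.
def step(x, rng):
    """One transition: stay put with probability 0.7, switch otherwise."""
    return x if rng.random() < 0.7 else 3 - x

def estimate(y1, trials=200_000, seed=0):
    """Estimate P(Y_3 = 4 | Y_2 = 3, Y_1 = y1), where Y_n = X_1 + ... + X_n."""
    rng = random.Random(seed)
    n = hits = 0
    for _ in range(trials):
        x1 = rng.choice([1, 2])
        x2 = step(x1, rng)
        x3 = step(x2, rng)
        if x1 == y1 and x1 + x2 == 3:    # condition on (Y_1, Y_2) = (y1, 3)
            n += 1
            hits += (x1 + x2 + x3 == 4)  # the event Y_3 = 4, i.e. X_3 = 1
    return hits / n

# Same Y_2 = 3, different Y_1, different answers: the hidden state matters.
print(estimate(y1=1))  # ≈ 0.3, since (Y_1, Y_2) = (1, 3) forces X_2 = 2
print(estimate(y1=2))  # ≈ 0.7, since (Y_1, Y_2) = (2, 3) forces X_2 = 1
```

The point is exactly the one in the answer: $(Y_1, Y_2)$ reveals the hidden state $X_2$, while $Y_2$ alone does not.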

$Z_n$ is clearly a Markov chain. What $Z_n$ is doing is remembering one extra state; all the exciting things are actually happening in $X_n$. By the Markov property of $X$, conditioning $(X_n,X_{n-1})$ on $(X_{n-1},X_{n-2})$ is the same as conditioning on $X_{n-1}$ alone, and the second component of $Z_n$ is a deterministic function of that. To make this waffle precise:

$$\begin{aligned} P\left(Z_n=(a,b)\mid Z_{n-1}=(b,d)\right) &= P\left(X_n=a,\,X_{n-1}=b\mid X_{n-1}=b,\,X_{n-2}=d\right)\\ &= P\left(X_n=a,\,X_{n-1}=b\mid X_{n-1}=b,\,X_{n-2}=d,\dots,X_0=\dots\right)\\ &= P\left(X_n=a,\,X_{n-1}=b\mid Z_{n-1}=(b,d),\,Z_{n-2}=(\dots),\dots,Z_1=(\dots)\right)\\ &= P\left(Z_n=(a,b)\mid Z_{n-1}=(b,d),\,Z_{n-2}=(\dots),\dots,Z_1=(\dots)\right) \end{aligned}$$ (taking $Z_{n-1}=(b,d)$ rather than $(c,d)$ with $c\neq b$, since in that case both sides are $0$).

Note that I used the Markov property of $X$ and the observation that conditioning on $X_0,\dots,X_{n-1}$ is the same as conditioning on $Z_1,\dots,Z_{n-1}$.
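For contrast with the earlier counterexample, here is a companion sketch (again my addition, for illustration only) on the same two-state chain: conditioning on an earlier pair on top of $Z_3$ does not change the conditional law of $Z_4$, as the Markov property of the pair chain predicts.

```python
import random

# A companion sketch for the pair chain Z_n = (X_n, X_{n-1}) built on the
# same two-state chain: p12 = p21 = 0.3, p11 = p22 = 0.7.
def step(x, rng):
    """One transition: stay put with probability 0.7, switch otherwise."""
    return x if rng.random() < 0.7 else 3 - x

def estimate(x1, trials=200_000, seed=1):
    """Estimate P(Z_4 = (1,2) | Z_3 = (2,1), Z_2 = (1, x1))."""
    rng = random.Random(seed)
    n = hits = 0
    for _ in range(trials):
        xs = [rng.choice([1, 2])]
        for _ in range(3):
            xs.append(step(xs[-1], rng))
        # condition on Z_2 = (X_2, X_1) = (1, x1) and Z_3 = (X_3, X_2) = (2, 1)
        if xs[0] == x1 and xs[1] == 1 and xs[2] == 2:
            n += 1
            hits += (xs[3] == 1)  # the event Z_4 = (X_4, X_3) = (1, 2)
    return hits / n

# Varying the earlier pair Z_2 leaves the conditional law of Z_4 unchanged:
print(estimate(x1=1))  # ≈ 0.3 = p21, regardless of X_1
print(estimate(x1=2))  # ≈ 0.3 = p21, regardless of X_1
```

Both estimates agree with $p_{21}=0.3$, because $Z_3$ already pins down the current state $X_3=2$.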

This is what I would do, continuing from the expression just before the dagger:

$$\begin{aligned} P(X_n=y_n-y_{n-1}\mid X_{n-1}=y_{n-1}-y_{n-2}) &= P(X_n=y_n-y_{n-1}\mid X_{n-1}=y_{n-1}-y_{n-2},\,Y_{n-1}=y_{n-1})\\ &= P(X_n=y_n-y_{n-1}\mid Y_{n-1}=y_{n-1},\,Y_{n-2}=y_{n-2})\\ &= P(X_n=y_n-y_{n-1},\,Y_{n-1}=y_{n-1}\mid Y_{n-1}=y_{n-1},\,Y_{n-2}=y_{n-2})\\ &= P(Y_n=y_n,\,Y_{n-1}=y_{n-1}\mid Y_{n-1}=y_{n-1},\,Y_{n-2}=y_{n-2}) \end{aligned}$$

Note that the first equality holds because $Y_{n-1}$ is a function of $X_1,\dots,X_{n-1}$. I am using a consequence of the Markov property: conditioning on $X_{n-1}$ together with functions of $X_1,\dots,X_{n-1}$ is the same as conditioning on $X_{n-1}$ alone. This is a standard argument if you are familiar with measure theory.