Prove that something is a Markov chain


Let $\xi_0, \xi_1, \xi_2, \dots$ be independent, identically distributed, integer-valued random variables. Define $Y_n = \max\{\xi_i : 0 \leq i \leq n\}$. Show that $(Y_n)_{n\geq 0}$ is a Markov chain and find its transition matrix.

First time user here.

I just got into Markov chains and am continually struggling with this type of question. I understand that we have to prove this using the Markov property, by showing that $P(Y_{n+1} = \xi_{n+1} \mid Y_n = \xi_n, \dots, Y_0 = \xi_0) = P(Y_{n+1} = \xi_{n+1} \mid Y_n = \xi_n)$. My thought is to first show that $P(Y_{n+1} = \xi_{n+1} \mid Y_n = \xi_n, \dots, Y_0 = \xi_0) = P(Y_{n+1} = \xi_{n+1})$ by independence, but I don't know how to proceed from there.

Also, will the transition matrix be $P_{ij} = 1/n$?

Help is much appreciated.

There are 2 answers below.

BEST ANSWER

The shortest approach might be to note that $$ Y_{n+1}=\max\{\xi_{n+1},Y_n\}, $$ in particular, $Y_{n+1}$ is $\sigma(\xi_{n+1},Y_n)$-measurable. Since $\xi_{n+1}$ is independent of $(Y_k)_{0\leqslant k\leqslant n}$, this shows that $(Y_n)_{n\geqslant0}$ is a Markov chain.
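Spelled out (a small elaboration of the measurability argument, not in the original answer), the computation is:
$$ \begin{aligned} P(Y_{n+1}=j\mid Y_n=i,\,Y_{n-1}=i_{n-1},\dots,Y_0=i_0) &= P(\max\{\xi_{n+1},i\}=j\mid Y_n=i,\dots,Y_0=i_0)\\ &= P(\max\{\xi_{n+1},i\}=j)\\ &= P(Y_{n+1}=j\mid Y_n=i), \end{aligned} $$
where the second equality uses the independence of $\xi_{n+1}$ from $(Y_0,\dots,Y_n)$: since the right-hand side depends on $i$ only, the Markov property holds.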

The transition probabilities are $$ P_{ij}=P(Y_{n+1}=j\mid Y_{n}=i)=P(\max\{\xi_{n+1},i\}=j)=P(\max\{\xi_0,i\}=j), $$ using that the $\xi_k$ are identically distributed; thus, $$ P_{ii}=P(\xi_0\leqslant i),\qquad P_{ij}=P(\xi_0=j)\ (j\gt i),\qquad P_{ij}=0\ (j\lt i). $$
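As a sanity check (a sketch, not part of the original answer), the transition matrix can be verified by simulation. Here the $\xi_k$ are taken uniform on $\{0,\dots,K-1\}$ (an assumed example distribution), so $P_{ii}=P(\xi_0\leq i)=(i+1)/K$ and $P_{ij}=1/K$ for $j>i$:

```python
import random

random.seed(0)

K = 4            # xi takes each value in {0, ..., K-1} with probability 1/K
TRIALS = 100_000 # single-step transitions simulated per starting state

def p_theory(i, j):
    """Transition matrix from the answer: P_ii = P(xi <= i), P_ij = P(xi = j) for j > i."""
    if j < i:
        return 0.0
    if j == i:
        return (i + 1) / K
    return 1 / K

# For each state i, simulate one step Y_{n+1} = max(i, xi) many times
# and compare the empirical transition frequencies to p_theory.
for i in range(K):
    counts = [0] * K
    for _ in range(TRIALS):
        xi = random.randrange(K)
        counts[max(i, xi)] += 1
    for j in range(K):
        assert abs(counts[j] / TRIALS - p_theory(i, j)) < 0.01
```

Note that each row of $P$ sums to $1$: $P_{ii} + \sum_{j>i} P_{ij} = (i+1)/K + (K-1-i)/K = 1$.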

Second answer:

You are correct in saying you want to prove the Markov property.

First, the $\xi_i$ are random variables. So a statement like $\mathbb{P}(Y_{n+1} = \xi_{n+1}) = \dots$ does not quite make sense here, since the $\xi_i$ are random variables rather than the values they assume.

Here's a hint: replace the $\xi_i$ in your formulation of the Markov property with lower-case letters representing the integer values assumed by the random variables $\xi_i$, e.g. something along the lines of $\mathbb{P}(Y_{n+1} = a_{n+1}) \ \dots$ That might help you make sense of things a bit better, I hope.
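For instance (spelling out the suggested rewrite), the property to verify would read
$$ \mathbb{P}(Y_{n+1}=a_{n+1}\mid Y_n=a_n,\dots,Y_0=a_0)=\mathbb{P}(Y_{n+1}=a_{n+1}\mid Y_n=a_n) $$
for all integers $a_0,\dots,a_{n+1}$ for which the conditioning event has positive probability.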