Max of independent and identical random variables is Markov


I'm supposed to show that, given a sequence $\{Y_n\}$ of i.i.d. random variables, the stochastic process $$X_n=\max(Y_0, Y_1,\dots,Y_n)$$ is a Markov chain.

I think I could do it by induction, but I would rather see how it is done using principles of conditional probability, as this would further my understanding.

Any pointers or solutions are much appreciated.

Regards

ZMI

Best answer

The simplest approach: note that $X_0=Y_0$ and, for every $n$, $$X_{n+1}=A(X_n,Y_{n+1}),\qquad A(x,y)=\max\{x,y\}.$$ From this representation, everything follows:

  • Initial distribution: the distribution of $Y_0$
  • Markov property: immediate, since $X_{n+1}$ is a deterministic function of the present state $X_n$ and of a new input $Y_{n+1}$ that is independent of the past $(X_k)_{k\leqslant n}$; indeed, the past $(X_k)_{k\leqslant n}$ depends only on $(Y_k)_{k\leqslant n}$, and $Y_{n+1}$ is independent of $(Y_k)_{k\leqslant n}$.
  • Transition probabilities (assuming the $Y_n$ are discrete): for every $y\gt x$, the transition $x\to y$ has probability $P(Y_{n+1}=y)=P(Y_0=y)$, and the only other transition with positive probability is $x\to x$, with probability $P(Y_{n+1}\leqslant x)=P(Y_0\leqslant x)$.
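
Since the question asks for an argument via conditional probability, here is a sketch of that computation (conditioning on the full past and using the independence of $Y_{n+1}$ from $(Y_k)_{k\leqslant n}$, hence from $(X_k)_{k\leqslant n}$):
$$P(X_{n+1}\leqslant t\mid X_0=x_0,\dots,X_n=x_n)=P(\max(x_n,Y_{n+1})\leqslant t)=\mathbf 1\{x_n\leqslant t\}\,P(Y_0\leqslant t),$$
which depends on the past $(x_0,\dots,x_n)$ only through the present value $x_n$; this is exactly the Markov property.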