Is this stochastic process a Markov chain?


I have been struggling for some time with the following question and I feel stuck.

Let $X_n : n= 0,1,\ldots$ be a sequence of iid discrete random variables with $$P(X_n=j)=a_j>0 \qquad \text{ for } j=0,1,2,\ldots$$

Is $Y_n=\min(X_0,X_1,\ldots,X_n)$ a Markov chain?

Any help would be much appreciated.

Best answer:

The sequence $(Y_n)_n$ of running minima is a Markov chain (and so is the sequence $(M_n)_n$ of running maxima). As above, define $$Y_n:=\min\{X_0,X_1,\ldots, X_n\}$$ and observe that, given the values $Y_0=y_0,\ldots, Y_n=y_n$, the distribution of $Y_{n+1}$ depends only on $y_n$, since $$Y_{n+1}=j \iff \min\{X_{n+1}, y_{n}\}=j \iff \begin{cases} y_n>j \text{ and } X_{n+1}=j, \text{ or }\\ y_n=j \text{ and } X_{n+1}\ge j. \end{cases}$$ Because $X_{n+1}$ is independent of $X_0,\ldots,X_n$, the conditional probability depends only on $y_n$: $$\begin{align*}P(Y_{n+1}=j\mid Y_0=y_0, \ldots, Y_{n}=y_{n})&=P(Y_{n+1}=j\mid Y_n=y_n)\\\\&=\begin{cases}P(X_{n+1}=j)=a_j, & \text{ if } 0\le j<y_n, \\ P(X_{n+1}\ge j)=\sum_{i\ge j}a_i, & \text{ if } j=y_n,\\ 0, & \text{ if } j> y_n, \end{cases}\end{align*}$$ since the minimum can never increase. The argument for the $\max$ is analogous; there, the maximum can never decrease.


Independence of the $X_n$ is the essential ingredient here: without it, the past values $Y_0,\ldots,Y_{n-1}$ could carry extra information about $X_{n+1}$, and the Markov property could fail. The identical distribution is what makes the transition probabilities the same at every step, i.e. it makes the chain time-homogeneous.
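The transition probabilities derived above are easy to check by simulation. Here is a minimal sketch in Python, assuming (as a hypothetical choice, not from the question) the geometric law $a_j = 2^{-(j+1)}$ on $\{0,1,2,\ldots\}$, and conditioning on $Y_n = 3$:

```python
import random
from collections import Counter

random.seed(0)

# Hypothetical distribution: a_j = 0.5**(j+1), a geometric law on {0, 1, 2, ...}
def sample_x():
    j = 0
    while random.random() >= 0.5:  # each extra step occurs with probability 1/2
        j += 1
    return j

y = 3            # condition on the current minimum Y_n = 3
N = 200_000
# One step of the chain: Y_{n+1} = min(Y_n, X_{n+1})
counts = Counter(min(y, sample_x()) for _ in range(N))

# Theoretical transition row from the answer:
#   P(Y_{n+1}=j | Y_n=3) = a_j for j < 3, sum_{i>=3} a_i for j = 3, 0 for j > 3
a = lambda j: 0.5 ** (j + 1)
theory = {0: a(0), 1: a(1), 2: a(2), 3: sum(a(i) for i in range(3, 60))}

for j in range(4):
    print(j, round(counts[j] / N, 4), round(theory[j], 4))
```

With these $a_j$ the theoretical row is $(0.5,\ 0.25,\ 0.125,\ 0.125)$, and the empirical frequencies should match it to within sampling error; states $j > 3$ are never visited, reflecting that the minimum cannot increase.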