Expected value of a sequence of random variables in a Markov chain


I have a Markov chain such that $X_n = \max(X_{n-1}+\xi _n,0)$, where the $\xi_n$ are independent and identically distributed. I want to show that if $\mathbb E(\xi_n) > 0$ (where $\mathbb E(\xi_n)$ is the expected value of $\xi_n$), then $\frac{X_n}{n}$ tends to $\mathbb E(\xi_n)$ as $n$ approaches infinity, for any choice of $X_0$.

And... I have no idea how to show this. I know that repeated application of the above will yield $X_n = \max\left(X_0+\sum\limits_{i=1}^n\xi_i, \sum\limits_{i=2}^n\xi_i, \sum\limits_{i=3}^n\xi_i, \dots, 0\right)$, but I'm stuck here.
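Before proving anything, the claim can be checked numerically. Below is a small simulation sketch of the recursion; the step distribution $\xi \sim \mathrm{Uniform}(-1, 2)$, with mean $0.5$, is an arbitrary illustrative choice, not part of the problem:

```python
import random

def simulate(n, x0=0.0, seed=0):
    """Run X_k = max(X_{k-1} + xi_k, 0) for n steps and return X_n / n.

    The step distribution xi ~ Uniform(-1, 2) is a hypothetical choice
    for illustration; its mean E(xi) = 0.5 is positive, as assumed.
    """
    rng = random.Random(seed)
    x = x0
    for _ in range(n):
        x = max(x + rng.uniform(-1.0, 2.0), 0.0)
    return x / n

# X_n / n should be close to E(xi) = 0.5 regardless of X_0
print(simulate(100_000, x0=0.0))
print(simulate(100_000, x0=1000.0))
```

The second call starts the chain far from zero; since $X_0/n \to 0$, the ratio still settles near the same limit.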

There are two answers below.

Answer 1:

If I say $S_n = \sum\limits_{i=1}^n\xi_i$, I have that $X_n = \max(X_0+S_n, S_n-S_1, S_n-S_2, \dots, 0)$ and that

$$\frac{X_n}n = \max\left(\frac{X_0+S_n}n, \frac{S_n-S_1}n, \frac{S_n-S_2}n, \dots, 0\right)$$

So maybe, taking the limit as $n\rightarrow +\infty$, the above gives

$$\lim\limits_{n\rightarrow +\infty}\frac{X_n}n = \lim\limits_{n\rightarrow +\infty}\frac{S_n}n = \mathbb E(\xi_n)$$

where the last equality holds by the law of large numbers, and since the expected value is positive, the term $\frac{X_0+S_n}n$ (which has the same limit as $\frac{S_n}n$) eventually dominates the other entries of that list. Is that reasoning correct?
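The unrolled identity used above, $X_n = \max(X_0+S_n, S_n-S_1, \dots, S_n-S_{n-1}, 0)$, can itself be sanity-checked numerically by comparing it against the one-step recursion on random data. The $\mathrm{Uniform}(-1, 2)$ step distribution here is again an arbitrary illustration:

```python
import random

def x_recursive(xs, x0):
    """X_n via the one-step recursion X_k = max(X_{k-1} + xi_k, 0)."""
    x = x0
    for xi in xs:
        x = max(x + xi, 0.0)
    return x

def x_unrolled(xs, x0):
    """X_n via the closed form max(X_0 + S_n, S_n - S_1, ..., S_n - S_{n-1}, 0),
    where S_k is the k-th partial sum of the xi's (S_0 = 0)."""
    n = len(xs)
    s = [0.0]
    for xi in xs:
        s.append(s[-1] + xi)
    sn = s[n]
    candidates = [x0 + sn] + [sn - s[k] for k in range(1, n)] + [0.0]
    return max(candidates)

rng = random.Random(42)
xs = [rng.uniform(-1.0, 2.0) for _ in range(200)]
# The two computations agree up to floating-point rounding
print(abs(x_recursive(xs, 5.0) - x_unrolled(xs, 5.0)))
```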

Answer 2:

Repeating the relation $X_n \geq X_{n-1} + \xi_n$ gives $X_n \geq X_0 + S_n$, where $S_n = \sum_{i=1}^{n} \xi_i$. So $\frac{X_n}{n} \geq \frac{X_0}{n} + \frac{S_n}{n}$, and the law of large numbers gives $\liminf_{n \rightarrow +\infty} \frac{X_n}{n} \geq \mathbb E(\xi_1)$ almost surely. Since $\mathbb E(\xi_1) > 0$, this ensures that, for almost every $\omega \in \Omega$, $X_n(\omega)$ is strictly positive once $n$ exceeds some $N(\omega)$. So after $N(\omega)$ the iteration actually becomes $X_n = X_{n-1} + \xi_n$, and the result follows.
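The final step can be written out explicitly: for $n > N(\omega)$ the maximum is attained by the first argument, so the recursion telescopes, giving

$$X_n = X_{N(\omega)} + \bigl(S_n - S_{N(\omega)}\bigr), \qquad \frac{X_n}{n} = \frac{X_{N(\omega)} - S_{N(\omega)}}{n} + \frac{S_n}{n} \xrightarrow[n\to+\infty]{} \mathbb E(\xi_1) \quad \text{a.s.},$$

where the first summand vanishes because $N(\omega)$ is fixed, and the second converges by the strong law of large numbers.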