Stochastic Processes Question


Give an example of a stochastic process $X_{n}$ that is not a Markov chain, such that $P_{y}(N(y)=\infty)=0$ but $E_{y}N(y)=\infty$, where $N(y)$ denotes the number of visits of the process to the state $y$.

Here is an example related to your question.

Consider some i.i.d. positive integer valued random variables $(Y_x)_{x\geqslant1}$. Let $Z_0=0$ and $Z_x=Y_1+\cdots+Y_x$ for every $x\geqslant1$. Fix the initial state $X_0\geqslant0$ and, for every $n\geqslant0$, let $X_n=\max\{x\geqslant0\mid Z_x\leqslant n+Z_{X_0}\}$.

In words, $X_{n+1}$ is almost surely either $X_n$ or $X_n+1$, and the process $(X_n)_{n\geqslant0}$ stays at each state $x\geqslant X_0$ during exactly $Y_{x+1}$ time steps.
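This construction is easy to simulate. Here is a minimal Python sketch (the function name `simulate` and the 0-indexed list of holding times are my own choices for illustration):

```python
def simulate(Y, n_steps, x0=0):
    """Sample path X_0, ..., X_{n_steps} of the process built from the
    holding times Y = [Y_1, Y_2, ...] (a 0-indexed list, so Y[x-1]
    plays the role of Y_x).

    Implements X_n = max{x >= 0 : Z_x <= n + Z_{X_0}} with Z_0 = 0
    and Z_x = Y_1 + ... + Y_x.  Y must be long enough that
    Z_{len(Y)} > n_steps + Z_{x0}.
    """
    # Partial sums Z_0, Z_1, ..., Z_{len(Y)}.
    Z = [0]
    for y in Y:
        Z.append(Z[-1] + y)
    shift = Z[x0]  # Z_{X_0}
    path = []
    for n in range(n_steps + 1):
        # Largest x with Z_x <= n + shift; Z is strictly increasing.
        x = max(i for i, z in enumerate(Z) if z <= n + shift)
        path.append(x)
    return path


# With holding times Y_1 = 2, Y_2 = 1, Y_3 = 3, the path spends
# 2 steps at state 0, 1 step at state 1, then 3 steps at state 2.
print(simulate([2, 1, 3], 5))  # [0, 0, 1, 2, 2, 2]
```

On such finite paths one can check directly that the time spent at state $x$ is $Y_{x+1}$.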

The following holds:

  • The number of visits to each state $x\geqslant X_0$ is $Y_{x+1}$.
  • The number of visits to each state $x\geqslant X_0$ is almost surely finite, since $Y_{x+1}$ is a positive integer.
  • The number of visits to each state $x\geqslant X_0$ is integrable iff $Y_1$ is integrable.
  • The process $(X_n)_{n\geqslant0}$ is a Markov chain iff $Y_1$ has a geometric distribution, since geometric holding times are exactly the memoryless ones.

Consequently, starting from $X_0=y$, one has $N(y)=Y_{y+1}$, so any choice of $Y_1$ that is almost surely finite, nonintegrable and not geometric (for example $P(Y_1=k)=\frac{1}{k(k-1)}$ for every $k\geqslant2$) produces a non-Markov process with $P_y(N(y)=\infty)=0$ and $E_yN(y)=\infty$, as requested.
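The bullet points suggest a concrete recipe: pick holding times that are almost surely finite, nonintegrable, and not geometric. Below is a minimal Python sketch, assuming the illustrative choice $P(Y=k)=\frac{1}{k(k-1)}$ for $k\geqslant2$, sampled by inverse transform as $\lceil 1/U\rceil$ (the function name and the particular distribution are my own, not from the answer):

```python
import math
import random

def sample_holding_time(rng):
    """Sample Y = ceil(1/U) with U uniform on (0, 1], so that
    P(Y = k) = 1/(k(k-1)) for k >= 2.  Each sample is a finite
    positive integer, yet E[Y] = sum_{k>=2} 1/(k-1) diverges."""
    u = 1.0 - rng.random()  # uniform on (0, 1]; avoids division by zero
    return math.ceil(1.0 / u)

rng = random.Random(0)
samples = [sample_holding_time(rng) for _ in range(10_000)]

# Every visit count N(y) = Y_{y+1} drawn this way is finite, but the
# empirical mean keeps growing with the sample size, reflecting
# E_y N(y) = infinity.
print(min(samples), max(samples))
```

Since this distribution is not geometric, the holding times are not memoryless, so the resulting $(X_n)_{n\geqslant0}$ is not a Markov chain, which is exactly the combination the question asks for.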