Example of a stochastic non-Markov process?


A Markov process is defined as: $$P(X_t| X_{1:t-1}) = P(X_t|X_{t-1})$$

Is there a non-Markov process that can be generated by a computer and cannot be converted to a Markov process by a change of variable?

I mean that if $X_t$ depends on all previous values, the process cannot be generated by a computer, since generating $X_t$ may require an unboundedly large number of past values as $t$ grows.

Also, if, for example, $X_t$ depends only on the previous 2 values, we can define a new variable $Y_t = (X_t, X_{t-1})$ that is Markov.
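As a concrete illustration of this pairing trick, here is a minimal Python sketch. The specific transition probabilities (0.8 and 0.3) and the function names are my own arbitrary choices for illustration: it generates an order-2 binary process and forms the paired variable $Y_t = (X_t, X_{t-1})$, whose next value depends only on the current pair.

```python
import random

# Hypothetical order-2 process: X_t depends on the previous two values.
# The probabilities 0.8 / 0.3 are arbitrary, chosen only for illustration.
def step(x_prev1, x_prev2):
    p = 0.8 if x_prev1 == x_prev2 else 0.3
    return 1 if random.random() < p else 0

def generate(n, seed=0):
    random.seed(seed)
    xs = [0, 1]  # arbitrary initial values
    for _ in range(n - 2):
        xs.append(step(xs[-1], xs[-2]))
    # The paired variable Y_t = (X_t, X_{t-1}) is Markov: the next pair
    # (X_{t+1}, X_t) is determined by X_{t+1}, which depends only on the
    # current pair, plus a shift of its first component.
    ys = list(zip(xs[1:], xs[:-1]))
    return xs, ys

xs, ys = generate(10)
print(xs)
print(ys)
```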


There are 2 solutions below.

BEST ANSWER

Consider a random walk on a line that goes either left or right with probability $q$ and $1-q$ respectively for the first move. For its $n$th move, the walker looks at its previous $n-1$ steps (each step being either left or right) and uniformly at random picks one of them. In its $n$th step, it then repeats the randomly chosen step from its memory with probability $p$, or does the opposite with probability $1-p$.

This process is called the Elephant Random Walk and is one of the few exactly solvable non-Markovian processes. Its large-$t$ behavior is known both analytically and numerically.

Reference: https://arxiv.org/abs/cond-mat/0406593
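The walk described above is easy to simulate. A minimal Python sketch, under the assumptions that steps are encoded as $\pm 1$ and that $p$, $q$ have the meanings given in the description (the function name and default parameter values are my own choices):

```python
import random

def elephant_walk(n_steps, p=0.75, q=0.5, seed=0):
    """Simulate the Elephant Random Walk.

    The first step is +1 with probability q, -1 otherwise. Each later step
    picks a uniformly random earlier step from memory and repeats it with
    probability p, or takes the opposite step with probability 1 - p.
    Returns the walker's position after each step.
    """
    rng = random.Random(seed)
    steps = [1 if rng.random() < q else -1]
    for _ in range(n_steps - 1):
        remembered = rng.choice(steps)  # uniform over all previous steps
        s = remembered if rng.random() < p else -remembered
        steps.append(s)
    # Accumulate positions from the step sequence.
    positions, pos = [], 0
    for s in steps:
        pos += s
        positions.append(pos)
    return positions

print(elephant_walk(20))
```

Note that the memory cost grows linearly in $t$: the walker must keep its entire step history, which is exactly what makes the process non-Markovian in the position variable.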

SECOND ANSWER

One such process might be a sequence $X_0, X_1, \dots$ of bits in which $X_n$ is distributed as Bernoulli($0.75$) if $X_0 + X_1 + \cdots + X_{n-1} = 0$ in $\mathbb{F}_2$ and as Bernoulli($0.25$) otherwise (and this is the only dependence). It is clearly not Markov, since the distribution of $X_n$ depends on the whole history of the process. On the other hand, it is implementable on a computer, since you only have to keep one bit of information: the running total of the $X_i$ mod $2$.
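A minimal Python sketch of this construction (the function name and seeding are my own choices): only the running parity bit is stored, never the full history.

```python
import random

def parity_process(n, seed=0):
    """Generate X_0, ..., X_{n-1} where X_k ~ Bernoulli(0.75) if the
    parity of X_0 + ... + X_{k-1} is 0 (the empty sum counts as 0),
    and X_k ~ Bernoulli(0.25) otherwise.

    Although the distribution of X_k formally depends on the entire
    history, a single bit of state (the running parity) suffices.
    """
    rng = random.Random(seed)
    parity = 0
    xs = []
    for _ in range(n):
        p = 0.75 if parity == 0 else 0.25
        x = 1 if rng.random() < p else 0
        xs.append(x)
        parity ^= x  # update the running total mod 2
    return xs

print(parity_process(20))
```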