what is the difference between a markov chain and a random walk?


I am confused about the relation between Markov chains and random walks. Some sources I have looked at claim that all random walks are Markov chains, while other sources say the opposite. What makes a Markov chain a random walk?


There are 2 best solutions below

BEST ANSWER

If you define a random walk as here, that is, $(S_n)_{n\ge 0}$ defined by $S_0 = 0$ and $S_n = \sum_{k=1}^n X_k$ for i.i.d. $(X_k)_{k\ge 1}$, then it is a Markov chain.
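A minimal sketch of this definition (my own illustration, not from the answer): simulate $S_n$ as a running sum of i.i.d. $\pm 1$ steps. The Markov property is visible in the update rule, since $S_{n+1} = S_n + X_{n+1}$ depends on the past only through the current value $S_n$.

```python
import random

def simple_random_walk(n_steps, seed=0):
    """Simulate S_0 = 0, S_n = X_1 + ... + X_n for i.i.d. +/-1 steps.
    This is a Markov chain: the next state depends only on the current one."""
    rng = random.Random(seed)
    s = 0
    path = [s]
    for _ in range(n_steps):
        s += rng.choice([-1, 1])  # X_k is i.i.d., independent of the history
        path.append(s)
    return path

print(simple_random_walk(10))
```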


If you use another definition: judging from the opening lines of the articles on random walks and Markov chains, a Markov chain models one type of random walk, but it does not model all random walks.

You can easily construct random walks that are not Markov chains, e.g. here.
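One hypothetical example of such a walk (my own, not the one the answer links to): a "persistent" walk that tends to repeat its previous step. The position $S_n$ alone is then not a Markov chain, because the law of $S_{n+1}$ depends on the last step taken, i.e. on the pair $(S_{n-1}, S_n)$ rather than on $S_n$ only. (The augmented state (position, last step) is Markov.)

```python
import random

def persistent_walk(n_steps, p_repeat=0.9, seed=0):
    """Walk whose next step repeats the previous step with prob p_repeat.
    Viewed through the position alone, this is NOT a Markov chain:
    knowing S_n is not enough to predict S_{n+1}; the last step matters."""
    rng = random.Random(seed)
    last = rng.choice([-1, 1])  # initial step direction
    s = 0
    path = [s]
    for _ in range(n_steps):
        step = last if rng.random() < p_repeat else -last
        s += step
        last = step
        path.append(s)
    return path

print(persistent_walk(10))
```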

Think about what we mean by randomness.


Epistemologically speaking, a random walk does not by itself carry the defining data of a Markov chain: the transition probabilities. A single realization of a random walk is, to say the least, just a random path in some mathematical space. From that path we cannot go back and deduce the transition probabilities, although we could estimate an empirical transition frequency from each state. On the other hand, a random walk specified a priori (not a particular realization) does seem to possess the qualities of a Markov chain.

This is not to say that there are no distinctions to be made. Consider the most elementary random walk, the birth-and-death chain on the integers: with some probability of moving up or down the chain in single increments, it is indeed a Markov chain. However, consider an animal foraging for food. Its path can be modeled as a random walk, yet the direction the animal faces also figures into which places it visits next, so the state must incorporate more information than its current position alone.
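The foraging example can be sketched as follows (a hypothetical illustration, not from the answer): an animal on a grid that usually keeps its heading and occasionally turns. The position alone is not a Markov chain, because the next position also depends on the heading; augmenting the state to (position, heading) restores the Markov property.

```python
import random

# Four unit headings on the grid: East, North, West, South.
HEADINGS = [(1, 0), (0, 1), (-1, 0), (0, -1)]

def forager_walk(n_steps, seed=0):
    """Grid walk where the state is (position, heading). The animal
    mostly keeps its heading and occasionally turns left or right,
    then steps one unit forward. The position sequence alone is not
    Markov; the pair (position, heading) is."""
    rng = random.Random(seed)
    x, y, h = 0, 0, 0  # start at the origin, facing East
    path = [(x, y)]
    for _ in range(n_steps):
        turn = rng.choice([-1, 0, 0, 1])  # keep heading half the time
        h = (h + turn) % 4
        dx, dy = HEADINGS[h]
        x, y = x + dx, y + dy
        path.append((x, y))
    return path

print(forager_walk(10))
```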