Markov chain - distribution of probability of state at generic step


Let $S$ be a finite discrete state set, and let $X(i) \in S$, $i = 1,2, \ldots$ be a sequence of random variables.

I have built up a Markov transition matrix from a set of sequences of states. State $s_1 \in S$ appears only and always at the beginning of every sequence in the set, and $s_2 \in S$ appears only and always at the end. Of course, $s_2$ is the only absorbing state reachable from $s_1$.
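Such a transition matrix can be estimated by counting consecutive state pairs in the observed sequences. A minimal sketch in Python (the state labels and function name are illustrative, not from the question):

```python
from collections import defaultdict

def transition_matrix(sequences):
    """Estimate transition probabilities by counting consecutive pairs.

    `sequences` is an iterable of state sequences; returns a nested dict
    P[a][b] = estimated P(next state = b | current state = a).
    """
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    return {a: {b: c / sum(row.values()) for b, c in row.items()}
            for a, row in counts.items()}

# Hypothetical data: "start" plays the role of s1, "end" of s2
seqs = [["start", "a", "end"], ["start", "a", "a", "end"], ["start", "end"]]
P = transition_matrix(seqs)
print(P["start"])  # probabilities of leaving the start state
```

Since every sequence starts with $s_1$ and ends with $s_2$, the estimated matrix has no transitions into `"start"` and none out of `"end"`, which is exactly the absorbing structure described above.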

When I plot a histogram of the sequence lengths, I get a distribution that I suspect is related to either a Poisson or a gamma distribution. I suppose my problem is somehow related to an arrival-time problem, but I cannot find theory online that would let me tell whether the distribution is gamma or Poisson. Moreover, my distribution is discrete, not continuous.

Can someone help me?

Best Answer

Every distribution on the positive integers can be realized as the distribution of the hitting time of a state by some countable Markov chain.

To show this, consider some distribution $\nu$ on the positive integers. For every $n\geqslant1$ such that $\nu([n,+\infty))\ne0$, let $$r_n=\frac{\nu(\{n\})}{\nu([n,+\infty))},$$ and note that the numbers $r_n$ for $n\geqslant1$ such that $\nu([n,+\infty))=0$, if any, will not be needed.
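The ratios $r_n$ are the discrete hazard rates of $\nu$. A small sketch computing them exactly for a concrete $\nu$ (the function name and the uniform example are my own choices, not part of the answer):

```python
from fractions import Fraction

def hazard_rates(nu, n_max):
    """Compute r_n = nu({n}) / nu([n, +inf)) for n = 1..n_max.

    `nu` maps positive integers to probabilities; mass beyond n_max is
    assumed zero, so the tail nu([n, +inf)) is a finite sum.
    """
    tail = sum(nu.get(k, Fraction(0)) for k in range(1, n_max + 1))
    rates = {}
    for n in range(1, n_max + 1):
        if tail == 0:
            break  # nu([n, +inf)) = 0: r_n is not needed
        rates[n] = nu.get(n, Fraction(0)) / tail
        tail -= nu.get(n, Fraction(0))
    return rates

# Example: nu uniform on {1, 2, 3, 4}
nu = {n: Fraction(1, 4) for n in range(1, 5)}
print({n: str(r) for n, r in hazard_rates(nu, 4).items()})
# → {1: '1/4', 2: '1/3', 3: '1/2', 4: '1'}
```

Note that $r_n = 1$ exactly when all remaining mass sits at $n$, so the chain below is forced to jump to $0$ at that step.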

Consider a Markov chain $(X_k)_{k\geqslant0}$ on the state space $\{0,1,2,\ldots\}$, such that $X_0=1$, with transition probabilities $r_n$ for $n\to0$ and $1-r_n$ for $n\to n+1$, the other transitions having all probability zero.

Then, every path of the chain is $(1,2,3,\ldots,N-1,N,0,0,0,\ldots)$, for some random $N\geqslant1$ with distribution $$P(N=n)=(1-r_1)(1-r_2)\cdots(1-r_{n-1})r_{n},$$ for every $n\geqslant1$, that is, using $\nu([1,+\infty))=1$,$$P(N=n)=\frac{\nu([2,+\infty))}{\nu([1,+\infty))}\frac{\nu([3,+\infty))}{\nu([2,+\infty))}\cdots\frac{\nu([n,+\infty))}{\nu([n-1,+\infty))}\cdot\frac{\nu(\{n\})}{\nu([n,+\infty))}=\nu(\{n\}).$$ Thus, the hitting time $N\geqslant1$ of the state $0$ by the Markov chain $(X_k)_{k\geqslant0}$ has distribution $\nu$, as desired.