Stationary Distribution of a Stochastic Process


A Markov chain with states $0,1,\ldots$ has transition probabilities $$p_{jk}=e^{-a} \sum_{r=0}^k \left( \begin{matrix} j \\ r \end{matrix} \right) p^r (1-p)^{j-r} a^{k-r} / (k-r)!$$ Show that the limiting distribution $\{ v_k\}$ is Poisson with parameter $a/(1-p)$.

My attempt was using:

$$ \sum_{k=0}^\infty p_{jk} = 1 $$ and

$$ v_k = \sum_{j=0}^\infty v_j p_{jk} $$

But I am at a loss as to how to simplify

$$ v_k = \sum_{j=0}^\infty v_j e^{-a} \sum_{r=0}^k \left( \begin{matrix} j \\ r \end{matrix} \right) p^r (1-p)^{j-r} a^{k-r} / (k-r)! $$


There are 3 answers below.

Accepted answer:

The Markov chain, let's call it $Z=(Z_n)_{n\ge 0}$, looks to be a branching process with immigration. The offspring distribution is Bernoulli with mean $p$ and the immigration distribution is Poisson with mean $a$. That is to say, $$ Z_{n+1} \stackrel{d}{=} Y_{n+1}+\sum_{k=1}^{Z_n}\xi_k^{(n)}, $$ where $Y_{n+1}$ is Poisson($a$), the $\xi_k^{(n)}$ are Bernoulli($p$), and all the $Y_n$s and $\xi_k^{(n)}$s are independent.

If there is to be a stationary distribution for this chain, then a random variable $X$ with that distribution will satisfy the distributional identity $$ X \stackrel{d}{=} Y+\sum_{k=1}^X\xi_k,\qquad\qquad (1) $$ where on the right, $Y$, $X$, and the $\xi_k$s are independent and have the same distributions as their earlier discussed namesakes.

Using generating functions it's not hard to see that if $X$ is Poisson($a/(1-p)$) then (1) is true.
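The generating-function computation can also be carried out symbolically. Here is a sketch (not from the original answer) using sympy: the PGF of Poisson($\lambda$) is $e^{\lambda(s-1)}$, and Bernoulli($p$)-thinning replaces $s$ by $1-p+ps$ in the PGF, so it suffices to compare the exponents on the two sides of (1).

```python
import sympy as sp

s, a, p = sp.symbols('s a p', positive=True)
mu = a / (1 - p)

# Exponent of the PGF of X ~ Poisson(mu): G_X(s) = exp(mu*(s-1))
lhs_exp = mu * (s - 1)

# Right-hand side of (1): Y ~ Poisson(a) contributes exp(a*(s-1));
# the Bernoulli(p)-thinned sum over X substitutes 1 - p + p*s into G_X,
# contributing the exponent mu*((1 - p + p*s) - 1) = mu*p*(s-1).
rhs_exp = a * (s - 1) + mu * (1 - p + p * s - 1)

# The difference simplifies to 0, so the two PGFs agree and (1) holds.
print(sp.simplify(lhs_exp - rhs_exp))  # 0
```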

Because the branching process with Bernoulli($p$) offspring distribution is subcritical and $P(Y=0)>0$, the Markov chain $Z$ is recurrent. Because the associated invariant measure equation $v=v{\bf P}$ has a solution $v$ with finite total mass (namely, the Poisson($a/(1-p)$) distribution), the process $Z$ is in fact positive recurrent (and clearly aperiodic). It follows from Markov chain theory that $$ \lim_n P(Z_n=k) = e^{-\mu}{\mu^k\over k!},\qquad k=0,1,2,\ldots, $$ where $\mu=a/(1-p)$.

Answer:

Let $\mu=a/(1-p)$. Then, interchanging the order of summation and using $\mu(1-p)=a$, $$S_k:=\sum_{j\ge 0} \frac{\mu^j}{j!} p_{jk}= \sum_{r=0}^k \left(\frac{p}{1-p}\right)^r \frac{a^{k-r}}{(k-r)!} \, e^{-a}\sum_{j=r}^\infty {j \choose r} \frac{a^j}{j!}$$ $$ = \sum_{r=0}^k \left(\frac{p}{1-p}\right)^r \frac{a^{k}}{(k-r)! r!} \, e^{-a}\sum_{j=r}^\infty \frac{a^{j-r}}{(j-r)!}.$$ The inner sum equals $e^a$, so by the binomial formula $$S_k = \frac{a^k}{k!} \sum_{r=0}^k \left(\frac{p}{1-p}\right)^r {k \choose r} = \frac{a^k}{k!} \left(1+\frac{p}{1-p}\right)^k \, =\frac{\mu^k}{k!} \,.$$ Dividing both sides by $e^\mu$ shows that Poisson$(\mu)$ is a stationary distribution for the chain. Thus the chain is positive recurrent, and by irreducibility the stationary distribution is unique and is also the limiting distribution for any initial distribution.

Answer:

This is not a complete answer but rather adds some detail to the excellent conceptual explanation via branching with immigration given by John Dawkins. The key identity (1) is not derived there but is relegated to generating functions. Alternatively, (1) can be justified by combining two standard operations on Poisson processes, thinning and superposition; see [1], [2] or [3]. Consider first $\mu$ as an unknown parameter. Thinning a Poisson$(\mu)$ process with retention probability $p$ yields a Poisson$(\mu p)$ process, and superimposing an independent Poisson$(a)$ process then yields a Poisson$(\mu p+a)$ process. Solving $\mu p+a=\mu$ gives $\mu=a/(1-p)$.
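The thinning-and-superposition fixed point can also be illustrated by simulation. This is a hypothetical sketch with arbitrarily chosen parameters $a=2$, $p=1/2$ (so $\mu=a/(1-p)=4$): thin a Poisson($\mu$) count with retention probability $p$, add an independent Poisson($a$) count, and check that the mean is again $\mu$.

```python
import math
import random

random.seed(0)
a, p = 2.0, 0.5
mu = a / (1 - p)  # = 4.0

def poisson(lam):
    # Knuth's multiplicative method for Poisson sampling
    # (adequate for small lam)
    L, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= L:
            return k
        k += 1

n = 200_000
total = 0
for _ in range(n):
    x = poisson(mu)                                        # Poisson(mu) count
    thinned = sum(1 for _ in range(x) if random.random() < p)  # Bernoulli(p) thinning
    total += thinned + poisson(a)                          # superimpose Poisson(a)

print(total / n)  # close to mu = 4.0
```

The exact mean of one step is $\mu p + a = 2 + 2 = 4$, so the empirical mean should agree with $\mu$ up to Monte Carlo error.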

[1] https://en.wikipedia.org/wiki/Point_process_operation

[2] https://www.randomservices.org/random/poisson/Splitting.html

[3] Thinning a Poisson Process