Show that $\mathbb{P}(T\geq n)=p(1-q)^{n-2},$ for $n\geq 2.$


Consider the two-state chain with state space $S=\{1,2\}$ and transition matrix

$$\mathbf{P}=\begin{pmatrix} 1-p & q \\ p & 1-q \end{pmatrix},$$

where $p$ and $q$ are not both $0$. (Note that the columns, not the rows, of $\mathbf{P}$ sum to $1$: column $j$ lists the transition probabilities out of state $j$.) Let $T$ be the first return time to state $1$, for the chain started in $1$.

a) Show that $\mathbb{P}(T\geq n)=p(1-q)^{n-2},$ for $n\geq 2.$

b) Find $\mathbb{E}[T]$ and verify that $\mathbb{E}[T]=1/\pi_1$ where $\pi$ is the stationary distribution of the chain.

a) Since this is a discrete case, we have that

$$\mathbb{P}(T\geq n)=1-\mathbb{P}(T\leq n-1)=1-(\mathbb{P}(T=1)+\dots+\mathbb{P}(T=n-1)),$$

but how do I compute the $\mathbb{P}(T=k)$?

b) If I can obtain an expression for $\mathbb{P}(T>n)$, and given that $T$ is a discrete random variable, I can compute this expectation as

$$\mathbb{E}[T]=\sum_{k=2}^{n}\mathbb{P}(T>k),$$

is this correct? If it is, then I only need help to proceed with a).

Best answer:

$T$ is a random variable taking values in the positive integers together with $\infty$ (the case where the chain never returns to $1$).

In this case, we have the formula $\mathbb E[T] = \sum_{k=1}^\infty \mathbb P(T \geq k)$. Therefore, part (b) follows from part (a).
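As a quick sanity check of the tail-sum identity (not part of the original argument), one can compare the empirical mean of a sample with the empirical tail sum; the two agree exactly, because the identity also holds for the empirical distribution. The geometric variable and the parameter $r=0.4$ below are assumed purely for illustration:

```python
import random

# Sanity check of the tail-sum identity E[T] = sum_{k>=1} P(T >= k)
# for a positive-integer random variable, using a geometric variable
# (an assumed example, not the two-state chain itself).
random.seed(1)
r = 0.4
samples = [next(k for k in range(1, 10_000) if random.random() < r)
           for _ in range(100_000)]

mean_direct = sum(samples) / len(samples)
tail_sum = sum(sum(t >= k for t in samples) / len(samples)
               for k in range(1, max(samples) + 1))

# The identity holds exactly for the empirical distribution, and both
# quantities estimate E[T] = 1/r = 2.5.
print(mean_direct, tail_sum)
```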

As for part (a) itself, naturally $T \geq n$ if and only if the chain is in state $2$ at every time from $1$ to $n-1$. This is just the definition of the first return time: the first return time to $1$, starting from $1$, is at least $n$ if and only if the chain does not visit state $1$ at any of the times $1,\dots,n-1$, which (with only two states) means the chain is in state $2$ at times $1$ through $n-1$.

Note that a transition $1\to 1$ at the first step is already a return: the chain is back in state $1$ at time $1$, so $T=1$. The first return time does not ignore immediate repeats and wait for the chain to leave and come back; a repeat is a return time of $1$.

But for the chain to be in state $2$ at times $1$ through $n-1$, it has to transition like $$ 1 \xrightarrow{1} 2 \xrightarrow{2} 2 \xrightarrow{3} 2\ \cdots \xrightarrow{n-1} 2 \xrightarrow{n} \mbox{anywhere}, $$ where the label over each arrow is the time at which that transition is completed.

which means that the desired probability is the probability that these transitions occur in that order. Reading the transition probabilities off the columns of $\mathbf{P}$, the step $1 \to 2$ has probability $p$, and each of the $n-2$ steps $2 \to 2$ has probability $1-q$. This gives $\mathbb P(T\geq n) = p(1-q)^{n-2}$.
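This can be checked numerically. The sketch below (parameter values $p=0.3$, $q=0.5$ are assumed for illustration only) simulates the chain, reading the transition probabilities out of each state off the columns of $\mathbf P$, and compares the empirical tail probability with the formula:

```python
import random

# Parameter values assumed purely for illustration.
p, q = 0.3, 0.5

def first_return_time(p, q):
    """First time the chain, started in state 1, is back in state 1.

    Transition probabilities come from the columns of P:
    from 1 the chain moves to 2 with probability p,
    from 2 it moves back to 1 with probability q."""
    t, state = 0, 1
    while True:
        t += 1
        if state == 1:
            state = 2 if random.random() < p else 1
        else:
            state = 1 if random.random() < q else 2
        if state == 1:
            return t

random.seed(0)
samples = [first_return_time(p, q) for _ in range(200_000)]

n = 4
estimate = sum(t >= n for t in samples) / len(samples)
exact = p * (1 - q) ** (n - 2)   # the formula from part (a)
print(estimate, exact)           # the two should agree to ~2 decimals
```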

Therefore, by our expectation formula, the expected return time is $\mathbb P(T \geq 1) + \sum_{k=2}^\infty \mathbb P(T \geq k)$, which leads to $1 + p\sum_{n=2}^\infty(1-q)^{n-2}$ (the splitting of $k = 1$ and $k\geq 2$ is important here: $T \geq 1$ happens with probability $1$, and the formula from part (a) only holds for $n \geq 2$). By the geometric series formula, this equals $1+\frac pq = \frac{p+q}{q}$.

Now, the stationary distribution for the given Markov chain can be found from the definition of a stationary distribution, $\mathbf P\pi = \pi$ (with $\pi$ a column vector, since $\mathbf P$ is column-stochastic): $$(\pi_1,\pi_2) = ((1-p)\pi_1+q\pi_2,\; p\pi_1+(1-q)\pi_2)$$

Both equations reduce to $p\pi_1 = q\pi_2$, and noting that $\pi_1+\pi_2 = 1$ allows us to get $\pi = \left(\frac{q}{p+q},\frac{p}{p+q}\right)$. Hence $1/\pi_1 = \frac{p+q}{q} = \mathbb E[T]$, which verifies part (b).
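A numeric check of part (b), again with the assumed values $p=0.3$, $q=0.5$: solve $\mathbf P\pi = \pi$ together with the normalisation $\pi_1+\pi_2=1$, and compare $1/\pi_1$ with $\mathbb E[T] = 1 + p/q$.

```python
import numpy as np

p, q = 0.3, 0.5  # assumed illustrative values

# Column-stochastic transition matrix: column j holds the
# transition probabilities out of state j.
P = np.array([[1 - p, q],
              [p, 1 - q]])

# Stationary distribution: (P - I) pi = 0 plus the normalisation row,
# solved as an (overdetermined but consistent) least-squares system.
A = np.vstack([P - np.eye(2), np.ones((1, 2))])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

expected_return = 1 + p / q        # E[T] from the geometric series
print(pi)                          # [q/(p+q), p/(p+q)] = [0.625, 0.375]
print(1 / pi[0], expected_return)  # both equal (p+q)/q = 1.6
```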