Martingale and Absorbing states in a Markov Chain


I have a problem with this question. I have already proved a), but I can't get through b).
Define $ (X_n)_{n \in \mathbb{N}} $ to be a Markov chain with state space $S = \{0,1, \dots , N \} $ and transition probabilities such that, for any state $x$,

$$ \mathbb{E} (X_{n+1} | X_n = x) = \sum_{y=0}^{N} y P(x,y) = x$$

Prove:

a) $ (X_n)_{n \in \mathbb{N}} $ is a martingale, and
b) $0$ and $N$ are absorbing states.

For proving b), I thought of using the martingale property, but it did not lead anywhere. I also tried an approach using the law of total expectation, but I got stuck.

I would appreciate any help. Thank you in advance!


There are 2 answers below.

BEST ANSWER

I have now managed to solve b), so I post my answer for everyone.

$\textbf{Answer for a)}$

First, by construction, $(X_n)_{n \in \mathbb{N}}$ is adapted to its natural filtration $\mathcal{F}_n^X = \sigma( X_0, \dots , X_n )$, $n \in \mathbb{N}$.

Second, let's check that $X_n \in L^1$. Since the state space is $S = \{0,1, \dots , N\}$, we obtain

$$\mathbb{E} (|X_n|) = \mathbb{E} (X_n) = \sum_{x=0}^{N} x \mathbb{P}(X_n = x) \leq \sum_{x=0}^{N} x = \dfrac{N(N+1)}{2} < + \infty$$

As a result, $X_n \in L^1$.

Finally, let's verify the martingale property. We can see that

$$\mathbb{E} (X_{n+1} | \mathcal{F}_n^X) = \mathbb{E}(X_{n+1}| X_0,X_1, \dots, X_n)$$

Since $(X_n)_{n \in \mathbb{N}}$ is a Markov chain, for all states $x_0, x_1, \dots , x_n$ we have

$$\begin{split} \mathbb{E}(X_{n+1}| X_0 = x_0, \dots, X_n = x_n) & = \sum_{y=0}^{N} y \mathbb{P}(X_{n+1}=y|X_0 = x_0, \dots, X_n = x_n)\\ & = \sum_{y=0}^N y \mathbb{P} ( X_{n+1} = y| X_n = x_n) \\ & = \mathbb{E} (X_{n+1} | X_n = x_n) = x_n \end{split}$$

The second equality uses the Markov property, and the last one holds by hypothesis. Then we conclude that

$$ \mathbb{E}(X_{n+1}| X_0 = x_0, \dots, X_n = x_n) = x_n$$

Since this holds for every choice of $x_0, \dots, x_n$,

$$ \mathbb{E}(X_{n+1}| X_0 , \dots, X_n ) = X_n \iff \mathbb{E}(X_{n+1}| \mathcal{F}_n^X) = X_n$$

We conclude that $(X_n)_{n \in \mathbb{N}}$ is a martingale.
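As a numerical sanity check (illustrative only, not part of the proof), consider a hypothetical concrete example: the fair gambler's-ruin chain on $\{0, \dots, N\}$, which moves from an interior state $x$ to $x \pm 1$ with probability $1/2$ each. Its transition matrix satisfies the hypothesis $\sum_y y\,P(x,y) = x$, and a Monte Carlo sample of one step from a fixed state has mean close to that state:

```python
import numpy as np

N = 10  # hypothetical state space {0, ..., N}

# Fair gambler's-ruin transition matrix: from 0 < x < N move to
# x-1 or x+1 with probability 1/2 each; 0 and N are absorbing.
P = np.zeros((N + 1, N + 1))
P[0, 0] = 1.0
P[N, N] = 1.0
for x in range(1, N):
    P[x, x - 1] = 0.5
    P[x, x + 1] = 0.5

y = np.arange(N + 1)

# Hypothesis of the problem: E[X_{n+1} | X_n = x] = sum_y y P(x, y) = x
one_step_means = P @ y
assert np.allclose(one_step_means, np.arange(N + 1))

# Monte Carlo: simulate one step from x = 4; the sample mean of
# X_{n+1} should be close to x by the law of large numbers.
rng = np.random.default_rng(0)
x = 4
samples = rng.choice(y, size=100_000, p=P[x])
print(abs(samples.mean() - x))  # small
```

Any chain satisfying the hypothesis would pass the `one_step_means` check; the gambler's-ruin matrix is just one convenient instance.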

$\textbf{Answer for b)}$

First, consider the state $x=0$. By hypothesis, we get

$$\sum_{y=0}^N y P(0,y) = 0 \iff 0P(0,0) + \sum_{y=1}^N y P(0,y) = 0 \tag{1}$$

Recall that $P(x,y)$ denotes the one-step transition probability from $x$ to $y$. Since $P$ is a stochastic matrix, we know that, for any state $x$,

$$ \sum_{y=0}^N P(x,y) = 1 \tag{2} $$

In $(1)$, every term $y P(0,y)$ with $y \geq 1$ is nonnegative and the sum equals $0$, so $P(0,y) = 0$ for all $y \geq 1$. Combining this with $(2)$, we get $P(0,0)=1$.

Second, for the last state $x=N$ we get that

$$ \sum_{y=0}^N y P(N,y) = N \iff \sum_{y=0}^N \frac{y}{N} P(N,y) = 1 \tag{3} $$

Now, if we combine the equations $(2)$ and $(3)$ we get that

$$\sum_{y=0}^N \Big( 1 - \frac{y}{N} \Big) P(N,y) = 0$$

$$\sum_{y=0}^{N-1} \Big( 1 - \frac{y}{N} \Big) P(N,y) + \Big( 1 - \frac{N}{N} \Big) P(N,N) = 0$$

Every coefficient $1 - \frac{y}{N}$ with $y < N$ is strictly positive and every $P(N,y)$ is nonnegative, so $P(N,y) = 0$ for all $y < N$. By $(2)$, we obtain $P(N,N) = 1$.

So we conclude that $x=0$ and $x=N$ are both absorbing states of the Markov chain.
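The same conclusion can be checked mechanically on a second hypothetical example that satisfies the hypothesis: the Wright-Fisher chain, where given $X_n = x$ the next state is $\mathrm{Binomial}(N, x/N)$, so $\mathbb{E}(X_{n+1} \mid X_n = x) = N \cdot \frac{x}{N} = x$. A short sketch:

```python
import numpy as np
from math import comb

N = 8  # hypothetical population size

# Wright-Fisher chain on {0, ..., N}: given X_n = x, X_{n+1} is
# Binomial(N, x/N), whose mean is N * (x/N) = x.
P = np.array([[comb(N, y) * (x / N) ** y * (1 - x / N) ** (N - y)
               for y in range(N + 1)] for x in range(N + 1)])

states = np.arange(N + 1)
assert np.allclose(P.sum(axis=1), 1.0)   # stochastic matrix, eq. (2)
assert np.allclose(P @ states, states)   # martingale hypothesis

# As the proof predicts, the endpoints are absorbing:
print(P[0, 0], P[N, N])  # both equal 1
```

Here the absorption at $0$ and $N$ is not built in by hand; it falls out of the binomial transition probabilities, exactly as the argument above forces it to.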

ANOTHER ANSWER

You have, by specializing to $x=0$, $$ 0=\sum_{y=0}^N yP(0,y). $$ What conclusion can be drawn from this?