Example of an inhomogeneous Markov chain


I have found a very interesting exercise about inhomogeneous Markov chains, which may also interest you; I hope someone can solve it.

Let $X_{1},X_{2},\ldots$ be an inhomogeneous Markov chain with state space $\{0,1\}$ and transition probabilities

$$ \mathbb{P}(X_{n+1} = 1 | X_{n} = 0) = \frac{1}{n^{2}}\\ \mathbb{P}(X_{n+1} = 0 | X_{n} = 1) = \frac{1}{n^{2}}. $$

Show that:

  1. $X := \lim\limits_{n \to \infty} X_{n}$ exists almost surely.
  2. If $\mathbb{P}(X_{1} = 1) = \mathbb{P}(X_{1} = 0) = \frac{1}{2}$, then $\mathbb{P}(X = 1) = \frac{1}{2}$.
  3. If $X_{1} \equiv 1$ then $\mathbb{P}(X = 1) > \frac{1}{2}$.

I'm very interested in how you would solve it. The solution was discussed in the lecture, but I hope to see a different approach, perhaps using matrix computations.
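Before looking at solutions, the three claims can be sanity-checked empirically. Below is a minimal Monte Carlo sketch (the function names `simulate_chain` and `estimate_p1` are mine, not from the exercise); it truncates each path after a few hundred steps, which is harmless since the flip probabilities beyond that point sum to well under $0.01$.

```python
import random

def simulate_chain(x1, n_steps=300):
    """One path of the chain: at time n, flip the state with probability 1/n^2."""
    x = x1
    for n in range(1, n_steps):
        if random.random() < 1.0 / (n * n):
            x ^= 1  # flip 0 <-> 1
    return x

def estimate_p1(x1_sampler, trials=10_000):
    """Monte Carlo estimate of P(X = 1) for a given initial-state sampler."""
    return sum(simulate_chain(x1_sampler()) for _ in range(trials)) / trials

if __name__ == "__main__":
    random.seed(0)
    # Claim 2: uniform start, estimate should be close to 1/2.
    print(estimate_p1(lambda: random.randint(0, 1)))
    # Claim 3: deterministic start X_1 = 1.
    print(estimate_p1(lambda: 1))
```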


Accepted answer

0. Model:

The probabilities of a flip from $0$ to $1$ and from $1$ to $0$ coincide, hence the whole sequence $(X_n)$ can be realized pathwise as $$X_n=X_1+\sum_{k=1}^{n-1}Y_k\pmod2$$ for every $n\geqslant1$, where $(Y_k)$ is a sequence of independent Bernoulli random variables with, for each $k$, $$P(Y_k=1)=1-P(Y_k=0)=\frac1{k^2}.$$

1. Convergence:

The series with positive terms $\sum\limits_kE(Y_k)$ converges, hence the random series $\sum\limits_kY_k$ converges almost surely; that is, almost surely at most finitely many of the $Y_k$ are nonzero (by the first Borel–Cantelli lemma). This proves the existence of $X=X_1+Y\pmod2$, where $$Y=\sum_{k=1}^\infty Y_k.$$
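The convergence step can also be checked numerically: the expected total number of flips is $\sum_k 1/k^2 = \pi^2/6 \approx 1.645$, and the probability of any flip after time $K$ is at most $\sum_{k>K} 1/k^2 \leqslant 1/K$. A small sketch (helper names are mine):

```python
import math

def expected_flips(n_terms):
    """Partial sum of E[Y_k] = 1/k^2: expected number of flips up to time n_terms."""
    return sum(1.0 / (k * k) for k in range(1, n_terms + 1))

def tail_bound(K):
    """Upper bound on P(any flip after time K): sum_{k>K} 1/k^2 <= 1/K."""
    return 1.0 / K

# The partial sums approach pi^2/6, so almost surely only finitely many flips occur.
```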

2. Symmetric case:

For every $Y$ independent of $X_1$, if $X_1$ is uniform on $\{0,1\}$ then $X_1+Y\pmod2$ is also uniform on $\{0,1\}$. Thus, if $P(X_1=1)=P(X_1=0)=\frac12$ then, indeed, $X$ and every $X_n$ are uniform on $\{0,1\}$, in particular, $P(X=1)=P(X=0)=\frac12$.
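The key fact here, that adding any independent bit to a uniform bit mod $2$ leaves it uniform, can be verified by direct enumeration for an arbitrary flip parity (the function name is mine):

```python
def p_x_equals_1(p_x1_is_1, p_y_odd):
    """P(X1 + Y = 1 mod 2) for independent X1 ~ Bernoulli(p_x1_is_1), P(Y odd) = p_y_odd."""
    # X = 1 iff exactly one of {X1 = 1, Y odd} occurs.
    return p_x1_is_1 * (1 - p_y_odd) + (1 - p_x1_is_1) * p_y_odd

# With a uniform X1 the answer is 1/2 regardless of the law of Y:
# 0.5*(1-p) + 0.5*p = 0.5 for every p.
```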

3. Case $X_1=1$ almost surely:

For every integer-valued random variable $S$, $$2P(S\ \text{even})=1+E((-1)^S).$$ Since $X=X_1+Y\pmod2$ and $X_1=1$ almost surely, $X=1$ exactly when $Y$ is even, hence $$2P(X=1)=1+E((-1)^{Y})=1+\prod_{k=1}^\infty E((-1)^{Y_k})=1+\prod_{k=1}^\infty\left(1-2P(Y_k=1)\right),$$ so 3. is equivalent to the assertion that $$\prod_{k=1}^\infty\left(1-\frac2{k^2}\right)>0.$$ Every factor in this product is positive except the $k=1$ factor, which equals $-1$, hence the product is negative and, with the transition probabilities exactly as stated, $P(X=1)<\frac12$: the deterministic swap at $n=1$ sends the chain from $1$ to $0$, after which it is more likely to stay at $0$. Claim 3. does hold if the first, deterministic transition is excluded, e.g. if the flip probabilities $\frac1{n^2}$ only apply for $n\geqslant2$, since $$\prod_{k=2}^\infty\left(1-\frac2{k^2}\right)>0.$$
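The infinite product appearing here has a closed form: by Euler's sine product $\prod_{k\geqslant1}\left(1-\frac{x^2}{k^2}\right)=\frac{\sin\pi x}{\pi x}$ with $x=\sqrt2$, it equals $\frac{\sin(\sqrt2\,\pi)}{\sqrt2\,\pi}\approx-0.217$. A quick numerical check (the truncation level is my choice):

```python
import math

def partial_product(n_terms):
    """Truncated product prod_{k=1}^{n} (1 - 2/k^2); the k = 1 factor equals -1."""
    prod = 1.0
    for k in range(1, n_terms + 1):
        prod *= 1.0 - 2.0 / (k * k)
    return prod

# Closed form via Euler's sine product with x = sqrt(2).
closed_form = math.sin(math.sqrt(2) * math.pi) / (math.sqrt(2) * math.pi)
```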

Another answer

The transition matrices are $$ P_n = \begin{pmatrix} 1-\frac{1}{n^2} & \frac{1}{n^2} \\ \frac{1}{n^2} & 1-\frac{1}{n^2} \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}+\frac{1}{n^2}\begin{pmatrix} -1 & 1 \\ 1 & -1 \end{pmatrix}, $$ that is, $$ P_n = I + \frac{1}{n^2}A. $$ Since every $P_n$ is a function of the same matrix $A$, the matrices $P_n$ can be simultaneously diagonalized. For $v_1 = (1,1)^t$ we have $Av_1 = 0$, and for $v_2 = (1,-1)^t$ we have $Av_2 = -2 v_2$, hence $P_nv_1=v_1$ and $P_nv_2=\left(1-\frac{2}{n^2}\right)v_2$.

To find the limiting transition matrix we therefore need $$ c := \lim_{n\to\infty}\prod_{i=1}^n\left( 1 - \frac{2}{i^2} \right) \approx -0.217, $$ see https://www.wolframalpha.com/input/?i=prod+(1-2%2Fi%5E2). Writing $H=\begin{pmatrix}1&1\\1&-1\end{pmatrix}$ (note $H^2=2I$, so $H^{-1}=\frac12H$), the limiting matrix is $$ \lim_{n\to\infty}\prod_{i=1}^nP_i = \frac{1}{2}\begin{pmatrix}1&1\\1&-1\end{pmatrix}\begin{pmatrix}1&0\\0&c\end{pmatrix}\begin{pmatrix}1&1\\1&-1\end{pmatrix} = \begin{pmatrix}\frac{1+c}{2}&\frac{1-c}{2}\\\frac{1-c}{2}&\frac{1+c}{2}\end{pmatrix}. $$

The convergence of the product proves the existence of the limit, which is claim 1. The limiting matrix is doubly stochastic, so a uniform initial distribution stays uniform, which is claim 2. For claim 3, starting from $X_1\equiv1$ gives $P(X=1)=\frac{1+c}{2}\approx0.39<\frac12$, so claim 3 looks wrong as stated. The culprit is the case $n=1$: $P_1$ deterministically swaps the two states. If that step is excluded (i.e. the flip probabilities $\frac{1}{n^2}$ only apply for $n\geqslant2$), the relevant product is $\prod_{i=2}^\infty\left(1-\frac{2}{i^2}\right)=-c\approx0.217>0$, and then $P(X=1)=\frac{1-c}{2}\approx0.61>\frac12$, as claim 3 asserts.
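The matrix-product limit can be confirmed by multiplying the $P_i$ directly; a sketch (truncation level and helper names are mine):

```python
def transition_matrix(n):
    """P_n as a 2x2 list of rows, with flip probability 1/n^2."""
    p = 1.0 / (n * n)
    return [[1 - p, p], [p, 1 - p]]

def mat_mul(a, b):
    """Product of two 2x2 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def product_up_to(n_max):
    """Compute P_1 P_2 ... P_{n_max}."""
    prod = [[1.0, 0.0], [0.0, 1.0]]  # identity
    for n in range(1, n_max + 1):
        prod = mat_mul(prod, transition_matrix(n))
    return prod
```

The entries converge to $\frac{1\pm c}{2}$ with $c\approx-0.217$: roughly $0.39$ on the diagonal and $0.61$ off it, and each row still sums to $1$.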