Relationship between $\exp (t P)=\sum_{n=0}^{\infty} \frac{t^{n}}{n !} P^{n}$ and transition probability matrix


I am having trouble understanding the relationship between a transition probability matrix and $\exp (t P)=\sum_{n=0}^{\infty} \frac{t^{n}}{n !} P^{n}$.

I have a transition probability matrix as below:

$P=\left(\begin{array}{ccc}p & p & 1-2 p \\ p & 1-2 p & p \\ 1-2 p & p & p\end{array}\right)$

I found that $\lim _{n \rightarrow \infty} P^{n}$ = $\left(\begin{array}{ccc}\frac{1}{3} & \frac{1}{3} & \frac{1}{3} \\ \frac{1}{3} & \frac{1}{3} & \frac{1}{3} \\ \frac{1}{3} & \frac{1}{3} & \frac{1}{3} \end{array}\right)$.
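This limit is easy to check numerically; a minimal sketch, assuming numpy is available and using the illustrative value $p=0.2$:

```python
import numpy as np

p = 0.2  # illustrative value in [0, 1/2]
P = np.array([[p,       p,       1 - 2*p],
              [p,       1 - 2*p, p      ],
              [1 - 2*p, p,       p      ]])

# P^n for a large n should be close to the matrix with all entries 1/3.
Pn = np.linalg.matrix_power(P, 50)
print(np.allclose(Pn, np.full((3, 3), 1/3)))  # True
```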

My intuition says $\exp (t P)=\sum_{n=0}^{\infty} \frac{t^{n}}{n !} P^{n} = \lim _{n \rightarrow \infty} P^{n}\,,$ but I am not sure how to show it, or whether I am even on the right path.


Your matrix $P$ is not always a transition probability matrix: for $p>1/2\,,$ the entries equal to $1-2p$ are negative. Let's assume $p\in[0,1/2]\,.$ Writing $$ P^\infty:=\frac{1}{3}\left(\begin{matrix}1&1&1\\1&1&1\\1&1&1\end{matrix}\right) $$ it is quite easy to see that $PP^\infty=P^\infty P=P^\infty\,,$ which is consistent with $\lim_{n\to\infty}P^n=P^\infty\,:$ the limit, if it exists, must be a fixed point of multiplication by $P\,.$
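The fixed-point identity $PP^\infty=P^\infty P=P^\infty$ can also be verified numerically; a sketch assuming numpy, with the arbitrary choice $p=0.25$:

```python
import numpy as np

p = 0.25
P = np.array([[p,       p,       1 - 2*p],
              [p,       1 - 2*p, p      ],
              [1 - 2*p, p,       p      ]])
P_inf = np.full((3, 3), 1/3)  # the candidate limit matrix, all entries 1/3

# Each row of P sums to 1, so P @ P_inf = P_inf;
# each column of P sums to 1 (P is doubly stochastic), so P_inf @ P = P_inf.
print(np.allclose(P @ P_inf, P_inf))  # True
print(np.allclose(P_inf @ P, P_inf))  # True
```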

It is not true that $\lim_{n\to\infty}P^n=\exp(tP)$ holds, simply because the exponential depends on $t$ and the limit does not. Indeed, since the rows of $P^n$ sum to $1$ for every $n\,,$ the rows of $\exp(tP)$ all sum to $e^t\,,$ so $\exp(tP)$ is not even a stochastic matrix for $t\neq 0\,.$
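This can be seen concretely: since $P$ is symmetric, $\exp(tP)=V\operatorname{diag}(e^{t\lambda_i})V^\top$ for an orthogonal $V\,.$ A sketch assuming numpy, with illustrative values $p=0.2$ and $t=1.5$:

```python
import numpy as np

p = 0.2
P = np.array([[p,       p,       1 - 2*p],
              [p,       1 - 2*p, p      ],
              [1 - 2*p, p,       p      ]])

def exp_tP(t):
    # exp(tP) via the spectral decomposition of the symmetric matrix P.
    lam, V = np.linalg.eigh(P)
    return V @ np.diag(np.exp(t * lam)) @ V.T

t = 1.5
E = exp_tP(t)
# Row sums of exp(tP) are e^t, so exp(tP) is not a stochastic matrix...
print(np.allclose(E.sum(axis=1), np.exp(t)))  # True
# ...and in particular it does not equal the limit of P^n.
print(np.allclose(E, np.full((3, 3), 1/3)))   # False
```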

When $P$ has a matrix logarithm (which is not always the case), then $$ P^n=\exp(n\log P) $$ which is probably what you were after. Since your $P$ is real and symmetric, it is Hermitian and therefore has real eigenvalues (obviously $v_1=(1,1,1)$ is always an eigenvector of $P$ with eigenvalue $\lambda_1=1$)$\,.$ It also follows that $P$ is orthogonally diagonalizable: $D=V^\top PV\,,$ where the columns of $V$ are orthonormal eigenvectors of $P$ and the diagonal elements of $D$ are the eigenvalues $\lambda_1,\lambda_2,\lambda_3\,.$ The (principal, real) matrix logarithm of such a matrix is defined as $$ \log P=V\left(\begin{matrix}\log\lambda_1&0&0\\0&\log\lambda_2&0\\0&0&\log\lambda_3\end{matrix}\right)V^\top $$ (see Wikipedia). Obviously this exists if and only if all eigenvalues of $P$ are strictly positive. We also know from $\lambda_1=1$ that $\log\lambda_1=0\,.$
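The diagonalization, and the resulting identity $P^n = V D^n V^\top\,,$ can be sketched with numpy's symmetric eigensolver (again with the illustrative value $p=0.2$):

```python
import numpy as np

p = 0.2
P = np.array([[p,       p,       1 - 2*p],
              [p,       1 - 2*p, p      ],
              [1 - 2*p, p,       p      ]])

# eigh is for symmetric matrices: V is orthogonal, lam holds real eigenvalues.
lam, V = np.linalg.eigh(P)
print(np.allclose(V @ np.diag(lam) @ V.T, P))  # True: P = V D V^T

# Powers of P come from powers of the eigenvalues: P^n = V D^n V^T.
n = 7
print(np.allclose(V @ np.diag(lam**n) @ V.T,
                  np.linalg.matrix_power(P, n)))  # True
```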

It is not hard to see that the remaining eigenvalues of $P$ are $\lambda_2=3p-1$ (eigenvector $(1,0,-1)$) and $\lambda_3=1-3p$ (eigenvector $(1,-2,1)$); as a sanity check, $\lambda_1+\lambda_2+\lambda_3=1=\operatorname{tr}P\,.$ Since $\lambda_2=-\lambda_3\,,$ one of them is negative whenever $p\neq 1/3$ (and both vanish at $p=1/3$), so the eigenvalues of $P$ are never all strictly positive, and $\log P$ as constructed above does not exist for any $p\in[0,1/2]\,.$ In fact, $\det P=\lambda_1\lambda_2\lambda_3=-(1-3p)^2\le 0\,,$ while any matrix of the form $e^Q$ satisfies $\det e^Q=e^{\operatorname{tr}Q}>0\,;$ hence no so-called generator matrix $Q$ with $P=e^Q$ exists for this $P\,.$ Looking at the papers [1] and [2] and references therein, it is apparent that, in general dimensions and for Markov matrices that are only stochastic and not necessarily doubly stochastic like yours, the question of whether a generator matrix exists is quite a difficult one.
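A direct numerical check of the spectrum and the determinant of $P$ (a sketch assuming numpy; the values of $p$ are arbitrary illustrative choices):

```python
import numpy as np

for p in (0.1, 0.2, 0.45):
    P = np.array([[p,       p,       1 - 2*p],
                  [p,       1 - 2*p, p      ],
                  [1 - 2*p, p,       p      ]])
    lam = np.linalg.eigvalsh(P)  # eigenvalues of a symmetric matrix, ascending
    # The spectrum is {3p-1, 1-3p, 1}.
    print(np.allclose(lam, np.sort([3*p - 1, 1 - 3*p, 1.0])))  # True
    # det P = -(1-3p)^2 <= 0, so P cannot be of the form e^Q (det e^Q > 0).
    print(np.linalg.det(P) <= 1e-12)  # True
```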

[1] https://arxiv.org/pdf/1001.1693.pdf

[2] https://arxiv.org/pdf/1605.03502.pdf