Question on MIT video on Markov matrices


Markov matrices are pretty new to me and I'm a little rusty with my linear algebra. My question stems from watching this video from YouTube on Markov matrices. For those who wish to skip the video, the following notation was used. The location of a particle after $n$ steps is $p_n$, $A$ was a Markov matrix for the process and $p_\infty=\lim\limits_{n\to\infty}p_n$.

During the video he claims that every Markov matrix has an eigenvalue of $1$, but does not explain why (perhaps it was covered in a previous lecture). More importantly, at the very end, he seems to strongly imply, but does not come out and say, that $p_\infty$ is always a multiple of the eigenvector corresponding to the eigenvalue $1$. If this is true, is it because, if the limit exists, we want

$$Ap_\infty=p_\infty\text?$$


BEST ANSWER

Usually, people work with a right stochastic matrix, i.e. one in which each row consists of non-negative real numbers that sum to $1$.

The instructor in the video uses a left stochastic matrix, i.e. one in which each column consists of non-negative real numbers that sum to $1$.

Following the instructor's convention, note that for an $n \times n$ left stochastic matrix $A$ we have $$\begin{bmatrix} 1 & 1 & \cdots & 1 \end{bmatrix}A = \begin{bmatrix} 1 & 1 & \cdots & 1 \end{bmatrix}$$ since each column of $A$ sums to $1$. Hence, $1$ is a left eigenvalue of $A$, and $\begin{bmatrix} 1 & 1 & \cdots & 1 \end{bmatrix}$ is a left eigenvector corresponding to the eigenvalue $1$. Recall that the left and right eigenvalues of $A$ are the same. Hence, $1$ is an eigenvalue of $A$.
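This is easy to verify numerically. The sketch below uses a small made-up left stochastic matrix (the specific entries are just an example, not from the video) and checks both the left-eigenvector identity and that $1$ appears among the eigenvalues:

```python
import numpy as np

# A hypothetical 3x3 left (column-)stochastic matrix: each column sums to 1.
A = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.6, 0.3],
              [0.2, 0.2, 0.4]])

ones = np.ones(3)

# The all-ones row vector is a left eigenvector with eigenvalue 1:
# each entry of ones @ A is a column sum of A, i.e. 1.
print(ones @ A)  # → [1. 1. 1.]

# The left and right eigenvalues coincide, so 1 is an eigenvalue of A:
print(np.isclose(np.linalg.eigvals(A), 1).any())  # → True
```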

Now, at "steady state", we have $$p_{\infty}(i) = \sum_{j} A(i,j)p_{\infty}(j).$$ Hence, we have $$Ap_{\infty} = p_{\infty},$$ i.e. $p_{\infty}$ is a right eigenvector corresponding to the eigenvalue $1$ of the left stochastic matrix.

ANSWER

A stochastic matrix is one in which each row sums to $1$. A Markov matrix is a stochastic matrix. Now, notice that the vector $v=(1,1,\cdots,1)\in \mathbb R^n$ is an eigenvector of any stochastic $n\times n$ matrix $M$ with eigenvalue $1$: when you compute $M\cdot v$, each entry of the result is the sum of the corresponding row of $M$, which is $1$ by stochasticity, so $M\cdot v=v$. So, any Markov matrix has eigenvalue $1$.

Now, a stationary distribution $p_\infty$ satisfies $M\cdot p_\infty =p_\infty$, so $p_\infty$ is an eigenvector corresponding to the eigenvalue $1$. However, $p_\infty$ should also be a distribution, so it should have non-negative entries summing to $1$. Such a vector does not necessarily have to be a multiple of the eigenvector above (the eigenspace corresponding to the eigenvalue $1$ could have dimension greater than $1$).
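The last point has a simple concrete instance (my example, not from the answer): the identity matrix is stochastic, its eigenspace for eigenvalue $1$ is all of $\mathbb R^n$, and every distribution is stationary, yet most of them are not multiples of $(1,\dots,1)$:

```python
import numpy as np

# The 2x2 identity matrix is stochastic (each row sums to 1), and its
# eigenspace for eigenvalue 1 is all of R^2.
M = np.eye(2)

# (1, 1) is an eigenvector with eigenvalue 1:
v = np.ones(2)
print(np.allclose(M @ v, v))  # → True

# But (0.3, 0.7) is also a stationary distribution, and it is not a
# multiple of (1, 1):
p = np.array([0.3, 0.7])
print(np.allclose(M @ p, p))  # → True
```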