Markov Matrix with nth Power


I found an MIT video here (the relevant part is between 5:18 and 9:52). The problem is to find $p_n$ given $A$ and $p_0$.

$$A = \begin{bmatrix} 0.6 & 0.2 \\ 0.4 & 0.8 \\ \end{bmatrix}$$

$$p_0 = \begin{bmatrix} 1 \\ 0 \\ \end{bmatrix}$$

1) The instructor in the video solved like this:

$$p_n = A^n p_0 = P D^n P^{-1} p_0,$$ where the columns of $P$ are the eigenvectors of $A$ and $D$ is the diagonal matrix of eigenvalues.
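As a quick numerical check of the diagonalization approach (a sketch using NumPy, with the matrices copied from the problem above):

```python
import numpy as np

A = np.array([[0.6, 0.2],
              [0.4, 0.8]])
p0 = np.array([1.0, 0.0])

# Diagonalize A: the columns of P are eigenvectors, eigvals go on the diagonal of D.
eigvals, P = np.linalg.eig(A)

n = 20
# A^n p0 computed via P D^n P^{-1} p0 ...
pn_diag = P @ np.diag(eigvals**n) @ np.linalg.inv(P) @ p0
# ... agrees with the direct matrix power A^n p0.
pn_direct = np.linalg.matrix_power(A, n) @ p0

print(pn_diag)
```

Since the eigenvalues of $A$ are $1$ and $0.4$, the $0.4^n$ term dies out quickly, and for moderate $n$ the result is already close to the limiting vector.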

2) I would do it like this:

$p_n = A^n \cdot p_0 = \lambda^n \cdot p_k$

$p_k$ here is the eigenvector corresponding to the eigenvalue $\lambda$. I would take $\lambda = 1$.

So $p_n = 1^n \cdot \langle 1, 2 \rangle = \langle 1, 2 \rangle$.

But the solution he gave was $(1/3) \cdot \langle 1, 2 \rangle$. So apparently our answers are different.

So which method is correct?


Best answer:

Let’s untangle your notation a bit and call the eigenvector that you’re using $v$. If your approach were correct, then it should hold for all $n$. In particular, $p_0=A^0v=\lambda^0v=v$, so this clearly can’t work for an arbitrary initial distribution $p_0$.

Second answer:

Your solution is incorrect because you restrict to a particular value for $p_0$. It is true that if $v$ is an eigenvector of $A$, then you have: $$A^n v=\lambda^n v$$ But you have to compute $A^n p_0$ instead, and you cannot choose $p_0$ because it is part of the problem statement. In other words, you cannot assume $p_0=v$, so the first line of your solution is false.
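To reconcile the two results numerically: instead of assuming $p_0$ is an eigenvector, expand $p_0$ in the eigenbasis. A NumPy sketch (the eigenvectors $(1,2)$ and $(1,-1)$ follow from $\lambda = 1$ and $\lambda = 0.4$ for the matrix in the question):

```python
import numpy as np

A = np.array([[0.6, 0.2],
              [0.4, 0.8]])
p0 = np.array([1.0, 0.0])

# Eigenpairs of A: lambda = 1 with v1 = (1, 2), lambda = 0.4 with v2 = (1, -1).
v1 = np.array([1.0, 2.0])
v2 = np.array([1.0, -1.0])

# Expand p0 in the eigenbasis: solve p0 = a*v1 + b*v2.
a, b = np.linalg.solve(np.column_stack([v1, v2]), p0)
print(a, b)  # a = 1/3, b = 2/3

# Then A^n p0 = a * 1^n * v1 + b * 0.4^n * v2, which tends to (1/3) * v1.
n = 50
pn = a * 1.0**n * v1 + b * 0.4**n * v2
print(pn)
```

The coefficient $a = 1/3$ on the $\lambda = 1$ eigenvector is exactly where the instructor's factor $(1/3)$ comes from: as $n \to \infty$, the $0.4^n$ term vanishes and $p_n \to \frac{1}{3}\langle 1, 2 \rangle$.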