I found an MIT video here (the relevant part runs from 5:18 to 9:52). The problem is to find $p_n$ given $A$ and $p_0$:
$$A = \begin{bmatrix} 0.6 & 0.2 \\ 0.4 & 0.8 \\ \end{bmatrix}$$
$$p_0 = \begin{bmatrix} 1 \\ 0 \end{bmatrix}$$
1) The instructor in the video solves it by diagonalizing $A$:
$$p_n = A^n p_0 = PD^nP^{-1} p_0$$
2) I would do it like this:
$p_n = A^n \cdot p_0 = \lambda^n \cdot p_k$
Here $p_k$ is the eigenvector corresponding to the eigenvalue $\lambda$; I would take $\lambda = 1$, whose eigenvector is $\langle 1, 2\rangle$.
So $p_n = 1^n \cdot \langle 1, 2\rangle = \langle 1, 2\rangle$.
But the solution he gave was $(1/3) \cdot \langle 1, 2\rangle$, so apparently our answers differ.
So which method is correct?
Let’s untangle your notation a bit and call the eigenvector you’re using $v$. If your formula $p_n = \lambda^n v$ were correct, it would have to hold for all $n$; in particular, at $n = 0$ it gives $p_0 = \lambda^0 v = v$, which is false here ($p_0 = \langle 1, 0\rangle \ne \langle 1, 2\rangle$) and clearly can’t hold for an arbitrary initial vector $p_0$. The identity $A^n v = \lambda^n v$ applies only when the vector you multiply is itself an eigenvector, so to use it you must first expand $p_0$ in the eigenbasis.
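For completeness, here is the eigenvector method applied correctly. The eigenvalues of $A$ are $\lambda_1 = 1$ and $\lambda_2 = 0.4$, with eigenvectors $v_1 = \langle 1, 2\rangle$ and $v_2 = \langle 1, -1\rangle$. Writing $p_0 = c_1 v_1 + c_2 v_2$ and solving gives $c_1 = 1/3$, $c_2 = 2/3$, so

$$p_n = A^n p_0 = \frac{1}{3}\cdot 1^n \begin{bmatrix}1\\2\end{bmatrix} + \frac{2}{3}\cdot(0.4)^n\begin{bmatrix}1\\-1\end{bmatrix} \;\longrightarrow\; \frac{1}{3}\begin{bmatrix}1\\2\end{bmatrix} \quad\text{as } n\to\infty.$$

That is exactly where the instructor's factor of $1/3$ comes from. A quick numerical sanity check (a sketch in plain Python, using nothing beyond the $A$ and $p_0$ from the question):

```python
# Iterate p_{n+1} = A p_n by hand and compare with the limit (1/3)<1, 2>.
A = [[0.6, 0.2],
     [0.4, 0.8]]
p = [1.0, 0.0]  # p_0

for _ in range(50):  # 0.4^50 is ~1e-20, so the decaying term is long gone
    p = [A[0][0] * p[0] + A[0][1] * p[1],
         A[1][0] * p[0] + A[1][1] * p[1]]

print(p)  # approximately [1/3, 2/3], i.e. (1/3)*<1, 2>
```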