Is there a formula for expanding powers of $2\times 2$ matrices? I know this can be done by diagonalizing the matrix and using its spectral decomposition, but is there a general formula for $A^n$ in terms of $a,b,c$ and $d$, where $$A=\pmatrix{a&b\\c&d}?$$ In the problem I am working on, $A$ is a stochastic matrix, i.e. all entries are non-negative and each row sums to $1$. I know such a formula exists for stochastic matrices and can be proved by induction, but what I really don't understand is the intuition for guessing that general formula.
Please tell me
1) Whether there is a general formula for powers of all $2\times2$ matrices, and
2) Intuition for how to guess the general formula for powers in the special case of a stochastic matrix.
EDIT
As pointed out by zwim in a comment, there is a formula for powers of general $2\times 2$ matrices, but it is rather cumbersome. From the link he provides, we can see the general formula in terms of the eigenvalues $\alpha$ and $\beta$ of the matrix $A$: $$A^n = \begin{cases}\alpha^n\left(\dfrac{A-\beta I}{\alpha-\beta}\right)+\beta^n\left(\dfrac{A-\alpha I}{\beta-\alpha}\right),&\text{ if }\alpha\ne\beta\\\alpha^{n-1}(nA-(n-1)\alpha I), &\text{ if }\alpha=\beta\end{cases}$$ I proved this by induction. So the new question is: can you please provide the intuition behind this guess for the induction?
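As a sanity check (not part of any proof), the $\alpha\ne\beta$ case of the formula can be verified numerically. The matrix below is a hypothetical $2\times2$ stochastic matrix chosen purely for illustration; its eigenvalues are $1$ and $0.5$:

```python
import numpy as np

# Hypothetical 2x2 stochastic matrix (rows are non-negative and sum to 1)
A = np.array([[0.9, 0.1],
              [0.4, 0.6]])
alpha, beta = np.linalg.eigvals(A)  # distinct eigenvalues: 1 and 0.5
I = np.eye(2)
n = 5

# Closed-form expression for A^n when alpha != beta
An = alpha**n * (A - beta*I) / (alpha - beta) \
   + beta**n * (A - alpha*I) / (beta - alpha)

print(np.allclose(An, np.linalg.matrix_power(A, n)))  # True
```

The formula is symmetric in $\alpha$ and $\beta$, so the order in which `eigvals` returns them does not matter.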
A proof of this formula that does not use induction is also welcome.
The result for the first case (namely, $\alpha\ne\beta$) can be viewed as a direct consequence of Lagrange interpolation, but a unified approach that handles both the case $\alpha\ne\beta$ and the case $\alpha=\beta$ is to divide $x^n$ by the characteristic polynomial $(x-\alpha)(x-\beta)$ of $A$ and then apply the factor theorem.
If we divide $x^n$ by $(x-\alpha)(x-\beta)$, the remainder is a constant or linear polynomial. That is, $$ x^n=(x-\alpha)(x-\beta)q(x)+r(x)\tag{1} $$ for some polynomial $q$ and a remainder $r(x)=ax+b$. By the Cayley–Hamilton theorem, $(A-\alpha I)(A-\beta I)=0$. Therefore $(1)$ gives $A^n=r(A)=aA+bI$, and the problem reduces to finding $a$ and $b$.
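This division argument can be checked symbolically. Here sympy divides $x^n$ by the characteristic polynomial and confirms that $A^n$ equals the remainder evaluated at $A$; the matrix entries below are arbitrary, chosen only for illustration:

```python
import sympy as sp

x = sp.symbols('x')
A = sp.Matrix([[2, 1], [1, 2]])   # arbitrary 2x2 example (eigenvalues 3 and 1)
n = 6

p = A.charpoly(x).as_expr()       # characteristic polynomial (x - alpha)(x - beta)
q, r = sp.div(x**n, p, x)         # x^n = p(x) q(x) + r(x), with deg r <= 1

# Cayley-Hamilton: p(A) = 0, so A^n = r(A) = a*A + b*I
a_coef = r.coeff(x, 1)
b_coef = r.coeff(x, 0)
An = a_coef * A + b_coef * sp.eye(2)

print(An == A**n)  # True
```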
When $\alpha\ne\beta$, substituting $x=\alpha$ and $x=\beta$ into $(1)$ gives \begin{cases} \alpha^n=r(\alpha)=a\alpha+b,\\ \beta^n=r(\beta)=a\beta+b. \end{cases}\tag{2} Solving these for $a$ and $b$ yields the desired result.
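Explicitly, subtracting the two equations and back-substituting gives
$$a=\frac{\alpha^n-\beta^n}{\alpha-\beta},\qquad b=\alpha^n-a\alpha=\frac{\alpha\beta^n-\beta\alpha^n}{\alpha-\beta},$$
so that
$$A^n=aA+bI=\alpha^n\left(\frac{A-\beta I}{\alpha-\beta}\right)+\beta^n\left(\frac{A-\alpha I}{\beta-\alpha}\right),$$
which is exactly the first case of the formula.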
When $\alpha=\beta$, $(1)$ reads $x^n=(x-\alpha)^2q(x)+r(x)$, so differentiating gives $nx^{n-1}=2(x-\alpha)q(x)+(x-\alpha)^2q'(x)+r'(x)$. Setting $x=\alpha$ yields $a$ immediately: $$ n\alpha^{n-1}=r'(\alpha)=a.\tag{3} $$ Substituting $x=\alpha$ into $(1)$ gives $\alpha^n=a\alpha+b$, which together with $(3)$ yields $b=(1-n)\alpha^n$.
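The repeated-eigenvalue case can likewise be checked numerically on a hypothetical defective matrix (a single Jordan block), where diagonalization would fail but the formula still applies:

```python
import numpy as np

# Hypothetical defective matrix with repeated eigenvalue alpha = 2
alpha = 2.0
A = np.array([[alpha, 1.0],
              [0.0,   alpha]])
I = np.eye(2)
n = 7

# Repeated-eigenvalue formula: A^n = alpha^(n-1) * (n*A - (n-1)*alpha*I)
An = alpha**(n - 1) * (n*A - (n - 1)*alpha*I)

print(np.allclose(An, np.linalg.matrix_power(A, n)))  # True
```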