Consider the following recurrence relations for the sequences $a_n$ and $b_n$:
$ (\gamma_1+ \alpha_1 + n \gamma) a_n = \lambda_1 a_{n-1} + \alpha_2 b_n $
$ (\gamma_2+ \alpha_2 + n \gamma) b_n = \lambda_2 b_{n-1} + \alpha_1 a_n $
subject to the initial conditions $a_0 = k_1$ and $b_0 = k_2$.
I want to re-express the above system in matrix form, say
$ A_{n+1} = B A_n $,
where $A_n$ is the vector $(a_n, b_n)^T$ and $B$ is the coefficient matrix. Note that the coefficient matrices for the various $n$ commute with one another. Also, if possible, I would like to go further and split the right-hand side into a diagonal matrix plus a coefficient matrix.
Any help would be appreciated!
Edit:
I am looking for a matrix-based generalisation of the solution to the above system. That is, what if I had one more sequence, say $c_n$, and therefore the following system:
$ (\gamma_1+ \alpha_1 + \beta_1 + n \gamma) a_n = \lambda_1 a_{n-1} + \alpha_2 b_n + \alpha_3 c_n$
$ (\gamma_2+ \alpha_2 + \beta_2 + n \gamma) b_n = \lambda_2 b_{n-1} + \alpha_1 a_n + \beta_3 c_n$
$ (\gamma_3+ \alpha_3 + \beta_3 + n \gamma) c_n = \lambda_3 c_{n-1} + \beta_1 a_n + \beta_3 b_n$
It's an ugly mess, but I got through it:
$$\begin{align*}&(\gamma_1+\alpha_1+n\gamma)a_n=\lambda_1a_{n-1}+\alpha_2b_n \\&(\gamma_2+\alpha_2+n \gamma)b_n=\lambda_2b_{n-1}+\alpha_1a_n \\&\implies a_n=\frac{\lambda_1a_{n-1}+\alpha_2b_n}{\gamma_1+\alpha_1+n\gamma} \\&\implies b_n=\frac{\lambda_2b_{n-1}+\alpha_1a_n}{\gamma_2+\alpha_2+n\gamma} \\&\implies a_n(\gamma_1+\alpha_1+n\gamma)=\lambda_1a_{n-1}+\frac{\lambda_2\alpha_2b_{n-1}+\alpha_1\alpha_2a_n}{\gamma_2+\alpha_2+n\gamma} \\&\implies b_n(\gamma_2+\alpha_2+n \gamma)=\lambda_2b_{n-1}+\frac{\lambda_1\alpha_1a_{n-1}+\alpha_1\alpha_2b_n}{\gamma_1+\alpha_1+n\gamma}\end{align*}$$
For convenience, let:
$$\begin{align}\sigma&=\gamma_1+\alpha_1+n\gamma\\\tau&=\gamma_2+\alpha_2+n\gamma\\\omega&=\sigma\tau-\alpha_1\alpha_2\end{align}$$
Now, skipping an ugly in-between step (multiply the first equation through by $\tau$, the second by $\sigma$, and collect the $a_n$ and $b_n$ terms on the left),
$$a_n\cdot\omega=a_{n-1}(\lambda_1\cdot\tau)+b_{n-1}(\lambda_2\cdot\alpha_2) \\b_n\cdot\omega=a_{n-1}(\lambda_1\cdot\alpha_1)+b_{n-1}(\lambda_2\cdot\sigma)$$
And with a pretty sort of symmetry, we get:
$$\begin{bmatrix}a_n\\b_n\end{bmatrix}=\frac{1}{\omega}\begin{bmatrix}\lambda_1\cdot\tau&\lambda_2\cdot\alpha_2\\\lambda_1\cdot\alpha_1&\lambda_2\cdot\sigma\end{bmatrix}\begin{bmatrix}a_{n-1}\\b_{n-1}\end{bmatrix}$$
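As a quick sanity check of this matrix form (a numerical sketch with arbitrarily chosen parameter values, not part of the derivation), one can iterate it and confirm that each step satisfies the original pair of recurrences:

```python
import numpy as np

# Arbitrary test parameters (hypothetical values, just for the check)
g1, g2, a1, a2, l1, l2, g = 1.3, 0.7, 0.4, 0.9, 1.1, 0.8, 0.5
k1, k2 = 2.0, 3.0  # initial conditions a_0 = k_1, b_0 = k_2

v = np.array([k1, k2])  # (a_{n-1}, b_{n-1}), starting at n-1 = 0
for n in range(1, 6):
    sigma = g1 + a1 + n * g
    tau   = g2 + a2 + n * g
    omega = sigma * tau - a1 * a2
    B = np.array([[l1 * tau, l2 * a2],
                  [l1 * a1, l2 * sigma]]) / omega
    w = B @ v  # (a_n, b_n)
    # verify against the original implicit recurrences
    assert np.isclose(sigma * w[0], l1 * v[0] + a2 * w[1])
    assert np.isclose(tau   * w[1], l2 * v[1] + a1 * w[0])
    v = w
```

The assertions pass for any parameter choice with $\omega \neq 0$, since the matrix form is an exact rearrangement of the original system.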
For factorisation, this is the best I can do:
Let $\sigma'=\gamma_1+\alpha_1$ and $\tau'=\gamma_2+\alpha_2$, so that $\sigma=\sigma'+n\gamma$ and $\tau=\tau'+n\gamma$:
$$\begin{bmatrix}a_n\\b_n\end{bmatrix}=\frac{1}{\omega}\left(\begin{bmatrix}\lambda_1\cdot\tau'&\lambda_2\cdot\alpha_2\\\lambda_1\cdot\alpha_1&\lambda_2\cdot\sigma'\end{bmatrix}+n\gamma\begin{bmatrix}\lambda_1&0\\0&\lambda_2\end{bmatrix}\right)\begin{bmatrix}a_{n-1}\\b_{n-1}\end{bmatrix}$$
And by the distributivity of matrix multiplication:
$$\begin{bmatrix}a_n\\b_n\end{bmatrix}=\frac{1}{\omega}\begin{bmatrix}\lambda_1\cdot\tau'&\lambda_2\cdot\alpha_2\\\lambda_1\cdot\alpha_1&\lambda_2\cdot\sigma'\end{bmatrix}\begin{bmatrix}a_{n-1}\\b_{n-1}\end{bmatrix}+\frac{n\gamma}{\omega}\begin{bmatrix}\lambda_1&0\\0&\lambda_2\end{bmatrix}\begin{bmatrix}a_{n-1}\\b_{n-1}\end{bmatrix} $$
And if you're running a computer analysis, or anything else that requires extrapolating future $a_n$, this form may help: the $1/\omega$ factor is easily computed and collected as a function of $n$, the left matrix is constant, and the right matrix depends on $n$ only through the trivial scalar factor $n\gamma$.
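For the three-sequence system in the edit (and indeed any number of sequences), the same trick works without redoing the elimination by hand: move the implicit cross terms to the left to get $(D_n - C)\,x_n = \Lambda\, x_{n-1}$, where $D_n$ is the diagonal of left-hand coefficients, $C$ holds the cross-couplings, and $\Lambda = \operatorname{diag}(\lambda_i)$; then $x_n = (D_n - C)^{-1}\Lambda\, x_{n-1}$. (For two sequences, $(D_n - C)^{-1}\Lambda$ is exactly the matrix above, with $\omega = \det(D_n - C)$.) A numerical sketch with made-up parameter values:

```python
import numpy as np

# Hypothetical parameters for the three-sequence system
gam = np.array([1.2, 0.6, 0.9])   # gamma_1..gamma_3
lam = np.array([1.1, 0.8, 0.7])   # lambda_1..lambda_3
g   = 0.5                          # the shared gamma
# C[i, j] = coupling of x_j on the right-hand side of equation i
# (row 0: alpha_2, alpha_3; row 1: alpha_1, beta_3; row 2: beta_1, beta_2)
C = np.array([[0.0, 0.9, 0.3],
              [0.4, 0.0, 0.2],
              [0.6, 0.5, 0.0]])
# left-hand coefficient of x_i is gamma_i + (couplings of x_i elsewhere) + n*g
d0 = gam + C.sum(axis=0)
Lam = np.diag(lam)

x = np.array([2.0, 3.0, 1.0])      # (a_0, b_0, c_0)
for n in range(1, 6):
    Dn = np.diag(d0 + n * g)
    Bn = np.linalg.solve(Dn - C, Lam)   # (D_n - C)^{-1} Lambda
    x_new = Bn @ x
    # each scalar equation: (d_i + n*g) x_i = lambda_i x_i_prev + sum_j C[i,j] x_j
    assert np.allclose(np.diag(Dn) * x_new, lam * x + C @ x_new)
    x = x_new
```

Here `np.linalg.solve` plays the role of the explicit $1/\omega$ and cofactors in the $2\times 2$ case; it only requires $D_n - C$ to be invertible at each $n$.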