With $0<a,b,c<1$, \begin{align} \begin{bmatrix} X_n \\ Y_n \end{bmatrix}=\begin{bmatrix} a^2 & (1-b)^2 \\ (1-a)^2 & b^2 \end{bmatrix}^n \begin{bmatrix} c^2 \\ (1-c)^2 \end{bmatrix} \end{align} How fast does $X_n+Y_n$ go to zero?
This comes from a Markov chain problem that I have solved up to this point. We can see that $X_n\leq X_{n-1}$ and $Y_n\leq Y_{n-1}$. I am thinking of using these results, but it does not seem easy. Any ideas on how to continue?
Call your $2\times2$ matrix $A$. Since $a,b\in(0,1)$, both column sums of $A$ are strictly smaller than $1$: the first column sums to $a^2+(1-a)^2=1-2a(1-a)<1$, and the second to $(1-b)^2+b^2=1-2b(1-b)<1$. Hence the induced $1$-norm of $A$ (which equals the maximum column sum) satisfies $\|A\|_1<1$, and by submultiplicativity $\|A^n\|_1\le\|A\|_1^n$. Since all entries of $A^n\begin{bmatrix}c^2\\(1-c)^2\end{bmatrix}$ are nonnegative, $X_n+Y_n$ is exactly the vector $1$-norm of that product, so $$X_n+Y_n\le\|A^n\|_1\bigl(c^2+(1-c)^2\bigr)\le\|A\|_1^n\bigl(c^2+(1-c)^2\bigr)=O(\|A\|_1^n).$$ In particular, $X_n+Y_n\to0$ at least geometrically.
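As a sanity check, here is a small numerical experiment (the parameter values $a=0.3$, $b=0.6$, $c=0.5$ are arbitrary choices for illustration) verifying that $X_n+Y_n$ stays below the bound $\|A\|_1^n\,(c^2+(1-c)^2)$:

```python
import numpy as np

# Arbitrary illustrative parameters in (0, 1).
a, b, c = 0.3, 0.6, 0.5

A = np.array([[a**2,       (1 - b)**2],
              [(1 - a)**2, b**2      ]])
v = np.array([c**2, (1 - c)**2])

# Induced 1-norm = maximum column sum = max(1-2a(1-a), 1-2b(1-b)).
norm1 = np.linalg.norm(A, 1)

for n in (1, 5, 10, 20):
    x = np.linalg.matrix_power(A, n) @ v
    s = x.sum()                      # X_n + Y_n (entries are nonnegative)
    bound = norm1**n * v.sum()       # ||A||_1^n * (c^2 + (1-c)^2)
    assert s <= bound + 1e-12
    print(n, s, bound)
```

The printed values show the geometric decay; the exact rate is governed by the spectral radius of $A$, which is at most $\|A\|_1$.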