How fast does a matrix expression go to zero?


With $0<a,b,c<1$, \begin{align} \begin{bmatrix} X_n \\ Y_n \end{bmatrix}=\begin{bmatrix} a^2 & (1-b)^2 \\ (1-a)^2 & b^2 \end{bmatrix}^n \begin{bmatrix} c^2 \\ (1-c)^2 \end{bmatrix} \end{align} How fast does $X_n+Y_n$ go to zero?

This is coming from a Markov chain problem which I solved up to here. We can see that $X_n\leq X_{n-1}$ and $Y_n\leq Y_{n-1}$. I am thinking of using the results of this, but it does not seem easy. Any idea how to continue?
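For intuition, a quick numerical check (using NumPy, with hypothetical sample values for $a,b,c$; any values in $(0,1)$ behave the same way) shows $X_n+Y_n$ shrinking geometrically:

```python
import numpy as np

# Hypothetical sample parameters; any 0 < a, b, c < 1 give the same behavior
a, b, c = 0.3, 0.6, 0.5

A = np.array([[a**2,       (1 - b)**2],
              [(1 - a)**2, b**2      ]])
v = np.array([c**2, (1 - c)**2])

# X_n + Y_n for n = 1..12: the sums decrease at a roughly geometric rate
sums = [(np.linalg.matrix_power(A, n) @ v).sum() for n in range(1, 13)]
print(sums)
```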


There are 2 answers below.

Accepted answer:

Call your $2\times2$ matrix $A$. Since $a,b\in(0,1)$, each column sum of $A$ is strictly smaller than $1$: the first column sums to $a^2+(1-a)^2<a+(1-a)=1$, and similarly $b^2+(1-b)^2<1$. Hence the induced $1$-norm satisfies $\|A\|_1<1$, and $\|A^n\|_1\le\|A\|_1^n$. Moreover, all entries of $A^n v$ are nonnegative, where $v=\begin{bmatrix}c^2\\(1-c)^2\end{bmatrix}$, so $X_n+Y_n=\|A^n v\|_1\le\|A\|_1^n\|v\|_1$. Hence $X_n+Y_n=O(\|A\|_1^n)$: the convergence to zero is at least geometric.
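As a sanity check, the norm bound $X_n+Y_n\le\|A\|_1^n\,\|v\|_1$ can be verified numerically (the parameter values below are hypothetical):

```python
import numpy as np

# Hypothetical sample values; the bound holds for any 0 < a, b, c < 1
a, b, c = 0.3, 0.6, 0.5
A = np.array([[a**2,       (1 - b)**2],
              [(1 - a)**2, b**2      ]])
v = np.array([c**2, (1 - c)**2])

# Induced 1-norm = maximum column sum; strictly below 1 here
norm1 = np.linalg.norm(A, 1)

# Entries are nonnegative, so X_n + Y_n is the vector 1-norm of A^n v,
# which is bounded by ||A||_1^n * ||v||_1 (small slack for float error)
bounds_hold = all(
    (np.linalg.matrix_power(A, n) @ v).sum() <= norm1**n * v.sum() + 1e-12
    for n in range(1, 21)
)
print(norm1, bounds_hold)
```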

Second answer:

I would use eigenvalues and eigenvectors: pairs $\lambda$ and $v$ satisfying $Av=\lambda v$, with $A$ your matrix. For this $2\times2$ matrix the characteristic polynomial has discriminant $(a^2-b^2)^2+4(1-a)^2(1-b)^2>0$, so both eigenvalues $\lambda_1,\lambda_2$ are real. Since $A$ has strictly positive entries and column sums below $1$, the dominant (Perron) eigenvalue satisfies $0<\lambda_1<1$.

With this, expanding the initial vector $\begin{bmatrix}c^2\\(1-c)^2\end{bmatrix}$ in the eigenbasis shows that $X_n+Y_n=O(\lambda_1^n)$, where $\lambda_1$ is the dominant eigenvalue; in fact $X_n+Y_n\sim C\lambda_1^n$ for some constant $C>0$, because the Perron eigenvector and the initial vector both have positive entries. So both entries, and therefore their sum, converge to zero geometrically at rate $\lambda_1$.
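Numerically, the ratio of successive sums $(X_n+Y_n)/(X_{n-1}+Y_{n-1})$ should approach the dominant eigenvalue of $A$; a small check with hypothetical sample parameters:

```python
import numpy as np

# Hypothetical sample parameters
a, b, c = 0.3, 0.6, 0.5
A = np.array([[a**2,       (1 - b)**2],
              [(1 - a)**2, b**2      ]])
v = np.array([c**2, (1 - c)**2])

# Dominant eigenvalue: real and in (0, 1) for this positive matrix
lam = max(abs(np.linalg.eigvals(A)))

# Successive ratios (X_n + Y_n) / (X_{n-1} + Y_{n-1}) converge to lam
sums = [(np.linalg.matrix_power(A, n) @ v).sum() for n in range(1, 40)]
ratio = sums[-1] / sums[-2]
print(ratio, lam)
```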