Solving a system of recurrence equations based on a Markov process


I'm having trouble solving a system of recurrence equations, which happens to represent a Markov process. Its matrix of coefficients is {(0,0.5,1),(1,0,0),(0,0.5,0)}, and when I calculated its eigenvalues I got one real value and two complex ones. I'm using Mathematics for Economists as my main book and have no clue what to do when this happens (I looked for help in Boyce and DiPrima, but found nothing). I tried to calculate the eigenvectors for the complex eigenvalues but failed, despite getting the real eigenvalue right. Do I need to consider my real roots when calculating the eigenvectors for the complex eigenvalues, since I have more than two roots? Is there a book with several worked examples of systems of recurrence equations? I found several for ODEs, but nothing for recurrence equations. Thank you!


To answer your specific question, the only thing that’s different about complex eigenvalues of a real matrix is that they won’t have real eigenvectors (a real eigenvector would force its eigenvalue to be real). Otherwise, you compute those eigenvectors in exactly the same way that you would for a real eigenvalue. In fact, you get to save a bit of work: being roots of a polynomial with real coefficients, complex eigenvalues always come in conjugate pairs, and their respective eigenvectors also come in conjugate pairs. Just as a real eigenvector represents a line that is mapped to itself by the linear transformation via a scaling, the eigenvectors of a complex conjugate pair of eigenvalues represent a plane that is mapped to itself by the linear transformation via a scaled rotation. In terms of Markov processes, complex eigenvalues indicate the presence of a periodicity of some sort.
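
As a quick numerical check (a NumPy sketch, not part of the derivation) that the eigenvalues of this matrix are one real value plus a conjugate pair:

```python
import numpy as np

# Matrix of coefficients from the question
P = np.array([[0, 0.5, 1],
              [1, 0.0, 0],
              [0, 0.5, 0]])

# One real eigenvalue plus a complex-conjugate pair
w = np.sort_complex(np.linalg.eigvals(P))
print(w)  # 1 and -1/2(1 ± i), i.e. -0.5 ± 0.5j
```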

The eigenvalues of your matrix, which you’ve already computed, are $1$ and $-\frac12(1\pm i)$. The eigenvectors of $-\frac12(1+i)$ are the null space of the matrix $$\begin{bmatrix}\frac12(1+i)&\frac12&1\\1&\frac12(1+i)&0\\0&\frac12&\frac12(1+i)\end{bmatrix},$$ which row-reduces to $$\begin{bmatrix}1&0&-i\\0&1&1+i\\0&0&0\end{bmatrix},$$ from which we can read that the null space is spanned by $[-i,1+i,-1]^T$. This also tells us, without any further work, that the eigenspace of $-\frac12(1-i)$ is the span of $[i,1-i,-1]^T$, its complex conjugate. We also have the span of $[2,2,1]^T$ as the eigenspace of $1$, so if you’re trying to compute powers of the transition matrix $P$, you now have $$P = \begin{bmatrix}2&-i&i \\ 2&1+i&1-i \\ 1&-1&-1\end{bmatrix} \begin{bmatrix} 1&0&0 \\ 0&-\frac12(1+i)&0 \\ 0&0&-\frac12(1-i) \end{bmatrix} \begin{bmatrix}2&-i&i \\ 2&1+i&1-i \\ 1&-1&-1\end{bmatrix}^{-1},$$ and therefore $$P^n = \begin{bmatrix}2&-i&i \\ 2&1+i&1-i \\ 1&-1&-1\end{bmatrix} \begin{bmatrix} 1&0&0 \\ 0&\left(-\frac12(1+i)\right)^n&0 \\ 0&0&\left(-\frac12(1-i)\right)^n \end{bmatrix} \begin{bmatrix}2&-i&i \\ 2&1+i&1-i \\ 1&-1&-1\end{bmatrix}^{-1},$$ which expands into some rather complicated expressions that involve complex numbers but nevertheless evaluate to real numbers. Of course, if all you’re after is the steady-state distribution, you just have to normalize an eigenvector of $1$: $\frac15[2,2,1]^T=\left[\frac25,\frac25,\frac15\right]^T$.
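
The diagonalization can be sanity-checked numerically (a NumPy sketch; $V$ below collects the three eigenvectors as columns, in the order $1$, $-\frac12(1+i)$, $-\frac12(1-i)$):

```python
import numpy as np

P = np.array([[0, 0.5, 1],
              [1, 0.0, 0],
              [0, 0.5, 0]], dtype=complex)

# Eigenvectors as columns: [2,2,1], [-i,1+i,-1], and its conjugate
V = np.array([[2, -1j,    1j],
              [2, 1 + 1j, 1 - 1j],
              [1, -1,     -1]], dtype=complex)
D = np.diag([1, -0.5 * (1 + 1j), -0.5 * (1 - 1j)])

# Check P = V D V^{-1}
assert np.allclose(V @ D @ np.linalg.inv(V), P)

# P^n via the diagonalization: complex arithmetic, real result
n = 10
Pn = V @ np.diag(np.diag(D) ** n) @ np.linalg.inv(V)
print(np.real_if_close(Pn))

# Steady-state distribution: normalized eigenvector of eigenvalue 1
pi = np.array([2, 2, 1]) / 5
assert np.allclose(P.real @ pi, pi)
```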

You don’t necessarily have to compute eigenvectors to find powers of $P$, though. A consequence of the Cayley-Hamilton theorem is that any analytic function $f$ of an $n\times n$ matrix $P$ can be written as a polynomial in $P$ of degree at most $n-1$. Moreover, if $\lambda$ is an eigenvalue of $P$, then $f(\lambda)$ is an eigenvalue of $f(P)$. In particular, for a $3\times 3$ matrix $P$, $P^n=a_0I+a_1P+a_2P^2$, where the coefficients depend on $n$. These coefficients can be computed by solving the system of linear equations $a_0+a_1\lambda_i+a_2\lambda_i^2=\lambda_i^n$, where the $\lambda_i$ are the eigenvalues of $P$. (If there are repeated eigenvalues, these equations aren’t independent and have to be modified slightly.) For your particular matrix, we have $$a_0I+a_1P+a_2P^2 = \begin{bmatrix} a_0+\frac12a_2 & \frac12(a_1+a_2) & a_1 \\ a_1 & a_0+\frac12a_2 & a_2 \\ \frac12a_2 & \frac12 a_1 & a_0 \end{bmatrix}.$$ For powers of $P$, the coefficients are the solutions to the system $$\begin{align} a_0+a_1+a_2 &= 1 \\ a_0-\left(\frac12-\frac i2\right)a_1-\frac i2 a_2 &= \left(-\frac12+\frac i2\right)^n \\ a_0-\left(\frac12+\frac i2\right)a_1+\frac i2 a_2 &= \left(-\frac12-\frac i2\right)^n. \end{align}$$ I’m not sure that this really saves you any work over diagonalization, but it can be a handy way to compute specific powers of $P$. The powers of the eigenvalues on the right-hand sides of these equations are particularly easy to compute if you convert them into polar form.
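
The Cayley-Hamilton route can likewise be checked numerically (a NumPy sketch): the system $a_0+a_1\lambda_i+a_2\lambda_i^2=\lambda_i^n$ is a Vandermonde system in the eigenvalues, and the resulting $a_0I+a_1P+a_2P^2$ should match a direct matrix power.

```python
import numpy as np

P = np.array([[0, 0.5, 1],
              [1, 0.0, 0],
              [0, 0.5, 0]])
lams = np.array([1, -0.5 + 0.5j, -0.5 - 0.5j])

n = 7
# Vandermonde system: columns are 1, lam, lam^2; right-hand side is lam^n
A = np.vander(lams, 3, increasing=True)
a = np.linalg.solve(A, lams ** n)

# P^n as a quadratic polynomial in P
Pn = a[0] * np.eye(3) + a[1] * P + a[2] * (P @ P)
print(np.real_if_close(Pn))
```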