I'm having trouble solving a system of recurrence equations that represents a Markov process. Its coefficient matrix is {(0,0.5,1),(1,0,0),(0,0.5,0)}, and when I computed its eigenvalues I got one real value and two complex ones. I'm using Mathematics for Economists as my main book and have no clue what to do when this happens (I looked for help in Boyce and DiPrima, but found nothing). I tried to compute the eigenvectors for the complex eigenvalues but failed, even though I got the real eigenvalue right. Do I need to take the real root into account when computing the eigenvectors for the complex eigenvalues, since I have more than two roots? Also, is there a book with several worked examples of systems of recurrence equations? I found several for ODEs, but nothing for recurrence equations. Thank you!
Solving a system of recurrence equations based on a Markov process
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) on 2026-03-26 · 88 views · 1 answer
To answer your specific question, the only thing that's different about complex eigenvalues of a real matrix is that their eigenvectors won't be real. Otherwise, you compute those eigenvectors in exactly the same way that you would for a real eigenvalue. In fact, you get to save a bit of work: being roots of a polynomial with real coefficients, complex eigenvalues always come in conjugate pairs, and their respective eigenvectors also come in conjugate pairs. Just as a real eigenvector represents a line that is mapped to itself by the linear transformation via a scaling, the eigenvectors of a complex conjugate pair of eigenvalues represent a plane that is mapped to itself via a scaled rotation. In terms of Markov processes, complex eigenvalues indicate the presence of a periodicity of some sort.
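You can see this numerically. The following sketch (using NumPy, not part of the original question) computes the eigenvalues of your matrix the same way regardless of whether they turn out real or complex:

```python
import numpy as np

# Transition matrix from the question.
P = np.array([[0.0, 0.5, 1.0],
              [1.0, 0.0, 0.0],
              [0.0, 0.5, 0.0]])

# eig handles real and complex eigenvalues uniformly; for a real matrix,
# any complex eigenvalues appear as conjugate pairs.
eigvals, eigvecs = np.linalg.eig(P)
print(eigvals)  # eigenvalue 1 plus the conjugate pair -(1 ± i)/2
```

The columns of `eigvecs` are the corresponding eigenvectors, and the columns belonging to the conjugate pair are themselves complex conjugates (up to scaling).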
The eigenvalues of your matrix, which you've already computed, are $1$ and $-\frac12(1\pm i)$. The eigenvectors of $-\frac12(1+i)$ span the null space of the matrix $$\begin{bmatrix}\frac12(1+i)&\frac12&1\\1&\frac12(1+i)&0\\0&\frac12&\frac12(1+i)\end{bmatrix},$$ which row-reduces to $$\begin{bmatrix}1&0&-i\\0&1&1+i\\0&0&0\end{bmatrix},$$ from which we can read off that the null space is spanned by $[-i,1+i,-1]^T$. This also tells us, without any further work, that the eigenspace of $-\frac12(1-i)$ is the span of its complex conjugate, $[i,1-i,-1]^T$. We also have the span of $[2,2,1]^T$ as the eigenspace of $1$, so if you're trying to compute powers of the transition matrix $P$, you now have $$P = \begin{bmatrix}2&-i&i \\ 2&1+i&1-i \\ 1&-1&-1\end{bmatrix} \begin{bmatrix} 1&0&0 \\ 0&-\frac12(1+i)&0 \\ 0&0&-\frac12(1-i) \end{bmatrix} \begin{bmatrix}2&-i&i \\ 2&1+i&1-i \\ 1&-1&-1\end{bmatrix}^{-1},$$ and therefore $$P^n = \begin{bmatrix}2&-i&i \\ 2&1+i&1-i \\ 1&-1&-1\end{bmatrix} \begin{bmatrix} 1&0&0 \\ 0&-\frac1{2^n}(1+i)^n&0 \\ 0&0&-\frac1{2^n}(1-i)^n \end{bmatrix} \begin{bmatrix}2&-i&i \\ 2&1+i&1-i \\ 1&-1&-1\end{bmatrix}^{-1},$$ which expands into some rather complicated expressions involving complex numbers that nevertheless evaluate to real numbers. Of course, if all you're after is the steady-state distribution, you just have to normalize an eigenvector of $1$: $\frac15[2,2,1]^T=\left[\frac25,\frac25,\frac15\right]^T$.
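As a quick sanity check of the diagonalization (a NumPy sketch, not part of the original answer), build the eigenvector matrix from the columns $[2,2,1]^T$, $[-i,1+i,-1]^T$ and $[i,1-i,-1]^T$ and verify that $P^n = VD^nV^{-1}$ is real:

```python
import numpy as np

# Transition matrix from the question.
P = np.array([[0.0, 0.5, 1.0],
              [1.0, 0.0, 0.0],
              [0.0, 0.5, 0.0]])

# Columns: eigenvectors of 1, -(1+i)/2 and -(1-i)/2 respectively.
V = np.array([[2, -1j,   1j],
              [2, 1+1j, 1-1j],
              [1, -1,   -1]])
lams = np.array([1, -(1+1j)/2, -(1-1j)/2])

# P = V D V^{-1}, so P^n = V D^n V^{-1}.  The imaginary parts cancel,
# leaving a real matrix of transition probabilities.
n = 10
Pn = (V @ np.diag(lams**n) @ np.linalg.inv(V)).real

# Steady state: the normalized eigenvector of eigenvalue 1.
pi = np.array([2, 2, 1]) / 5
```

Comparing `Pn` against `np.linalg.matrix_power(P, n)` confirms that the complex arithmetic collapses to real entries, and `P @ pi` returns `pi` unchanged, as a steady state should.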
You don’t necessarily have to compute eigenvectors to find powers of $P$, though. A consequence of the Cayley-Hamilton theorem is that any analytic function $f$ of an $n\times n$ matrix $P$ can be written as a polynomial in $P$ of degree at most $n-1$. Moreover, if $\lambda$ is an eigenvalue of $P$, then $f(\lambda)$ is an eigenvalue of $f(P)$. In particular, for a $3\times 3$ matrix $P$, $P^n=a_0I+a_1P+a_2P^2$, where the coefficients depend on $n$. These coefficients can be computed by solving the system of linear equations $a_0+a_1\lambda_i+a_2\lambda_i^2=\lambda_i^n$, where the $\lambda_i$ are the eigenvalues of $P$. (If there are repeated eigenvalues, these equations aren’t independent and have to be modified slightly.) For your particular matrix, we have $$a_0I+a_1P+a_2P^2 = \begin{bmatrix} a_0+\frac12a_2 & \frac12(a_1+a_2) & a_1 \\ a_1 & a_0+\frac12a_2 & a_2 \\ \frac12a_2 & \frac12 a_1 & a_0 \end{bmatrix}.$$ For powers of $P$, the coefficients are the solutions to the system $$\begin{align} a_0+a_1+a_2 &= 1 \\ a_0-\left(\frac12-\frac i2\right)a_1-\frac i2 a_2 &= \left(-\frac12+\frac i2\right)^n \\ a_0-\left(\frac12+\frac i2\right)a_1+\frac i2 a_2 &= \left(-\frac12-\frac i2\right)^n. \end{align}$$ I’m not sure that this really saves you any work over diagonalization, but it can be a handy way to compute specific powers of $P$. The powers of the eigenvalues on the right-hand sides of these equations are particularly easy to compute if you convert them into polar form.
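The Cayley-Hamilton route can also be checked numerically. The system above is a Vandermonde system in the eigenvalues, so a minimal sketch (again NumPy, not part of the original answer) is:

```python
import numpy as np

# Transition matrix from the question and its eigenvalues.
P = np.array([[0.0, 0.5, 1.0],
              [1.0, 0.0, 0.0],
              [0.0, 0.5, 0.0]])
eigs = np.array([1, -0.5 + 0.5j, -0.5 - 0.5j])

n = 10

# Solve a0 + a1*lam + a2*lam^2 = lam^n, one equation per eigenvalue.
# vander(..., increasing=True) builds rows [1, lam, lam^2].
A = np.vander(eigs, 3, increasing=True)
a = np.linalg.solve(A, eigs**n)

# Cayley-Hamilton: P^n is a quadratic polynomial in P.
Pn = a[0]*np.eye(3) + a[1]*P + a[2]*(P @ P)
```

As with diagonalization, the coefficients `a` are complex, but the resulting `Pn` has (numerically) zero imaginary part and matches `np.linalg.matrix_power(P, n)`.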