Help with eigenvectors and eigenvalues


I am using numpy and also working the exercise by hand, and I am getting totally different answers from each.

The matrix in question is $\begin{bmatrix}0.6 & 0.9\\0.4 & 0.1\end{bmatrix}$

Could someone take me through the steps? I cannot seem to reach the answer of eigenvalues $(1, -0.3)$. My sheet assumes the matrix is a Markov matrix and therefore uses the trace together with the fact that $\lambda_1$ will always be $1$, yet I can't get there through the characteristic equation.

Any help would be appreciated.


There are 3 best solutions below


To find the eigenvalues of a matrix $M$, solve the characteristic equation $\det(M - \lambda I) = 0$ for $\lambda$.

$$\det\left(\begin{bmatrix}0.6 & 0.9\\0.4 & 0.1\end{bmatrix} - \begin{bmatrix} \lambda & 0\\0 & \lambda \end{bmatrix}\right)=\det\left(\begin{bmatrix} 0.6-\lambda & 0.9\\0.4 & 0.1-\lambda \end{bmatrix}\right)=0$$
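A quick symbolic check of this determinant equation (a sketch using SymPy, which is not part of the original answer; exact rationals avoid floating-point noise):

```python
import sympy as sp

# Characteristic equation det(M - lambda*I) = 0, solved symbolically
lam = sp.symbols('lam')
M = sp.Matrix([[sp.Rational(6, 10), sp.Rational(9, 10)],
               [sp.Rational(4, 10), sp.Rational(1, 10)]])
char_poly = (M - lam * sp.eye(2)).det()   # lam**2 - 7*lam/10 - 3/10
roots = sp.solve(sp.Eq(char_poly, 0), lam)
print(roots)
```

The two roots are exactly $1$ and $-3/10$, matching the expected eigenvalues.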


Computing this in Python:

import numpy as np
from numpy import linalg as LA

# np.matrix is deprecated; use a regular 2-D array instead
A = np.array([[0.6, 0.9], [0.4, 0.1]])
w, v = LA.eig(A)

v
Out[5]: 
array([[ 0.91381155, -0.70710678],
       [ 0.40613847,  0.70710678]])

w
Out[6]: array([ 1. , -0.3])
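As a sanity check on the NumPy result, each column of `v` should satisfy $Av_i = \lambda_i v_i$; a minimal self-contained sketch:

```python
import numpy as np

A = np.array([[0.6, 0.9], [0.4, 0.1]])
w, v = np.linalg.eig(A)

# Column i of v is the eigenvector for eigenvalue w[i]:
# A @ v applies A to each column, and v * w scales column i by w[i]
print(np.allclose(A @ v, v * w))  # True
```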

Computing this by hand:

$$\det(A-\lambda I) = \begin{vmatrix} 0.6 -\lambda & 0.9 \\ 0.4 & 0.1 -\lambda \end{vmatrix} = (0.6-\lambda)(0.1-\lambda) - (0.9)(0.4) = 0 $$

$$ \lambda^{2} - 0.7\lambda - 0.3 = 0 $$ $$ (\lambda - 1)(\lambda + 0.3) = 0 $$ $$ \lambda_{1} = 1, \quad \lambda_{2} = -0.3 $$
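The roots of the hand-derived quadratic can be double-checked numerically (a sketch, not part of the original answer):

```python
import numpy as np

# Coefficients of lambda^2 - 0.7*lambda - 0.3 from the hand computation
coeffs = [1.0, -0.7, -0.3]
roots = np.roots(coeffs)
print(sorted(roots))  # approximately [-0.3, 1.0]
```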


Because you haven't shown the source of your difficulty, I will show you results from R for the eigenvalues and eigenvectors of your matrix $\mathbf{A}.$

A = matrix(c(.6, .9,  .4, .1), nrow=2, byrow=T);  A
     [,1] [,2]
[1,]  0.6  0.9
[2,]  0.4  0.1
eigen(A)

$values
[1]  1.0 -0.3

$vectors                      # read columns
          [,1]       [,2]
[1,] 0.9138115 -0.7071068
[2,] 0.4061385  0.7071068

Your matrix $\mathbf{A}$ is not a Markov transition matrix in the traditional notation, in which rows are conditional distributions and must sum to $1.$ Recently I have seen a few texts in which transition matrices are written as the transpose of the classical ones; then it is the columns of the transition matrix that must sum to unity. If you need help matching your matrix or its eigendecomposition to a Markov chain, please clarify the meaning of the matrix.
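The row and column sums can be checked directly (a NumPy sketch mirroring the R session):

```python
import numpy as np

A = np.array([[0.6, 0.9], [0.4, 0.1]])

# Columns sum to 1 -> column-stochastic (the transposed convention)
print(A.sum(axis=0))  # [1.  1. ]
# Rows sum to 1.5 and 0.5 -> not row-stochastic
print(A.sum(axis=1))  # [1.5 0.5]
```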

In particular, if your 2-state Markov chain has $$P(X_{i+1} = 2 | X_i = 1) = 0.4 \;\; \text{and}\;\; P(X_{i+1} = 1 | X_i = 2) = 0.9,$$ then you can find the long-run distribution $\mathbf{\lambda}$ of this ergodic chain with the following additional computation:

vec = eigen(A)$vectors[,1]
lam = vec/sum(vec); lam
[1] 0.6923077 0.3076923

lam %*% t(A)                # '%*%' denotes matrix multiplication
          [,1]      [,2]
[1,] 0.6923077 0.3076923

The first eigenvector, which corresponds to the eigenvalue $1$ (largest modulus), is proportional to $\lambda.$ So upon norming, we get $\lambda\mathbf{A^T} = \lambda,$ where $\lambda = (0.6923, 0.3077).$
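The same long-run-distribution computation can be sketched in NumPy (mirroring the R code; the normalization step is identical):

```python
import numpy as np

A = np.array([[0.6, 0.9], [0.4, 0.1]])   # columns sum to 1
w, v = np.linalg.eig(A)

i = np.argmax(w.real)                 # index of eigenvalue 1
pi = v[:, i] / v[:, i].sum()          # normalize entries to sum to 1
print(pi)                             # approximately [0.6923, 0.3077]

# Stationarity in the column convention: A @ pi == pi
print(np.allclose(A @ pi, pi))        # True
```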