Inner product space and Reversible Markov chain


I am trying to clarify the following:

Suppose $P$ is the $N\times N$ transition matrix of a finite-state Markov chain, with invariant distribution given by the $N$-vector $\mu$ (i.e. $\mu^T=\mu^T P$). Define the time-reversed chain with transition matrix $\hat P$ by $\hat p_{ij}=\frac{\mu_j}{\mu_i} p_{ji}$.

Now define the reversible transition matrix $R=(P+\hat P)/2$. It can be shown that this reversible chain has the same invariant measure $\mu$.
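This construction can be sketched numerically; below is a NumPy sketch (the chain size, random seed, and variable names are my own illustration, not from the question) checking that $\hat P$ is stochastic and that $\mu$ is invariant for both $\hat P$ and $R$:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 5

# Random row-stochastic transition matrix P
P = rng.random((N, N))
P /= P.sum(axis=1, keepdims=True)

# Invariant distribution mu: left eigenvector of P for eigenvalue 1
w, V = np.linalg.eig(P.T)
mu = np.real(V[:, np.argmin(np.abs(w - 1))])
mu /= mu.sum()

# Time reversal: hat_p[i, j] = (mu[j] / mu[i]) * p[j, i]
P_hat = mu[None, :] * P.T / mu[:, None]

# Additive reversibilization
R = (P + P_hat) / 2

# P_hat is stochastic, and both P_hat and R leave mu invariant
print(np.allclose(P_hat.sum(axis=1), 1.0))  # True
print(np.allclose(mu @ P_hat, mu))          # True
print(np.allclose(mu @ R, mu))              # True
```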

Now if we define an inner product space with $\langle x,y\rangle_\mu=\sum_i x_i y_i \mu_i$, then $R$ is self-adjoint in this space, with real eigenvalues and orthogonal eigenvectors (w.r.t. the inner product defined above).

However, when I try the above computation numerically on a randomly generated transition matrix, I do not get a self-adjoint $R$, and the eigenvectors are not orthogonal in the sense described above. Can anyone tell me if there is an error in the statements above?

Edit (in response to Bryon): Let me elaborate on my statements to see if there is any error

Let us define $D_\mu=\operatorname{Diag}(\mu)$, a diagonal matrix with $\mu$ on the diagonal, and $A=D_\mu^{1/2}RD_\mu^{-1/2}$. It can be checked that $A$ is a symmetric matrix.
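The symmetry of $A$ can be verified entrywise, since $A_{ij}=\sqrt{\mu_i}\,R_{ij}/\sqrt{\mu_j}$. A NumPy sketch (the random chain below is a hypothetical setup of my own, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 5
P = rng.random((N, N))
P /= P.sum(axis=1, keepdims=True)               # random row-stochastic P
w, V = np.linalg.eig(P.T)
mu = np.real(V[:, np.argmin(np.abs(w - 1))])
mu /= mu.sum()                                  # invariant distribution
R = (P + mu[None, :] * P.T / mu[:, None]) / 2   # reversibilized chain

# A = D_mu^{1/2} R D_mu^{-1/2}, entrywise sqrt(mu_i) * R_ij / sqrt(mu_j)
s = np.sqrt(mu)
A = s[:, None] * R / s[None, :]
print(np.allclose(A, A.T))  # True: A is symmetric
```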

Then let the eigenvectors of $A$ be $\phi_j$, and the eigenvectors of $R$ be $f_j$. The relation between the two is $f_j=D_\mu^{-1/2}\phi_j$.

Now consider $\delta_{ij}=\langle \phi_i,\phi_j\rangle=\langle D_\mu^{1/2} f_i,D_\mu^{1/2} f_j\rangle=\langle f_i,f_j\rangle_\mu$
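This chain of equalities can be checked numerically: a symmetric eigensolver gives orthonormal $\phi_j$, and the $f_j=D_\mu^{-1/2}\phi_j$ then come out orthonormal in the $\mu$-weighted inner product. A NumPy sketch (random setup and names are my own, hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 5
P = rng.random((N, N))
P /= P.sum(axis=1, keepdims=True)               # random row-stochastic P
w, V = np.linalg.eig(P.T)
mu = np.real(V[:, np.argmin(np.abs(w - 1))])
mu /= mu.sum()                                  # invariant distribution
R = (P + mu[None, :] * P.T / mu[:, None]) / 2   # reversibilized chain

s = np.sqrt(mu)
A = s[:, None] * R / s[None, :]                 # symmetric similarity transform

# Symmetric solver: real eigenvalues, orthonormal columns phi_j
evals, Phi = np.linalg.eigh(A)

# f_j = D_mu^{-1/2} phi_j are eigenvectors of R ...
F = Phi / s[:, None]
print(np.allclose(R @ F, F * evals[None, :]))   # True

# ... and orthonormal in <x, y>_mu: Gram matrix is the identity
G = F.T @ (mu[:, None] * F)
print(np.allclose(G, np.eye(N)))                # True
```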

Edit #2 (Response to Did):

The reversibility of $R$ w.r.t. $\mu$ implies $R_{ij}\mu_i=R_{ji}\mu_j$ for all $i,j$.

Now, the self-adjointness condition is $\langle Rx,y\rangle_\mu=\langle x,Ry\rangle_\mu$

L.H.S.: $\sum_i \sum_j R_{ij}x_j y_i \mu_i=\sum_i \sum_j R_{ji}x_j y_i \mu_j$ (by the reversibility property above).

Then interchanging $i$ and $j$, we get L.H.S. $=\sum_i \sum_j R_{ij}x_i y_j \mu_i$, which is the same as the R.H.S.

So the definition of the inner product I wrote seems to work here. I am still confused.
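For what it's worth, both the detailed-balance identity and the self-adjointness condition can be confirmed numerically; here is a NumPy sketch (the random chain and test vectors $x,y$ are my own hypothetical setup):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 5
P = rng.random((N, N))
P /= P.sum(axis=1, keepdims=True)               # random row-stochastic P
w, V = np.linalg.eig(P.T)
mu = np.real(V[:, np.argmin(np.abs(w - 1))])
mu /= mu.sum()                                  # invariant distribution
R = (P + mu[None, :] * P.T / mu[:, None]) / 2   # reversibilized chain

# Detailed balance: R_ij * mu_i == R_ji * mu_j for all i, j
B = mu[:, None] * R
print(np.allclose(B, B.T))  # True

# Self-adjointness in <x, y>_mu = sum_i x_i y_i mu_i
ip = lambda a, b: np.sum(a * b * mu)
x, y = rng.random(N), rng.random(N)
print(np.allclose(ip(R @ x, y), ip(x, R @ y)))  # True
```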

Final Edit: I feel stupid writing this, but I have tracked my error down to a faulty piece of Matlab code: apparently the command `eigs` (based on the Arnoldi algorithm, for finding the top $k$ eigenvalues) gave wrong results (such as complex eigenvalues/eigenvectors). I computed the whole spectrum with the command `eig` instead, and it turns out $R$ has a real spectrum and orthogonal eigenvectors w.r.t. the inner product I described.
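A robust alternative to trusting a general (non-symmetric) eigensolver on $R$ is to run a symmetric solver on $A=D_\mu^{1/2}RD_\mu^{-1/2}$, which guarantees a real spectrum by construction. A NumPy sketch comparing the two routes (random setup is my own illustration; this does not reproduce the Arnoldi failure mode, it only cross-checks the solvers):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 5
P = rng.random((N, N))
P /= P.sum(axis=1, keepdims=True)               # random row-stochastic P
w, V = np.linalg.eig(P.T)
mu = np.real(V[:, np.argmin(np.abs(w - 1))])
mu /= mu.sum()                                  # invariant distribution
R = (P + mu[None, :] * P.T / mu[:, None]) / 2   # reversibilized chain

s = np.sqrt(mu)
A = s[:, None] * R / s[None, :]                 # symmetric similarity transform

# Symmetric solver on A: real spectrum by construction, sorted ascending
lam_sym = np.linalg.eigh(A)[0]

# General solver on R: same spectrum up to round-off
lam_gen = np.linalg.eig(R)[0]
print(np.allclose(np.imag(lam_gen), 0, atol=1e-10))
print(np.allclose(np.sort(np.real(lam_gen)), lam_sym))

# A reversible stochastic matrix has its top eigenvalue equal to 1
print(np.isclose(lam_sym[-1], 1.0))
```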


2 Answers


Whether to multiply or divide by $\mu_i$ depends on which side $R$ acts on. For $R$ acting on column vectors (functions), as in your derivation, $\langle x,y\rangle_\mu=\sum_i x_i y_i \mu_i$ is the right choice. If instead you let $R$ act on row vectors (measures), you should divide: use $\langle x,y\rangle=\sum_i x_i y_i /\mu_i$.


I can recommend the paper by P. Diaconis and D. Stroock, "Geometric Bounds for Eigenvalues of Markov Chains". It made clear to me why reversible chains are even mentioned (the end of the proof of Proposition 3 in the paper makes things clear).

As for Matlab, numerical eigenvalue calculations should not be trusted blindly. I think the internet is full of simple $3 \times 3$ matrix examples with obvious real eigenvalues which Matlab fails to find.