Can a left eigenvector be uniquely given by the right eigenvector?


Excluding the trivial case of symmetric matrices, do matrices with a given right (left) eigenvector have a uniquely identifiable left (right) eigenvector, specifically in the case where the eigenvalues are non-degenerate?

Some things I know: for any diagonalizable matrix, the matrices $L$ and $R$ whose rows and columns are the left and right eigenvectors, respectively, can be normalized so that $LR=I$. Consequently, each left (right) eigenvector is orthogonal to the space spanned by the non-corresponding right (left) eigenvectors. So if we are given the full set of left (right) eigenvectors, we immediately know that a given right (left) eigenvector must be orthogonal to the hyperplane spanned by all the left (right) eigenvectors except the corresponding one. But if we are not given all of the left (right) eigenvectors, what can we do? I also realize this is a question about the eigenspace of the transposed matrix:
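As a numerical sanity check of the biorthogonality claim (the matrix below is just an arbitrary non-symmetric example with distinct eigenvalues, not from the question):

```python
import numpy as np

# Arbitrary non-symmetric 3x3 matrix with distinct (non-degenerate) eigenvalues.
A = np.array([[2.0, 1.0, 0.0],
              [0.5, 3.0, 1.0],
              [0.0, 0.5, 4.0]])

# Right eigenvectors: columns of R, with A @ R = R @ diag(eigvals).
eigvals, R = np.linalg.eig(A)

# Rows of L = R^{-1} are left eigenvectors for the same eigenvalues,
# and this scaling gives the biorthogonality L @ R = I directly.
L = np.linalg.inv(R)

assert np.allclose(L @ R, np.eye(3))
# Each row of L is indeed a left eigenvector: L[i] @ A == eigvals[i] * L[i].
for i in range(3):
    assert np.allclose(L[i] @ A, eigvals[i] * L[i])
```

This also illustrates why knowing *all* the right eigenvectors pins down each left eigenvector: $L$ is simply $R^{-1}$ up to row scaling.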

For instance, suppose $Ax=\lambda x$ and $y^T A = \lambda y^T$, i.e. $Ax = \lambda x$ and $A^T y = \lambda y$. Given $A$, $\lambda$, and $x$, can we immediately write down $y$ (say, up to normalization)?
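Of course, given $A$ and $\lambda$ one can always *compute* $y$ as a null vector of $A^T - \lambda I$; the question is whether it can be read off without such a computation. A minimal sketch of the computation, using an arbitrary example matrix and the smallest singular vector to extract the null space:

```python
import numpy as np

# Arbitrary 2x2 non-symmetric matrix with real, distinct eigenvalues.
A = np.array([[2.0, 1.0],
              [1.5, 3.0]])

# Pick one right eigenpair (A x = lam x).
eigvals, V = np.linalg.eig(A)
lam, x = eigvals[0], V[:, 0]

# y spans the null space of (A^T - lam*I); recover it as the right
# singular vector belonging to the (numerically) zero singular value.
_, _, Vh = np.linalg.svd(A.T - lam * np.eye(2))
y = Vh[-1]

# Verify y is the left eigenvector for the same eigenvalue: y^T A = lam y^T.
assert np.allclose(y @ A, lam * y)
```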

More specifically, I am studying the case of substochastic irreducible matrices $P$ satisfying the Perron-Frobenius theorem, and I am of course curious about the dominant eigenvalue: $Pv = \rho v$ and $P^T u = \rho u$, with $\rho < 1$. Here we have additional constraints that could help: $u$ and $v$ are elementwise positive, and they can be normalized such that $\sum_i v_i = 1$ and $\sum_i u_i v_i = 1$.
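To make the setting concrete, here is a sketch with a small, arbitrarily chosen positive matrix whose row sums are below 1 (hence substochastic and irreducible), checking that the dominant pair satisfies all the stated constraints:

```python
import numpy as np

# Arbitrary substochastic irreducible example: positive entries, row sums < 1.
P = np.array([[0.2, 0.5, 0.2],
              [0.3, 0.1, 0.4],
              [0.4, 0.3, 0.2]])

# Dominant (Perron) right eigenpair of P, and left eigenpair via P^T.
wr, Vr = np.linalg.eig(P)
wl, Vl = np.linalg.eig(P.T)
rho = wr.real.max()               # Perron root = spectral radius
v = Vr[:, wr.real.argmax()].real
u = Vl[:, wl.real.argmax()].real

# Perron vectors have constant sign; flip if needed, then apply the
# normalizations from the question: sum(v) = 1 and u . v = 1.
if v[0] < 0:
    v = -v
if u[0] < 0:
    u = -u
v = v / v.sum()
u = u / (u @ v)

assert rho < 1
assert np.all(v > 0) and np.all(u > 0)
assert np.isclose(v.sum(), 1.0) and np.isclose(u @ v, 1.0)
assert np.allclose(P @ v, rho * v) and np.allclose(P.T @ u, rho * u)
```

Note that even with all these normalizations in place, $u$ here is still obtained by diagonalizing $P^T$, not read off from $v$.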

These seem like quite a few constraints, which might allow for identification of the corresponding transpose vector. What is the necessary condition on the problem (or on the matrix, say) for the identification of corresponding vectors to be possible?

Footnote: For symmetric matrices, the left and right eigenvectors are identical (as is easily seen by transposing the eigenvalue equation), so the identification question is trivial.