I am working on something for Markov Chain mixing times and I am stuck on a piece of linear algebra.
Assume an $n \times n$ matrix $M$ has all simple eigenvalues, so there is a basis of eigenvectors. Let $v_1$ be an eigenvector of norm $1$. Is it always possible to find eigenvectors $v_2, \dots, v_n$, all of norm $1$, such that $v_1, \dots, v_n$ is a basis for $\mathbb{R}^n$ with the property that, for every standard basis vector $e_i$,
$$e_i = v_1 + \sum_{k=2}^n a_{i,k} v_k$$
(i.e., the coefficient in front of $v_1$ is $1$)?
The paper I am reading assumes this is true but I am not certain why it must be true.
Thanks!
There must be more to this, because the claim as stated doesn’t hold. It’s tantamount to saying that, having chosen a unit eigenvector $v_1$, there is some eigenbasis (whether the remaining vectors are unit vectors is irrelevant here) for which $e_i-v_1\in\operatorname{span}\{v_2,\dots,v_n\}$ for every standard basis vector $e_i$. In other words, the coefficient of $v_1$ in the expansion of every standard basis vector relative to this basis is $1$.
As a simple counterexample, let $T:\mathbb R^2\to\mathbb R^2$ have one eigenspace spanned by $(1,1)^T$ and the other by $(0,1)^T$. Taking $v_1=\frac1{\sqrt2}(1,1)^T$, we get $e_1-v_1\ne(0,k)^T$ for any $k$, since the first coordinate of $e_1-v_1$ is $1-\frac1{\sqrt2}\ne 0$.
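If it helps to see this numerically, here is a quick sketch (using NumPy, with the eigenbasis from the counterexample above) that expands each standard basis vector in the basis $\{v_1, v_2\}$ and reads off the coefficient on $v_1$; the claim would require that coefficient to be $1$ in every case:

```python
import numpy as np

# Eigenbasis from the counterexample: v1 is the unit vector along (1,1),
# v2 spans the other eigenspace.
v1 = np.array([1.0, 1.0]) / np.sqrt(2)
v2 = np.array([0.0, 1.0])
B = np.column_stack([v1, v2])  # columns are the basis vectors

# Expand each standard basis vector e in this basis: solve B @ c = e.
# c[0] is the coefficient on v1; the claim would force c[0] == 1.
for e in np.eye(2):
    c = np.linalg.solve(B, e)
    print(e, "-> coefficient on v1:", c[0])
```

The coefficients come out as $\sqrt2$ for $e_1$ and $0$ for $e_2$, neither equal to $1$; note that rescaling $v_2$ only changes the coefficient on $v_2$, so no choice of norm for $v_2$ rescues the claim.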
Now, if the matrix is stochastic, there might be a version of this claim that does work, although I still doubt you can choose an arbitrary unit eigenvector for $v_1$. The fact that $1$ is always an eigenvalue of a stochastic matrix (with the all-ones vector as a right eigenvector) might play into it.