Suppose $A$ and $B$ are square matrices of the same size. What condition must $B$ satisfy so that $A$ and $AB$ share the same smallest (largest) nonzero eigenvalue and a corresponding eigenvector?
The detailed problem is as follows:
Let $A \in \mathbb{R}^{n \times n}$ be Hermitian. Choose a nonzero vector $x_0$ and construct the matrix $X=[x_0 \ \ Ax_0 \ \ A^2x_0 \ \cdots \ A^{n-1}x_0]$. Denote the rank of $X$ by $r = \operatorname{rank}(X)$. An interesting observation is that when $r<n$, the smallest (and largest) nonzero eigenvalue (with corresponding eigenvector) of $A$ and of $AXX^{+}$ coincide, where $X^+$ is the pseudoinverse of $X$. But I have no idea how to show that rigorously.
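For what it's worth, the observation can be sanity-checked numerically. Below is a small NumPy sketch; the particular $A$ and $x_0$ are arbitrary choices of mine, with a repeated eigenvalue so that $r < n$:

```python
import numpy as np

# Hermitian (here: real symmetric) A with a repeated eigenvalue 2,
# so the Krylov matrix X below is rank-deficient (r < n).
A = np.diag([1.0, 2.0, 2.0, 5.0])
x0 = np.array([1.0, 1.0, 0.0, 1.0])
n = A.shape[0]

# X = [x0  A x0  A^2 x0  A^3 x0]
X = np.column_stack([np.linalg.matrix_power(A, k) @ x0 for k in range(n)])
r = np.linalg.matrix_rank(X)
print(r)                                   # 3, i.e. r < n

M = A @ X @ np.linalg.pinv(X)              # A X X^+
nonzero = sorted(abs(l) for l in np.linalg.eigvals(M) if abs(l) > 1e-8)
# smallest and largest nonzero eigenvalues of A X X^+ (here 1 and 5)
# agree with the smallest and largest eigenvalues of A
print(nonzero)
```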
If $v$ is the eigenvector and $\lambda$ the eigenvalue, this says $Av = ABv = \lambda v$. In particular, $A (Bv - v) = 0$, so if $A$ is invertible then $Bv = v$.
EDIT: Perhaps I don't understand what you mean by "the smallest (and largest) nonzero eigenvalue". The smallest and largest eigenvalues for $A$ might not be eigenvalues for $AX X^+$. What are shared are the nonzero eigenvalues of $A X X^+$ and an eigenvector corresponding to each of these.
For example, take
$$ A = \pmatrix{1 & 0 & 0 & 0\cr 0 & 2 & 0 & 0\cr 0 & 0 & 3 & 0\cr 0 & 0 & 0 & 4}, x_0 = \pmatrix{0\cr 1\cr 1\cr 0}$$
$$ X = \pmatrix{0 & 0 & 0 & 0\cr 1 & 2 & 4 & 8\cr 1 & 3 & 9 & 27\cr 0 & 0 & 0 & 0\cr}, X X^+ = \pmatrix{0 & 0 & 0 & 0\cr 0 & 1 & 0 & 0\cr 0 & 0 & 1 & 0\cr 0 & 0 & 0 & 0\cr}, A X X^+ = \pmatrix{0 & 0 & 0 & 0\cr 0 & 2 & 0 & 0\cr 0 & 0 & 3 & 0\cr 0 & 0 & 0 & 0\cr} $$
So in this case $A$ and $A X X^+$ do share two eigenvalues and eigenvectors, but they are not the largest and smallest eigenvalues of $A$.
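For completeness, this example is easy to verify with NumPy:

```python
import numpy as np

A = np.diag([1.0, 2.0, 3.0, 4.0])
x0 = np.array([0.0, 1.0, 1.0, 0.0])
X = np.column_stack([np.linalg.matrix_power(A, k) @ x0 for k in range(4)])

P = X @ np.linalg.pinv(X)                  # X X^+, projection onto col(X)
M = A @ P                                  # A X X^+
print(np.round(P, 6))                      # diag(0, 1, 1, 0)
print(np.round(M, 6))                      # diag(0, 2, 3, 0)
```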
Write $A = U D U^*$, where $U$ is unitary and $D$ is diagonal with diagonal entries $\lambda_i$, the eigenvalues of $A$. The columns of $U$ are eigenvectors of $A$ corresponding to these eigenvalues. Let $x = U^* x_0$, so that $x_0 = U x$. Then $A^i x_0 = U D^i x$, and we find that $X = U T V$, where $V$ is the Vandermonde matrix with entries $V_{ij} = \lambda_i^{j-1}$ and $T$ is the diagonal matrix with diagonal entries $T_{ii} = x_i$. Now $X X^+$ is the orthogonal projection onto the column space of $X$, which is the image under $U$ of the column space of $TV$. $V$ has full rank if the eigenvalues of $A$ are distinct, but rows corresponding to equal eigenvalues are identical. The diagonal matrix $T$ multiplies the rows by the entries of $x$; in particular, any rows corresponding to zero entries of $x$ become zero.
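The factorization can be confirmed numerically; the following sketch (my own illustration, using a random symmetric $A$) builds $U$, $T$, $V$ as above and checks that $X = UTV$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
B = rng.standard_normal((n, n))
A = B + B.T                                # real symmetric, so A = U D U^*
lam, U = np.linalg.eigh(A)                 # columns of U are eigenvectors

x0 = rng.standard_normal(n)
x = U.T @ x0                               # x0 = U x

X = np.column_stack([np.linalg.matrix_power(A, k) @ x0 for k in range(n)])
V = np.vander(lam, n, increasing=True)     # V[i, j] = lam_i ** j
T = np.diag(x)                             # T[i, i] = x_i
print(np.allclose(X, U @ T @ V))           # True
```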
For any simple eigenvalue $\lambda_i$ with $x_i \ne 0$, the unit vector $e_i$ is in the column space of $TV$, so $U e_i$, the $i$th column of $U$, which is an eigenvector of $A$ for $\lambda_i$, is in the column space of $X$; since $X X^+ U e_i = U e_i$, it is also an eigenvector of $A X X^+$ for the same eigenvalue. For a repeated eigenvalue (say $\lambda_i = \lambda$ for $i$ in some set $S$), the vector $v$ with $v_i = x_i$ for $i \in S$ and $v_i = 0$ otherwise is in the column space of $TV$. If $v \ne 0$, then $U v$ is a nonzero linear combination of eigenvectors of $A$ for the eigenvalue $\lambda$ and therefore is itself such an eigenvector; again $X X^+ U v = U v$, so it is also an eigenvector of $A X X^+$ for the same eigenvalue.
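Here is a numerical illustration of the repeated-eigenvalue case (again my own example: $A$ is diagonal, so $U = I$ and $x = x_0$, and the eigenvalue $2$ occurs twice):

```python
import numpy as np

A = np.diag([1.0, 2.0, 2.0, 5.0])
x0 = np.array([1.0, 3.0, 4.0, 1.0])        # nonzero weight on both copies of 2
n = A.shape[0]
X = np.column_stack([np.linalg.matrix_power(A, k) @ x0 for k in range(n)])
M = A @ X @ np.linalg.pinv(X)              # A X X^+

# v keeps the entries of x on the repeated eigenvalue 2 and is zero elsewhere
v = np.array([0.0, 3.0, 4.0, 0.0])
print(np.allclose(A @ v, 2 * v))           # True: eigenvector of A
print(np.allclose(M @ v, 2 * v))           # True: eigenvector of A X X^+ too
```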