Given a matrix $A$ of order $n$ with coefficients in $\mathbb{C}$, apply the shifted QR algorithm:
$$ \begin{aligned} & (Q, R) = \text{qrfactor}(A - \omega\,I_n)\,; \\ & A = R\cdot Q + \omega\,I_n\,; \end{aligned} $$
with $\omega \in \mathbb{C}$ chosen heuristically. After a sufficient number of iterations, $A$ is transformed into an upper triangular matrix with the eigenvalues on the main diagonal. So far, so good. The problem is now to associate an eigenvector with each eigenvalue.
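For concreteness, here is a minimal sketch of the iteration above (the function name is mine, and the Rayleigh-style shift $\omega = a_{nn}$ is just one heuristic choice among many):

```python
import numpy as np

def shifted_qr(A, n_iter=200):
    """Sketch of the shifted QR iteration: at each step factor
    A - omega*I = Q R and set A := R Q + omega*I (a similarity)."""
    A = np.array(A, dtype=complex)
    n = A.shape[0]
    for _ in range(n_iter):
        omega = A[-1, -1]  # heuristic shift (one possible choice)
        Q, R = np.linalg.qr(A - omega * np.eye(n))
        A = R @ Q + omega * np.eye(n)
    return A  # (nearly) upper triangular; eigenvalues on the diagonal
```

Since each step is a unitary similarity, the eigenvalues are preserved throughout; only the off-diagonal part is driven to zero.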
In the particular case where $A$ is a symmetric matrix with real coefficients, it can be shown that the associated eigenvectors correspond to the column vectors of the accumulated product $\prod_{i=1}^{n_{\text{it}}} Q_i = Q_1 \cdot Q_2 \cdots Q_{n_{\text{it}}}$, where $Q_i$ is the orthogonal matrix obtained at the $i$-th iteration.
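A sketch of this accumulation, using the unshifted iteration for simplicity (with shifts the same update $V \leftarrow V\,Q_i$ applies; the function name is mine):

```python
import numpy as np

def qr_eig_sym(A, n_iter=500):
    """Sketch for real symmetric A with distinct |eigenvalues|:
    accumulate V = Q_1 Q_2 ... Q_k; the columns of V approximate
    the eigenvectors, paired with the diagonal of the final A."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    V = np.eye(n)
    for _ in range(n_iter):
        Q, R = np.linalg.qr(A)
        A = R @ Q        # similarity transform, eigenvalues preserved
        V = V @ Q        # accumulate the orthogonal factors
    return np.diag(A), V
```

Each column `V[:, j]` then satisfies $A\,v_j \approx \lambda_j\,v_j$, and $V$ stays orthogonal because it is a product of orthogonal matrices.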
In general, however, the only way I know is to construct, for each distinct eigenvalue $\lambda_i$ of $A$, the block matrix $[(A-\lambda_i\,I_n)^t\,|\,I_n]$, row-reduce it with the Gauss-Jordan elimination method to obtain a block matrix of the form $[B \, |\,C]$, and read off the eigenvectors of $A$ as the rows of $C$ corresponding to the null rows of $B$.
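That method can be sketched as follows (the function name and tolerance are mine): a null row of $B$ means the corresponding row $e$ of the accumulated row operations satisfies $e\,(A-\lambda_i I)^t = 0$, i.e. $(A-\lambda_i I)\,e^t = 0$, so $e^t$ is an eigenvector.

```python
import numpy as np

def null_rows_method(A, lam, tol=1e-10):
    """Sketch of the block-matrix method: Gauss-Jordan reduce
    [(A - lam*I)^T | I]; the rows of the right block aligned with
    the null rows of the left block span the eigenspace of lam."""
    A = np.array(A, dtype=complex)
    n = A.shape[0]
    M = np.hstack([(A - lam * np.eye(n)).T, np.eye(n)])
    row = 0
    for col in range(n):          # eliminate over the left block only
        if row == n:
            break
        piv = row + np.argmax(np.abs(M[row:, col]))
        if abs(M[piv, col]) < tol:
            continue              # no pivot in this column
        M[[row, piv]] = M[[piv, row]]   # partial pivoting
        M[row] /= M[row, col]
        for r in range(n):
            if r != row:
                M[r] -= M[r, col] * M[row]
        row += 1
    B, C = M[:, :n], M[:, n:]
    null = np.abs(B).max(axis=1) < tol
    return C[null]                # each row is an eigenvector for lam
```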
The latter method, however, being based on Gauss-Jordan elimination, isn't numerically stable. I was therefore wondering whether there is a better strategy for extracting all the eigenvectors, ideally similar to the one available for real symmetric matrices.
Thank you!