Here's the text of the problem (here $\lVert\cdot\rVert$ denotes any induced matrix norm): Let $A\in \mathbb{R}^{n\times n}$ be a diagonalisable matrix with real eigenvalues $\lambda_{1}, \lambda_{2},\ldots,\lambda_{n}$ such that $$\lvert\lambda_{1}\rvert>\lvert\lambda_{2}\rvert\geq\ldots\geq\lvert\lambda_{n}\rvert$$ and corresponding eigenvectors $v_1,\ldots,v_n$. Let $x_0=\sum_{i=1}^{n}\alpha_i v_i$ (with $\alpha_1\neq0$) and define recursively $$x_{k+1}=Ax_k.$$ Moreover, let $\mu_k=(w^TAx_k)/(w^Tx_k)$, where $w$ is a vector in $\mathbb{R}^n$ such that $w^Tv_1\neq0$. Prove that:
1) for every $\varepsilon>0$ there exists a $\bar{k}$ such that $$ \lVert Ax_k- \mu_k x_k \rVert<\varepsilon\lVert x_k\rVert$$ for every $k>\bar{k}$;
2) suppose the previous inequality holds for a fixed $\varepsilon$ and a fixed $k$. Show that there exists a matrix $\tilde{A}$ such that $\tilde{A}x_k=\mu_k x_k$ and $$ \lVert A-\tilde{A}\rVert<\varepsilon.$$ End of exercise.
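To see point 1 in action, here is a minimal numerical sketch of the iteration (this is just an illustration, not part of the proof; the test matrix, basis, and vectors are arbitrary choices of mine). With a strictly dominant eigenvalue, the relative residual $\lVert Ax_k-\mu_k x_k\rVert/\lVert x_k\rVert$ shrinks like $(\lvert\lambda_2\rvert/\lvert\lambda_1\rvert)^k$:

```python
import numpy as np

rng = np.random.default_rng(0)

# A hypothetical test matrix with a strictly dominant real eigenvalue.
n = 5
D = np.diag([4.0, 2.0, 1.0, 0.5, 0.1])   # |lambda_1| = 4 strictly dominant
V = rng.standard_normal((n, n))          # random eigenvector basis
A = V @ D @ np.linalg.inv(V)             # diagonalisable by construction

x = rng.standard_normal(n)               # generic x_0 (alpha_1 != 0 a.s.)
w = rng.standard_normal(n)               # generic w   (w^T v_1 != 0 a.s.)

for k in range(60):
    mu = (w @ A @ x) / (w @ x)           # mu_k = (w^T A x_k)/(w^T x_k)
    residual = np.linalg.norm(A @ x - mu * x) / np.linalg.norm(x)
    x = A @ x                            # x_{k+1} = A x_k
    x /= np.linalg.norm(x)               # rescale only to avoid overflow

print(mu, residual)                      # mu approaches lambda_1 = 4
```

The rescaling does not change anything in the statement, since both $\mu_k$ and the relative residual are invariant under scaling of $x_k$.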
Point 1 is straightforward, but I don't know how to approach the second point. I noticed that the matrix $\tilde{A}=(x_k w^TA)/(w^T x_k)$ (where $x_k w^TA$ is the outer product of a column vector with a row vector) satisfies the condition $\tilde{A}x_k=\mu_k x_k$, but I don't know how to prove the inequality with the norm (assuming this matrix works). Could someone help me?
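For what it's worth, the eigen-relation part of the claim is easy to confirm: since $\tilde{A}x_k = x_k\,(w^TAx_k)/(w^Tx_k) = \mu_k x_k$, the identity holds for any $x_k$ and $w$ with $w^Tx_k\neq0$. A quick numerical check (with arbitrary stand-in vectors of my choosing):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))
w = rng.standard_normal(n)
x = rng.standard_normal(n)               # stands in for x_k

mu = (w @ A @ x) / (w @ x)               # mu_k
A_tilde = np.outer(x, w @ A) / (w @ x)   # candidate (x_k w^T A)/(w^T x_k)

# A_tilde x = x (w^T A x)/(w^T x) = mu x, by construction
err = np.linalg.norm(A_tilde @ x - mu * x)
print(err)                               # ~ machine precision
```

So the open question is only the norm bound $\lVert A-\tilde{A}\rVert<\varepsilon$, not the eigenvector condition.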
But in the exercise I don't have $\lVert\cdot\rVert_2$, only a generic induced norm, so $\lVert rx^T\rVert$ equals $\lVert r\rVert\,\lVert x\rVert^{\ast}$, where $\lVert\cdot\rVert^{\ast}$ is the dual norm.
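The rank-one identity $\lVert rx^T\rVert=\lVert r\rVert\,\lVert x\rVert^{\ast}$ can be sanity-checked in a concrete case: for the induced $1$-norm (maximum absolute column sum), the dual of $\lVert\cdot\rVert_1$ is $\lVert\cdot\rVert_\infty$, so $\lVert rx^T\rVert_1=\lVert r\rVert_1\lVert x\rVert_\infty$. A small check with random vectors of my choosing:

```python
import numpy as np

rng = np.random.default_rng(2)
r = rng.standard_normal(6)
x = rng.standard_normal(6)

# Induced 1-norm of a matrix = max absolute column sum;
# the dual of ||.||_1 on vectors is ||.||_inf.
lhs = np.linalg.norm(np.outer(r, x), 1)                   # ||r x^T|| (induced 1-norm)
rhs = np.linalg.norm(r, 1) * np.linalg.norm(x, np.inf)    # ||r||_1 * ||x||_inf
print(lhs, rhs)                                           # equal up to rounding
```

The same computation works for any induced norm, since $\lVert rx^T\rVert=\max_{\lVert y\rVert=1}\lvert x^Ty\rvert\,\lVert r\rVert=\lVert x\rVert^{\ast}\lVert r\rVert$.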