In my current linear algebra class we were introduced to a "Geometric Series Trick". It has been used a few times in the class during some proofs, but I kind of glossed over it. I am not exactly sure what it means now, and searching on Google doesn't bring up much except this.
My question is specifically about a lemma we were given in class showing the convergence rate of Rayleigh Quotient Iteration for Hermitian matrices. The proof starts by noting that at the $k$-th iteration the current eigenvector guess $x^{(k)}$, which is within $\delta$ of the true eigenvector $y_j$ of the matrix $A$, can be decomposed over the orthonormal eigenvectors of the Hermitian matrix as $x^{(k)} = y_j + \delta w$. Substituting this into the definition of the Rayleigh quotient and simplifying, you arrive at
$$ \mu_k = \frac {\lambda + \delta^2 \frac{w^*Aw}{y^*y}} {1 + \delta^2 \frac{w^*w}{y^*y}} $$
After which you apply the "Geometric Series Trick" to find
$$ \mu_k = \left(\lambda + \delta^2 \frac{w^*Aw}{y^*y}\right)\left(1 - \delta^2 \frac{w^*w}{y^*y} + O(\delta^4)\right) $$
The proof then concludes that $\mu_k = \lambda + O(\delta^2)$; that is, if the eigenvector guess at the $k$-th iteration is within $\delta$ of an eigenvector, then the Rayleigh quotient is within $O(\delta^2)$ of the corresponding eigenvalue.
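As a sanity check on that conclusion, the $O(\delta^2)$ accuracy can be verified numerically. This sketch is not from the lecture; the matrix, the perturbation direction $w$, and the value of $\delta$ are all made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# A made-up Hermitian test matrix (real symmetric for simplicity).
M = rng.standard_normal((5, 5))
A = (M + M.T) / 2

eigvals, eigvecs = np.linalg.eigh(A)
lam, y = eigvals[0], eigvecs[:, 0]   # a true eigenpair (lambda, y_j)

# Perturb the eigenvector by delta in a unit direction w orthogonal to y.
delta = 1e-4
w = eigvecs[:, 1]
x = y + delta * w

# Rayleigh quotient of the perturbed guess.
mu = (x @ A @ x) / (x @ x)

# The error in the Rayleigh quotient is O(delta^2), far smaller
# than the O(delta) error in the eigenvector itself.
err = abs(mu - lam)
```

Halving $\delta$ should roughly quarter `err`, which is the quadratic accuracy the lemma asserts.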
However, I don't understand what this "Geometric Series Trick" is or how exactly it is applied here.
The phrase refers to the formula for the sum of a geometric series (assuming $|r|<1$):
$$\frac{a}{1-r}=a(1+r+r^2+r^3+\cdots) = a(1+r+O(r^2))$$
In this case, your "$a$" is the whole numerator, and your "$r$" is $-\delta^2\frac{w^*w}{y^*y}$, so that
$$\frac{1}{1+\delta^2\frac{w^*w}{y^*y}} = 1 - \delta^2\frac{w^*w}{y^*y} + O(\delta^4).$$
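A quick numeric check makes the error order concrete; the value of $r$ below is arbitrary, chosen only for illustration:

```python
# Check the expansion 1/(1-r) = 1 + r + O(r^2) from the formula above.
r = 1e-3
exact = 1.0 / (1.0 - r)
approx = 1.0 + r          # truncating the geometric series after the r term

# The truncation error is r^2/(1-r), i.e. O(r^2) -- about 1e-6 here.
err = abs(exact - approx)
```

With $r = \delta^2\frac{w^*w}{y^*y}$ (itself already $O(\delta^2)$), the neglected terms are $O(\delta^4)$, which is why only the $1 - \delta^2\frac{w^*w}{y^*y}$ part survives in the expansion of $\mu_k$.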