Let us fix a quadratic integer $\lambda=a+b\sqrt \Delta$ with $a,b\in\mathbb Z^*$ and define the function $$f(t)=\frac{\lambda^t-\lambda^{-t}}{\lambda - \lambda^{-1}}.$$ I am trying to study the growth of the function $$g(n) = \sum_{t=1}^{\lfloor\log n\rfloor}\sqrt{\frac{\gcd(n,f(t))}{n}}$$ as $n\to\infty.$ A trivial bound is $O(\log n)$, but intuitively I expect $\gcd(n,f(t))$ to grow like $\log n$ whenever $t=O(1)$, which should yield a sharper estimate.
How can one estimate $\gcd(n,f(t))$ effectively in that case?
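For concreteness, here is a small numerical sketch of $f$ and $g$. It assumes $\lambda$ is a unit of norm $1$ (I take $\lambda=3+2\sqrt2$, so $a^2-\Delta b^2=1$ and $\lambda^{-1}=a-b\sqrt\Delta$), in which case $f$ is the integer Lucas-type sequence $f(0)=0$, $f(1)=1$, $f(t)=2a\,f(t-1)-f(t-2)$; the names `f_values` and `g` are mine.

```python
from math import gcd, log, floor

# Assumption: N(lambda) = a^2 - Delta*b^2 = 1, so lambda^{-1} = a - b*sqrt(Delta)
# and f(t) is the integer sequence f(0)=0, f(1)=1, f(t) = 2a*f(t-1) - f(t-2).
A, B, DELTA = 3, 2, 2  # lambda = 3 + 2*sqrt(2), norm 9 - 8 = 1

def f_values(T):
    """First terms f(0), ..., f(max(T,1)) of the sequence."""
    vals = [0, 1]
    for _ in range(max(T - 1, 0)):
        vals.append(2 * A * vals[-1] - vals[-2])
    return vals

def g(n):
    """g(n) = sum over 1 <= t <= floor(log n) of sqrt(gcd(n, f(t)) / n)."""
    T = floor(log(n))
    fs = f_values(T)
    return sum((gcd(n, fs[t]) / n) ** 0.5 for t in range(1, T + 1))
```

For instance, $g(100)$ sums over $t=1,\dots,4$ with $f(t)=1,6,35,204$ and $\gcd(100,f(t))=1,2,5,4$, giving $g(100)=\tfrac{3+\sqrt2+\sqrt5}{10}\approx 0.665$.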
EDIT: I'm particularly interested in $n$ of the form $n_k=\gcd(f(k),f(k-1)+1)$. For such values we would just get $$g(n_k)\le C\, n_k^{-1/2}\sum_t \lambda^{\frac t2}=O(1),$$ but again I'm not sure this estimate is optimal.
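Under the same norm-one assumption as above ($\lambda=3+2\sqrt2$, so $f(t)=2a\,f(t-1)-f(t-2)$ with $2a=6$), one can at least probe numerically whether $g(n_k)$ stays bounded, or even tends to $0$, along $n_k=\gcd(f(k),f(k-1)+1)$:

```python
from math import gcd, log, floor

# Same sketch as before: lambda = 3 + 2*sqrt(2) (norm 1), so
# f(0)=0, f(1)=1, f(t) = 6 f(t-1) - f(t-2).
A = 3

def f_values(T):
    vals = [0, 1]
    for _ in range(max(T - 1, 0)):
        vals.append(2 * A * vals[-1] - vals[-2])
    return vals

def g(n):
    T = floor(log(n))
    fs = f_values(T)
    return sum((gcd(n, fs[t]) / n) ** 0.5 for t in range(1, T + 1))

# n_k = gcd(f(k), f(k-1)+1): does g(n_k) stay bounded as k grows?
fs = f_values(25)
for k in range(2, 26):
    n_k = gcd(fs[k], fs[k - 1] + 1)
    print(k, n_k, round(g(n_k), 4))
```

For this $\lambda$ the first few values are $n_2=2$, $n_3=7$, $n_4=12$, $n_5=41$, and e.g. $g(41)=3/\sqrt{41}\approx 0.47$ since $41$ is coprime to $f(1),f(2),f(3)$; whether such values are typical is exactly what I don't know.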