Most integer-factoring algorithms work by finding a nontrivial divisor of their input. To obtain the complete factorization, the divisor-finding procedure is applied recursively to the two resulting factors until only primes remain.
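For concreteness, here is a minimal Python sketch of this recursive structure. The names are my own, and trial division stands in for whatever divisor-finding procedure the algorithm actually uses; the point is only the shape of the recursion and the count of recursive calls.

```python
def find_divisor(n):
    """Stand-in divisor-finding procedure (trial division).
    Returns a nontrivial divisor of n, or None if n is prime."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return None

def factor(n, calls=None):
    """Recursively factor n into primes, counting how many times
    the divisor-finding procedure is invoked."""
    if calls is None:
        calls = [0]
    calls[0] += 1
    d = find_divisor(n)
    if d is None:           # n is prime: base case of the recursion
        return [n], calls[0]
    left, _ = factor(d, calls)       # recurse on the divisor x
    right, _ = factor(n // d, calls) # recurse on the cofactor n/x
    return left + right, calls[0]
```

Since each call either terminates (prime) or splits $n$ into two factors, a number with $k$ prime factors (with multiplicity) triggers exactly $2k-1$ invocations; the open question is the total *time*, since the cost $f(\cdot)$ of each invocation depends on the size of its argument.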
What I'm interested in is bounding the total cost of these recursive calls. If we let $t(n)$ denote the running time of the factoring algorithm on input value $n$, and $f(n)$ the running time of the divisor-finding procedure, then we get the recurrence:
$$ t(n) = f(n) + t(x) + t(n/x) $$
with $x$ the nontrivial divisor the procedure found. Assuming $t$ is monotone, it is easy to show that $t(x) + t(n/x) \le 2t(n/2)$, but this bound is far too weak: for $f(n) = \log^k n$ it only yields $t(n) = \Theta(n)$, which is exponential in the input size $\log n$.
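(To spell out where $\Theta(n)$ comes from: bounding the recurrence by $t(n) \le f(n) + 2t(n/2)$ and unrolling gives

$$ t(n) \le \sum_{i=0}^{\log_2 n} 2^i \, f\!\left(n/2^i\right) + n \cdot t(1), $$

and for polylogarithmic $f$ the sum is dominated by the $n \cdot t(1)$ leaf term, so $t(n) = \Theta(n)$; equivalently, the master theorem with $a = b = 2$ and $f(n) = \log^k n = O(n^{1-\epsilon})$ gives $t(n) = \Theta(n)$.)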
I'm looking for references that give a bound on $t(n)$ in terms of $n$ and $f(n)$.