Suppose we know that
$\lim_{n \rightarrow \infty} (f_n / g_n) = 1$,
for some sequences $\{f_n\}$ and $\{g_n\}$. Does this imply
$\lim_{n \rightarrow \infty} (f_n - g_n) = 0$ ?
I suspect the answer is no, but I have no intuition for why the first condition fails to imply the second. Is there a good counterexample?
Context: I ask because this issue comes up when Efron (1975) explains the difference between estimators with first- and second-order efficiency; see his Equation 1.1. Apparently $\lim_{n \rightarrow \infty} (f_n / g_n) = 1$ is the weaker condition.
Efron, Bradley. "Defining the curvature of a statistical problem (with applications to second order efficiency)." The Annals of Statistics (1975): 1189-1242.
Take for instance $f_n=2^n+ (-1)^n$ and $g_n=2^n$.
Then $f_n/g_n = 1 + (-1)^n/2^n \rightarrow 1$, while $f_n - g_n = (-1)^n$, which does not even converge.
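If you want to see this numerically, here is a minimal sketch in Python (not part of the original answer) tabulating the ratio and the difference for a few values of $n$:

```python
# Counterexample: f_n = 2^n + (-1)^n, g_n = 2^n.
for n in range(1, 11):
    f = 2**n + (-1)**n
    g = 2**n
    # The ratio tends to 1 while the difference keeps oscillating between -1 and +1.
    print(f"n={n:2d}  f/g={f/g:.6f}  f-g={f-g:+d}")
```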
More generally, one can take $f_n = 2^n + z_n$ with $z_n = \text{o}(2^n)$ and $g_n = 2^n$; the difference can then be any sequence growing slower than $2^n$, as the computation below shows.
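Indeed, writing the two quantities side by side,
$$\frac{f_n}{g_n} = 1 + \frac{z_n}{2^n} \longrightarrow 1 \quad\text{whenever } z_n = \text{o}(2^n), \qquad\text{while}\qquad f_n - g_n = z_n,$$
so the ratio condition only forces the difference to be negligible relative to $g_n$, not absolutely small. For instance, $z_n = n$ gives $f_n/g_n \to 1$ but $f_n - g_n \to \infty$.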
Of course, if $f_n$ converges to a real number, then $\lim_n (f_n - g_n) = 0$.
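For completeness, here is the one-line argument for that last claim: since $f_n/g_n \to 1$, eventually $f_n \neq 0$ and $g_n/f_n \to 1$ as well, so with $L = \lim_n f_n$,
$$g_n = f_n \cdot \frac{g_n}{f_n} \longrightarrow L \cdot 1 = L, \qquad\text{and therefore}\qquad f_n - g_n \to L - L = 0.$$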