Asymptotic equivalence of two functions $g(x)$ and $h(x)$ is defined as
$$ \frac{g(x)}{h(x)} \to 1$$ as $x \to \infty$.
Consider $1+ \frac{1}{x}$, which converges to $1$ as $x \to \infty$. Consequently, $1+ \frac{1}{x} \sim 1$? So it seems to me that the notion of asymptotic equivalence is rather useless in cases where the function converges. Why is the notion still useful?
Now, for example, consider $ e^{-x} f(x)$, where $f(x) \uparrow c > 0$ as $x \to \infty$. Then $$ e^{-x} f(x) \sim e^{-x} c, $$ correct? And I could therefore examine the asymptotic behavior of $e^{-x} c$ to find the asymptotic behavior of $e^{-x} f(x)$? That would mean I could always replace a factor that converges to a constant by the constant itself when considering asymptotic equivalence; is that correct?
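Writing out the defining ratio makes this explicit (a sketch, assuming the limiting constant $c$ is nonzero, as it is here since $c > 0$):

```latex
\[
\frac{e^{-x} f(x)}{e^{-x} c} = \frac{f(x)}{c} \longrightarrow \frac{c}{c} = 1
\qquad (x \to \infty),
\]
```

so $e^{-x} f(x) \sim e^{-x} c$. The exponential factor cancels, and the same argument goes through with any factor $h(x)$ in place of $e^{-x}$; the only requirement is $c \neq 0$, since the computation divides by $c$.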
I am just trying to understand the notion of asymptotic equivalence better by looking at these questions and would like to hear someone else's input. Thank you.
When replacing a function by its equivalent, you must be careful with sums. For example, as you said,
$$f(x)=\frac 1x +1 \sim 1 \;(x\to+\infty)$$
$$g(x)=\frac{1}{x^2}-1\sim -1 \;\;(x\to+\infty).$$ But
$$f(x)+g(x)\sim \frac{1}{x}\;\;(x\to+\infty),$$ not $1+(-1)=0$ as naively adding the equivalents would suggest: the constant terms cancel, and the equivalence of the sum is governed by what remains.
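To check this directly, write the sum out and compute its ratio against $\frac{1}{x}$:

```latex
\[
f(x)+g(x) = \left(\frac{1}{x}+1\right) + \left(\frac{1}{x^2}-1\right)
          = \frac{1}{x} + \frac{1}{x^2},
\qquad
\frac{f(x)+g(x)}{1/x} = 1 + \frac{1}{x} \longrightarrow 1
\quad (x\to+\infty).
\]
```

So equivalents may be multiplied and divided freely, but not added term by term, because cancellation can promote lower-order terms to the leading role.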