I'm having some issues understanding oblique asymptotes, especially when learning them as a natural progression from horizontal asymptotes.
Consider a rational function that has been long-divided into proper form: $y=3+\frac{1}{x+1}$. As $x$ tends to infinity, $\frac{1}{x+1}$ tends to zero, thus $y$ tends to $3$; hence the horizontal asymptote is $y=3$. Okay, that makes sense.
But what if an oblique asymptote exists, for instance when, after long division, $y=3+x+\frac{1}{x+1}$? If I apply the same logic as above: as $x$ tends to infinity, $\frac{1}{x+1}$ tends to zero, and $x$ tends to infinity, thus $y$ should tend to $3+\infty$, which is $\infty$; hence the oblique asymptote is $y=\infty$. I know that doesn't make sense, but therein lies my confusion. Why do we allow the $x$ in $\frac{1}{x+1}$ to tend to infinity, but exempt the "other" $x$ (as in $3+x$) from doing so?
Instead of saying "tends to infinity," say "when $x$ is large." When $x$ is large, $1/(x+1)$ is small, and the larger $x$ is, the smaller $1/(x+1)$ is. So when $x$ is very large, $y \approx 3+x$: the graph hugs the line $y = x+3$, and that line is the oblique asymptote.
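Put a bit more formally (my phrasing, but it is the standard definition): a line $y = mx+c$ is an asymptote of $y = f(x)$ exactly when the gap between the curve and the line vanishes as $x$ grows:

$$\lim_{x\to\infty}\bigl[f(x) - (mx+c)\bigr] = 0.$$

Here $f(x) - (x+3) = \frac{1}{x+1} \to 0$, so the oblique asymptote is $y = x+3$. The $x$ in $3+x$ isn't "exempt" from anything; it's simply part of the line you are comparing the curve against, so it appears on both sides of the subtraction and the only thing left over is the small remainder $\frac{1}{x+1}$.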
You don't want to say $y=3+\infty$, because it implies that you somehow got all the way to infinity.
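As a quick numerical sanity check (a minimal sketch; the function names `f` and `line` are mine, not from the question), you can watch the gap $y - (x+3)$ shrink as $x$ grows, which is exactly what "asymptote" means:

```python
# Check numerically that y = 3 + x + 1/(x+1) approaches the line y = x + 3.
def f(x):
    return 3 + x + 1 / (x + 1)

def line(x):
    return x + 3

for x in [10, 100, 1000, 10000]:
    gap = f(x) - line(x)  # algebraically this is exactly 1/(x+1)
    print(f"x = {x:>6}, gap = {gap:.6f}")
```

Both $f(x)$ and the line blow up individually, but their difference tends to zero; that difference, not the values themselves, is what the asymptote statement is about.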