Evaluate $\lim_{x\to\infty}\frac{\cos x+\sin x}{x^2}$
Method $1$:
The numerator is bounded, but the denominator grows without bound. So the limit is zero.
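This hand-wave can be made rigorous with the squeeze theorem, using the identity $\cos x+\sin x=\sqrt{2}\,\sin\!\left(x+\frac{\pi}{4}\right)$ to bound the numerator. A sketch of the argument:

```latex
% Since |cos x + sin x| = sqrt(2) |sin(x + pi/4)| <= sqrt(2), for x > 0:
\[
-\frac{\sqrt{2}}{x^2} \;\le\; \frac{\cos x+\sin x}{x^2} \;\le\; \frac{\sqrt{2}}{x^2},
\]
% and both bounds tend to 0 as x -> infinity, so the limit is 0.
```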
Method $2$:
Using series expansion,
$\lim_{x\to\infty}\frac{\left(1-\frac{x^2}{2!}+\frac{x^4}{4!}-\cdots\right)+\left(x-\frac{x^3}{3!}+\frac{x^5}{5!}-\cdots\right)}{x^2}$
Now the numerator looks like a polynomial whose degree is higher than that of the denominator. So the answer is $\infty?$
What's wrong in this method?
I first thought series expansions could be used only when $x$ tends to zero, but perhaps that is not the case; these expansions are valid for all $x?$
Or, maybe the numerator can be manipulated in some way so that the final answer is zero?
The problem here is that the numerator is not a polynomial. When we use a series expansion, we truncate the series, add an error term (usually written in the form $O(x^n)$), and then argue that the error vanishes fast enough to be dropped. That error term is only justified by Taylor's theorem, which tells us that, as $x \to a$, the difference between the function and its Taylor polynomial tends to $0$; in fact, it explicitly tells us the error is $O(x^n)$ for a polynomial of degree $n - 1$ expanded at $a = 0$.
The thing is, for any nonconstant Taylor polynomial $T_n(x)$ of $\sin(x)$, we have $\lim_{x \to \infty}|\sin(x) - T_n(x)| = \infty$, so we certainly aren't justified in replacing the series by any polynomial. When you have a whole series, the usual polynomial manipulation rules don't apply (the comments discuss why). For any function $f$ you are Taylor expanding at $0$, you'd need the convergence to be uniform (outside some bounded set) to justify such a replacement, and that rarely happens.
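You can see this divergence numerically. A minimal sketch (the helper name `taylor_sin_7` is mine) comparing $\sin(x)$ with its degree-$7$ Taylor polynomial at $0$: the approximation is excellent near $0$ but the error blows up as $x$ grows, since a degree-$7$ polynomial is unbounded while $\sin$ is not.

```python
import math

def taylor_sin_7(x):
    # Degree-7 Taylor polynomial of sin about 0:
    # T7(x) = x - x^3/3! + x^5/5! - x^7/7!
    return x - x**3 / 6 + x**5 / 120 - x**7 / 5040

# The error is tiny near 0 but grows without bound for large x.
for x in [0.5, 2.0, 5.0, 10.0]:
    err = abs(math.sin(x) - taylor_sin_7(x))
    print(f"x = {x:5.1f}  |sin(x) - T7(x)| = {err:.3e}")
```

At $x = 0.5$ the error is below $10^{-6}$, while at $x = 10$ it already exceeds $10^3$.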
The normal way to handle limits at infinity with Taylor expansions is to substitute $t = 1/x$ and take the limit as $t \to 0^+$; then you can expand everything (and the errors will approach $0$), and you'll find the right value. The thread "Taylor's theorem and evaluating limits when x goes to infinity" shows an example. Note, however, that this won't work in your case, since $\sin(1/t)$ doesn't have a Taylor series at $t = 0$. So, unless someone else comes up with a clever manipulation, I think the best bet is to just use the squeeze theorem for this one.
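As a sanity check on the squeeze approach, a quick numerical sketch: the quotient always stays inside the envelope $\pm\sqrt{2}/x^2$, which shrinks to $0$.

```python
import math

# Check the squeeze bound |(cos x + sin x)/x^2| <= sqrt(2)/x^2
# at a few increasingly large sample points.
for x in [10.0, 100.0, 1000.0]:
    value = (math.cos(x) + math.sin(x)) / x**2
    bound = math.sqrt(2) / x**2
    print(f"x = {x:7.1f}  quotient = {value: .3e}  bound = {bound:.3e}")
    assert abs(value) <= bound
```

Both the quotient and the bound visibly decay like $1/x^2$, consistent with the limit being $0$.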