I gave an incorrect proof here: How can I evaluate $\lim_{x\to0}\frac{\sin(x^2+\frac{1}{x})-\sin\frac{1}{x}}{x}$
I am confused: when I consider the mistakes in my proof, it seems the limit cannot be $0$. The method must therefore be completely wrong, even more wrong than the comments suggest.
Yet I believe a big-O proof should be possible, and even one similar to the one I posted. The "paradox" I'm getting at is perhaps more clearly understood by noting that my method (or the correct method) should also work if sine is replaced by any other function that has a Taylor series at $0$ with all $a_n$ greater than or equal to $0$ (and for which the limit should also be $0$, of course).
I am very confused. Please keep in mind that I want a proof based on big O, not on trig identities or l'Hôpital's rule.
A Taylor series of $f(x)$ around $x=0$ is of little help when estimating $f(x^2+\frac1x)$ as $x\to 0$: note that $x^2+\frac1x\to\infty$. Rather, you need the Taylor expansion around $\frac1x$ (and for a general $f$ you have to make sure that the constant hidden in the big-$O$ does not depend on the point of expansion): $$ \tag1\sin(x_0+h)=\sin(x_0)+h\cos(x_0)-\frac12h^2\sin(x_0)+O(h^3)$$ seems to give us $$\tag2 \sin\left(x^2+\frac1x\right)-\sin\frac1x=x^2\cos\frac1x-\frac12x^4\sin\frac1x+O(x^6),$$ but the constant hidden in $(1)$ may depend on $x_0$; that is, in $(2)$ the constant would depend on $x$, which of course makes the $O$ useless. However, one can argue that the $O$ is uniform in $x_0$ here: by the Lagrange form of the remainder, the $O(h^3)$ term in $(1)$ is bounded by $\frac{|h|^3}{6}$, since every derivative of $\sin$ is bounded in absolute value by $1$.
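As a numerical sanity check (a minimal sketch, not part of the proof; the function name `remainder` and the sample points are my own choices), one can verify that the deviation of $\sin(x^2+\frac1x)-\sin\frac1x$ from its first-order term $x^2\cos\frac1x$ is bounded uniformly, and that the original quotient tends to $0$:

```python
import math

def remainder(x):
    # deviation from the first-order term x^2 * cos(1/x);
    # with h = x^2, the Lagrange bound gives |remainder| <= h^2/2 = x^4/2,
    # uniformly in x, since |sin|, |cos| <= 1
    return math.sin(x**2 + 1/x) - math.sin(1/x) - x**2 * math.cos(1/x)

for x in [1e-1, 1e-2, 1e-3]:
    # the quotient in the original limit: x*cos(1/x) + O(x^3) -> 0
    q = (math.sin(x**2 + 1/x) - math.sin(1/x)) / x
    assert abs(q) <= x + x**3           # |x*cos(1/x)| <= x, rest is O(x^3)
    assert abs(remainder(x)) <= x**4    # uniform second-order bound
```

The point of the check is that the bounds $x + x^3$ and $x^4$ contain no constant depending on $x_0=\frac1x$, mirroring the uniformity claim above.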