Big O for Functions Approaching 0


$$f(x) = \text{the Taylor series of } \sin(x) \text{ about } 0$$

$$f_2 (x) = x$$

where $f_2(x)$ is an approximation of $f(x)$: $x$ is the first term of the series for $f(x)$. Then $$g(x) = f(x) - f_2(x),$$ whose first (lowest-order) term is $-x^3/6$.

What is the big $O$ of $g(x)$ as $x$ approaches $0$?

The professor in the lecture I'm watching claims it is $O(x^3)$, since $x^3$ is the dominant term (the largest term near $0$, i.e. the slowest to approach $0$), which makes sense to me.

The second question is: for $h(x) = 2x^2 + 27x + 1000$, what is the big $O$ as $x$ approaches $0$?

The professor claims it is $O(1)$ as the $1000$ is a constant.

I don't see why $g(x)$ can't be $O(1)$ on some interval $0 < |x - 0| < \delta$, and I don't see why $h(x)$ can't be $O(x^2)$ by the same logic that made $g(x)$ be $O(x^3)$.

Any help will be much appreciated, thanks!
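One concrete way to see the asymmetry is to check the defining ratios numerically: $g(x)/x^3$ should stay bounded as $x \to 0$, while $h(x)/x^2$ should not. A quick sketch in Python (the function names are mine, matching the question's notation):

```python
import math

def g(x):
    # g(x) = sin(x) - x; its leading term near 0 is -x**3/6
    return math.sin(x) - x

def h(x):
    return 2 * x**2 + 27 * x + 1000

for x in [1e-1, 1e-2, 1e-3]:
    # g(x)/x**3 approaches -1/6 (bounded), so g = O(x^3);
    # h(x)/x**2 grows like 1000/x**2 (unbounded), so h is not O(x^2),
    # while h(x) itself stays near 1000, i.e. h = O(1).
    print(f"x={x:g}  g/x^3={g(x)/x**3:+.5f}  h/x^2={h(x)/x**2:.1f}  h={h(x):.4f}")
```

(The loop stops at $10^{-3}$ because much smaller $x$ would run into floating-point cancellation in $\sin x - x$.)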

---
I like to think about big-$\mathcal O$ as the limit definition:

$f=\mathcal O(g)$ at $a\iff\limsup_{x\to a} \left|\frac{f(x)}{g(x)}\right|<\infty$.

Or, in words: near the point $a$, the function $f$ is no larger than a constant multiple of $g$.

So near $0$ we have that $x^n=\mathcal O(x^{n-h})$ for every $n$ and every positive $h$ (and multiplying by a constant doesn't matter), because $$\limsup_{x\to 0} \left|\frac{x^n}{x^{n-h}}\right|=\lim_{x\to 0} \left|\frac{x^n}{x^{n-h}}\right|=\lim_{x\to 0}|x^h|=0<\infty.$$
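The last equality is easy to verify numerically; a tiny sketch, with $n=5$, $h=2$ chosen arbitrarily:

```python
n, h = 5, 2  # arbitrary example; any n and any positive h behave the same way
for x in [0.1, 0.01, 0.001]:
    # |x**n / x**(n-h)| simplifies to |x|**h, which tends to 0 as x -> 0
    print(x, abs(x**n / x**(n - h)))
```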

---

The thing is, $x^3$ goes to $0$ slower than $x^4$, and so on. So, near $0$, you keep the lowest-order nonvanishing term, since it dominates the others. That is also why $h(x)$ is $O(1)$ and not $O(x^2)$: the constant $1000$ doesn't vanish at all, so $h(x)/x^2 \approx 1000/x^2 \to \infty$.
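The same ratio test shows why the lowest order is the one to keep: dividing $\sin x - x$ by its lowest-order term's power $x^3$ stays bounded near $0$, while dividing by the next power $x^4$ blows up. A quick check in Python:

```python
import math

for x in [1e-1, 1e-2, 1e-3]:
    d = math.sin(x) - x  # leading term near 0 is -x**3/6
    # d/x**3 settles near -1/6 (bounded, so d = O(x^3));
    # d/x**4 grows like -1/(6x) (unbounded, so d is not O(x^4))
    print(f"x={x:g}  d/x^3={d/x**3:+.4f}  d/x^4={d/x**4:+.1f}")
```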