Why do O(log n) & O(exp(n)) Have Polynomial & Non-Polynomial Running Time Complexities Respectively Despite Their Taylor Series?


I understand that a function, say $f(x)$, belongs to a class $O(g(x))$ iff: $$ \exists k > 0 \ \ \exists n_0 \ \ \forall n > n_0: |f(n)| \leq k \cdot |g(n)| $$
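To make the definition concrete, here is a rough numerical spot-check in Python (the helper `holds_numerically` and the range cutoff are my own inventions; a finite check is of course not a proof):

```python
import math

def holds_numerically(f, g, k, n0, n_max=500):
    """Spot-check |f(n)| <= k * |g(n)| for all integers n0 < n <= n_max.

    This is only a finite sanity check of the O(g) definition,
    not a proof (a hypothetical helper for illustration).
    """
    return all(abs(f(n)) <= k * abs(g(n)) for n in range(n0 + 1, n_max + 1))

# log(n) is in O(n): the witness k = 1, n0 = 1 passes on this range.
print(holds_numerically(math.log, lambda n: n, k=1, n0=1))        # True
# exp(n) is not in O(n^3): even a generous k = 1000 fails quickly.
print(holds_numerically(math.exp, lambda n: n**3, k=1000, n0=1))  # False
```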

I also know that $\log(x)$ has polynomial running time complexity, as it can be bounded by some polynomial, say $k \cdot x^m$ where $k$ and $m$ are constants, while this doesn't hold for $\exp(x)$. However, what I cannot understand is what happens when we bring our old dear friend into play, the Taylor series:
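For instance, $\log(x)$ eventually falls below $x^m$ for any fixed $m > 0$, even one as small as $m = 0.1$, while $\exp(x)$ overtakes every such power. A quick Python check (the exponent and sample points are arbitrary choices of mine):

```python
import math

# log(x) vs. x**0.1: log wins at first, but the power must win eventually.
m = 0.1
for x in [10**6, 10**12, 10**18]:
    print(x, math.log(x), x**m)
# By x = 10**18 the power x**0.1 has already overtaken log(x).
```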

$$ \exp(x) = \sum_{n=0}^{\infty} \frac{x^n}{n!} \quad \text{for all } x $$

$$ \log(1 + x) = \sum_{n=1}^{\infty} (-1)^{n+1} \frac{x^n}{n} \quad \text{for } |x| < 1 $$
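Both series can be sanity-checked numerically inside their regions of convergence (a quick sketch; the 30-term cutoff is an arbitrary choice):

```python
import math

def exp_partial(x, terms):
    """Partial sum of sum_{p=0}^{terms-1} x**p / p!  (converges for all x)."""
    return sum(x**p / math.factorial(p) for p in range(terms))

def log1p_partial(x, terms):
    """Partial sum of sum_{p=1}^{terms} (-1)**(p+1) x**p / p  (needs -1 < x <= 1)."""
    return sum((-1) ** (p + 1) * x**p / p for p in range(1, terms + 1))

# 30 terms already match the library functions closely:
print(exp_partial(3.0, 30), math.exp(3.0))
print(log1p_partial(0.5, 30), math.log(1.5))
```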

Now, substituting the Taylor series of each function respectively, we obtain for some $m_1, m_2 \in \mathbb{R}$:

$$ \exists k_1 > 0 \ \ \exists n_1 \ \ \forall x > n_1: \left|\exp(x)\right| = \left|\sum_{p=0}^{\infty} \frac{x^p}{p!}\right| \leq k_1 \cdot |x^{m_1}| $$

$$ \exists k_2 > 0 \ \ \exists n_2 \ \ \forall x > n_2: \left|\log(1 + x)\right| = \left|\sum_{p=1}^{\infty} (-1)^{p+1} \frac{x^p}{p}\right| \leq k_2 \cdot |x^{m_2}| $$

The first of these two statements is false, since $\exp(x)$ cannot be bounded by a polynomial. However, since we can express $\exp(x)$ as a sum of polynomial terms, why can't it be bounded by another polynomial?
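One way I tried to see what goes wrong numerically: a single term of the series, $x^{m+1}/(m+1)!$, already outgrows any candidate polynomial bound $k \cdot x^m$ once $x$ is large enough (here $m = 3$ and $k = 10^6$ are arbitrary choices):

```python
import math

m, k = 3, 10**6  # arbitrary candidate bound k * x**m

for x in [10**3, 10**6, 10**8]:
    term = x ** (m + 1) / math.factorial(m + 1)  # one term of the exp series
    print(x, term > k * x**m)
# The comparison flips to True once x / (m+1)! exceeds k, i.e. x > 24 * 10**6 here.
```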

As for the second statement, it holds true, but what I cannot understand intuitively is how an infinite sum of polynomial terms on the left can ever be bounded by a single polynomial term on the right-hand side. (Note that this logic directly challenges the reasoning I proposed in the previous paragraph about bounding, but I am mentioning it because I think it is the source of the paradox I cannot resolve.)

1 Answer

$\log(1+x)=\sum_{p=1}^{\infty} (-1)^{p+1}\frac{x^p}{p}$ only holds for $|x|<1$, whereas you are interested in $\log(n)$ for $n=1,2,3,\ldots$.
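A quick numerical illustration (a Python sketch; the 40-term cutoff is an arbitrary choice): at $x = 2$ the partial sums do not approach $\log 3 \approx 1.0986$ at all, their magnitude blows up.

```python
# Partial sums of sum_{p>=1} (-1)**(p+1) * x**p / p at x = 2,
# which lies outside the region of convergence -1 < x <= 1:
x = 2.0
s = 0.0
for p in range(1, 41):
    s += (-1) ** (p + 1) * x**p / p
    if p % 10 == 0:
        print(p, s)  # the magnitude grows instead of settling near log(3)
```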