Simplify asymptotic notation


Premise for context: Reading a paper I stumbled upon the following expression $$n(1+p)^{O(\log_{\sigma}n)} + 2$$ which is claimed to be $O(n)$ when $p \in O(1/\log_{\sigma}n)$, under the (reasonable) assumption that $n > \sigma$. Substituting we get $$n\left(1+O\left(\frac{1}{\log_{\sigma}n}\right)\right)^{O(\log_{\sigma}n)} + 2$$
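As a quick numeric sanity check (my own sketch, not from the paper), one can fix the hidden constants in both $O(\cdot)$'s to $1$ and take $\sigma = 2$, so that $p = 1/\log_2 n$ and the expression becomes $n(1+p)^{\log_2 n} + 2$; if the $O(n)$ claim holds, the ratio of this to $n$ should stay bounded as $n$ grows:

```python
import math

def ratio_to_n(n, sigma=2):
    """(n*(1+p)**log_sigma(n) + 2) / n, with both hidden O(.) constants
    taken as 1 and p = 1/log_sigma(n)."""
    x = math.log(n, sigma)   # x = log_sigma(n)
    p = 1 / x                # p in O(1/log_sigma(n))
    return (n * (1 + p) ** x + 2) / n

# The ratio is (1 + 1/x)**x + 2/n, which stays bounded as n grows
# (in fact (1 + 1/x)**x tends to e ~= 2.718).
for n in [2**10, 2**20, 2**40, 2**80]:
    print(n, ratio_to_n(n))
```

The bounded ratio is consistent with the whole expression being $O(n)$ under this choice of constants.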

Since we are dealing with asymptotic notation, I would look only at the highest-order terms (this passage actually turned out to be completely wrong; see the comments for details), getting to

$$n\left(1+O\left(\frac{1}{\log_{\sigma}n}\right)^{O(\log_{\sigma}n)}\right) + 2$$

Let now $x=\log_{\sigma}n$ and discard the constant to get to $$O(n)+n \cdot O\left(\frac{1}{x}\right)^{O(x)}$$

End of premise; if you would like more details, feel free to ask.

To verify the claim we need to check the value of $O\left(\frac{1}{x}\right)^{O(x)}$, which should turn out to be $O(1)$ for the initial claim to hold.

Look at the value of $y^{-y}$, which tends to $0$ as $y$ grows in $\mathbb{N}$ (e.g. writing $y^{-y} = e^{-y\cdot \log y}$ and noting that $y\cdot \log y$ goes to $\infty$); in particular it is bounded, hence $O(1)$.
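To see this numerically (a sketch under the simplification that the hidden constants are $1$), $(1/x)^x = e^{-x \log x}$ shrinks rapidly, so it is certainly bounded:

```python
import math

# (1/x)**x equals exp(-x*log(x)); since x*log(x) -> infinity, this
# tends to 0 as x grows, and in particular is bounded, i.e. O(1).
for x in [2, 4, 8, 16]:
    direct = (1 / x) ** x
    via_exp = math.exp(-x * math.log(x))
    print(x, direct, via_exp)
```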

This seems correct to me but I was wondering if my (not so formal) reasoning holds or not and whether there is a better line of reasoning to verify this.

Best Answer

As $n\to\infty$, $$n(1+p)^{O(\log_{\sigma}n)}=\exp\left(\ln\left( n\right)+O(\log_{\sigma}n)\ln(1+p)\right)=n\exp(O(\log_{\sigma}n)\ln(1+p)).$$ Substitute as you did, but note that by Taylor's formula, $$\ln(1+O(1/\log_\sigma n))\sim O(1/\log_\sigma n)$$ as $n\to\infty$, so the exponent is $O(\log_\sigma n)\cdot O(1/\log_\sigma n)=O(1)$. Then, using the fact that $e^{O(1)}=O(1)$, i.e., it is bounded, $$n\exp(O(1))=O(n)$$ as $n\to\infty$. As you have noticed, discarding the constant $2$, which is $O(1)$, doesn't matter here.
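The rewriting above can be checked numerically (a sketch with the hidden constants taken as $1$ and $\sigma = 2$): $n(1+p)^{L}$ equals $\exp(\ln n + L\ln(1+p))$, and with $p = 1/L$ the correction term $L\ln(1+1/L)$ stays bounded, tending to $1$:

```python
import math

# With L = log_2(n) and p = 1/L, compare n*(1+p)**L against the
# exp-form exp(ln(n) + L*ln(1+p)); the extra exponent L*ln(1+1/L)
# stays bounded (it tends to 1), which is why the product is O(n).
for n in [10**3, 10**6, 10**12]:
    L = math.log(n, 2)
    p = 1 / L
    lhs = n * (1 + p) ** L
    rhs = math.exp(math.log(n) + L * math.log(1 + p))
    print(n, lhs, rhs, L * math.log(1 + p))
```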