Big $\Theta$ arithmetic


I'm trying to understand this formula from the Wikipedia article on amortized analysis.

In general, if we consider an arbitrary number of pushes $n + 1$ to an array of size $n$, we notice that push operations take constant time except for the last one, which takes $\Theta (n)$ time to perform the size-doubling operation. Since there were $n + 1$ operations in total, we can take the average and find that pushing elements onto the dynamic array takes: $$ {\tfrac {n\Theta (1)+\Theta (n)}{n+1}}=\Theta (1)$$
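The doubling argument can be checked with a short simulation. This is a sketch under the usual cost model (one unit per element written, plus one unit per element copied during a resize); the function name `total_push_cost` is my own invention:

```python
def total_push_cost(m):
    """Cost of m pushes into a dynamic array that starts at capacity 1
    and doubles when full: 1 unit per insert, plus k units to copy
    k existing elements on each resize."""
    cost, size, capacity = 0, 0, 1
    for _ in range(m):
        if size == capacity:
            cost += size      # copy all existing elements to the new array
            capacity *= 2
        cost += 1             # write the new element
        size += 1
    return cost

for m in (10, 100, 10_000, 1_000_000):
    print(m, total_push_cost(m) / m)
```

The printed average cost per push stays below a fixed constant (below 3 in this cost model) no matter how large $m$ gets, which is exactly what the $\Theta(1)$ amortized bound claims.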

If you are not familiar with dynamic arrays, it does not matter in this case. I'm just giving some context about where the formula comes from; what I care about is the formula by itself.

Here's what I tried:

$${\tfrac {n\Theta (1)+\Theta (n)}{n+1}}= \tfrac {n\Theta(1)}{n+1} + \tfrac {\Theta(n)}{n+1} = \tfrac {\Theta(n)\Theta(1)}{\Theta(n+1)} + \tfrac {\Theta(n)}{\Theta (n+1)} = \tfrac {\Theta(n)\Theta(1)}{\Theta(n)} + \tfrac {\Theta(n)}{\Theta (n)} = \Theta (1)$$

I don't know if it's correct to replace $n+1$ with $\Theta(n+1)$. If it is, could someone explain why? It only came from my intuition.

On BEST ANSWER

Let me explain why, in the expression $\frac{n\Theta (1)+\Theta (n)}{n+1}$, we can replace the term $(n+1)$ with $\Theta(n)$:

$\Theta(g)$ is the set of functions $\left\lbrace f: \exists c_f,C_f>0, \ \exists N,\ \forall n>N, \ c_fg \leqslant f \leqslant C_f g\right\rbrace$. We consider only non-negative functions.

Some properties follow directly from the definition: $$f \cdot \Theta(g)=\Theta(g \cdot f)$$ $$\Theta(f) + \Theta(g)=\Theta(g + f)$$ And for $g>0$ we have $\frac{\Theta(f)}{g} = \Theta \left(\frac{f}{g}\right)$.

Using these properties in mentioned fraction gives $$\frac{n\Theta (1)+\Theta (n)}{n+1}= \frac{\Theta (n)+\Theta (n)}{n+1} = \frac{\Theta (n)}{n+1}=\Theta \left( \frac{n}{n+1} \right)=\Theta(1)$$
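As a sanity check on this chain of simplifications, one can pick concrete members of the classes (the particular `f` and `g` below are arbitrary choices of a $\Theta(1)$ and a $\Theta(n)$ function, not anything from the original problem) and observe that the fraction stays between fixed positive bounds:

```python
# f is in Theta(1): it oscillates but stays within [1.5, 2.5].
f = lambda n: 2 + (-1) ** n * 0.5
# g is in Theta(n): 3n <= g(n) <= 3n + 10.
g = lambda n: 3 * n + 10

# The fraction (n*f(n) + g(n)) / (n + 1) from the question.
ratios = [(n * f(n) + g(n)) / (n + 1) for n in range(1, 10_001)]
print(min(ratios), max(ratios))  # bounded above and below by constants
```

The minimum and maximum of the ratios are fixed positive numbers independent of how far the range extends, which is what $\Theta(1)$ means.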


The steps you've written are fine, but they are a bit opaque, especially if you are not confident about why you can take them. It is good for you to write everything out explicitly using the definition of $\Theta(\cdot)$.

Let $p$ denote the constant time of a push operation, and let $g(n) \in \Theta(n)$ denote the time of the doubling operation, so that the amortized cost is $\frac{np + g(n)}{n+1}$. Explicitly, there exist positive constants $c,C$ such that $$cn \le g(n) \le Cn$$ for all sufficiently large $n$.

To show $\frac{np + g(n)}{n+1} \in \Theta(1)$, you want to find positive constants $c', C'$ such that $$c' \le \frac{np + g(n)}{n+1} \le C'$$ for sufficiently large $n$. This is simple if you just apply the definition of $g(n) \in \Theta(n)$ above.

$$c' := \frac{p+c}{2} \le \frac{np+cn}{n+1} \le \frac{np+g(n)}{n+1} \le \frac{np+Cn}{n+1} \le p+C =: C'.$$ The first inequality uses $\frac{n}{n+1} \ge \frac{1}{2}$ for $n \ge 1$, and the last uses $\frac{n}{n+1} < 1$.
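The sandwich above can be verified numerically for a sample $g$ satisfying $cn \le g(n) \le Cn$ (the particular constants and the oscillating $g$ below are illustrative choices, not part of the original argument):

```python
import math

p, c, C = 1.0, 0.5, 2.0
# g is in Theta(n): c*n <= g(n) <= C*n for all n >= 1.
g = lambda n: c * n + (C - c) * n * abs(math.sin(n))

lo, hi = (p + c) / 2, p + C  # the constants c' and C' from the answer
for n in range(1, 100_001):
    r = (n * p + g(n)) / (n + 1)
    assert lo <= r <= hi  # the amortized cost is squeezed between c' and C'
```

If the loop finishes without an assertion error, every value of the fraction lies in the constant interval $[c', C']$, i.e. the amortized cost is $\Theta(1)$.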