Big-O Induction


In my Algorithms course, our first assignment is a set of induction problems. I learned (very poorly) how to do this in my discrete mathematics course two years ago, but it wasn't a very comprehensive course.

My first problem is to verify the following:

$$c^{n+b} = O(c^n)$$ where $b$ and $c$ are both constants, $c > 1$ and $b \geq 1$.

My base case sets $c=2$, $n_0=3$, $q=100$, and $b=4$, producing:

$$2^{3+4} \leq 100 \cdot 2^3$$

Which simplifies to:

$$128 \leq 800$$

Which is true, so I can move on to the $k+1$ step. This is where I get really confused, although my understanding is that it is all just algebra. I make my substitutions and arrive at:

$$c^{(k+1)+b} \leq q*c^{k+1}$$

From this point, I think I remember having to make the left side of the $k+1$ inequality look like the original left side (with $k$ instead of $n$), so I think I can do this:

$$c^kc^1c^b \leq qc^kc^1$$

Assuming I can do that, I can then divide both sides by $c$ and get this:

$$c^{k+b} \leq qc^k$$

But here's where I'm stuck. How do I prove/disprove this?

Edit:

This is actually the ending point. Because $q$, $c$, $b$, and consequently $c^b$ are constants, the statement reduces to:

$$c^b \cdot c^k \leq q \cdot c^k$$

Which is true, since $c^b = 2^4 = 16 \leq 100 = q$.
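As a quick numerical sanity check (my own addition, not part of the formal induction), the claimed bound $c^{n+b} \leq q \cdot c^n$ can be tested directly with the constants chosen above:

```python
# Sanity check of c^(n+b) <= q * c^n using the constants from the base case.
c, b, q, n0 = 2, 4, 100, 3

for n in range(n0, 50):
    assert c ** (n + b) <= q * c ** n, f"fails at n = {n}"

# The two sides differ by a constant factor: c^(n+b) / c^n = c^b = 16 <= 100 = q.
print("bound holds for n =", n0, "through 49")
```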


BEST ANSWER

Are you required to use induction here? This is not a particularly difficult problem, as Big-O problems go!

Remember that $f(n)=O(g(n))$ if there is a constant $C$ and $N>0$ such that $\lvert f(n)\rvert\leq C\cdot g(n)$ for all $n\geq N$. (In other words: $f(n)$ is eventually bounded by a constant times $g(n)$.)

Now, note that $c^{n+b}=c^bc^n$... and $c^b$ is a constant!
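To make the hint concrete, here is a minimal sketch (my own illustration; the constants $c = 3$, $b = 2$ are arbitrary choices satisfying $c > 1$ and $b \geq 1$) checking the definition with the witness $C = c^b$ and $N = 1$:

```python
# Verify f(n) <= C * g(n) with f(n) = c^(n+b), g(n) = c^n, and C = c^b.
c, b = 3, 2          # arbitrary constants with c > 1 and b >= 1
C = c ** b           # the witness constant suggested by the answer

for n in range(1, 40):
    f = c ** (n + b)
    assert f <= C * c ** n   # in fact equality holds exactly here
```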

ANSWER

Trying to give a comprehensive introduction to the notation first.
When you write $f = O(g)$, you actually mean $f\in \mathcal{O}(g)$, where $f,g$ are functions on the same domain, $f,g: D \to \mathbb{R}$.
$\mathcal{O}$ is used to express that something grows at most as fast as something else (1.), or that something decreases at least as fast as something else (2.), so you need to fix some limit point at which your $\mathcal{O}$ ("order") is of interest. Calling this point $x_0 \in D$, we have two cases for the actual definition of $\mathcal{O}$:

  1. $x_0 < \infty$, usually $x_0 = 0$; for example, for the accuracy of numerical algorithms, $\mathcal{O}(h^p)$ is common.
    $$f\in \mathcal{O}(g) :\Leftrightarrow \exists\, C > 0:\ f(x) \leq C g(x) \qquad \text{for all } x \text{ "close enough" to } x_0$$
  2. $x_0 = \infty$, usually for computation time etc. $$f\in \mathcal{O}(g) :\Leftrightarrow \exists\, C > 0:\ f(x) \leq C g(x) \qquad \text{for "sufficiently large" } x$$
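As a small illustration of case 1 (my own example, with limit point $x_0 = 0$): the forward-difference approximation of a derivative has error in $\mathcal{O}(h)$ as $h \to 0$, with constant roughly $\max \lvert f'' \rvert / 2$:

```python
import math

# Forward difference (f(x+h) - f(x)) / h approximates f'(x) with error O(h).
f, df = math.sin, math.cos
x = 1.0
C = 0.5              # |f''| = |sin| <= 1 here, so the error is at most h/2

for h in [1e-1, 1e-2, 1e-3, 1e-4]:
    err = abs((f(x + h) - f(x)) / h - df(x))
    assert err <= C * h * 1.01   # tiny slack for floating-point rounding
```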

Your special case is very simple, and implicitly takes $x_0 = \infty$. What you show is $$c^{n+b} = c^b \cdot c^n = C c^n \leq C c^n \qquad \forall\ n \in \mathbb{N}$$ (the $\leq$ is trivial here). So we found a constant $C = c^b$ which gives us $$c^{n+b} \leq C c^n \qquad \forall\ n\in\mathbb N$$ and this, by definition, means $$c^{n+b} \in \mathcal{O}(c^n) \qquad (c^{n+b} = \mathcal{O}(c^n))$$ So the proof basically consists of realising that for $c,b$ constant, $c^b$ is also constant.