Optimizing an asymptotic recurrence relation with two recursive terms


I have a recurrence relation that looks like this:

$T(n) = 2 T(c n) + T((1-c)n) + O(1)$

The base case is just $T(b) = 1$ when $b \leq 1$.

I'm trying to figure out the best value of $c \in (0, 1)$ to use, to minimize the asymptotic growth rate.

For example, if I use $c = 1/2$ then the recurrence simplifies to $T(n) = 3 T(n/2) + O(1)$, which grows like $O(n^{\log_2 3}) \approx O(n^{1.585})$. (This is essentially the only case I've managed to solve, because it's the only one where the terms combine.)
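As a sanity check (my own illustration, not from the post), the $c = 1/2$ case can be evaluated directly by counting one unit of work per call; a minimal Python sketch:

```python
import math

def T(n):
    """T(n) = 2*T(n/2) + T(n/2) + O(1) = 3*T(n/2) + O(1) for c = 1/2,
    with T(n) = 1 when n <= 1 (counting the O(1) term as exactly 1)."""
    if n <= 1:
        return 1
    return 3 * T(n / 2) + 1

# The measured exponent log T(n) / log n should drift toward log_2(3) ~ 1.585
# (convergence is slow because of the lower-order terms).
for k in (10, 20, 30):
    n = 2 ** k
    print(k, math.log(T(n)) / math.log(n))
```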

I have found that changing $c$ as I go can make solving easier. For example, alternating between $c_0, c_1 = \frac{1}{3}, \frac{1}{2}$ achieves $O(n^{\log_3 5}) \approx O(n^{1.465})$. In general, if you cycle through $\frac{1}{p}, \frac{1}{p-1}, \ldots, \frac{1}{3}, \frac{1}{2}$ you get $O(n^{\log_p (2p-1)}) \subseteq O(n^{\log_p 2p}) = O(n^{1 + \log_p 2}) \subseteq O(n^{1 + \epsilon})$. This suggests that a single well-chosen $c$ should also get arbitrarily close to linear, though we're hiding larger and larger constants, so we can't just set $p = \Gamma^{-1}(n)$. And maybe the single best $c$ achieves $O(n \log n)$, which would be better than $O(n^{1+\epsilon})$.
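A quick numerical look (illustration only) at how the cycling exponent $\log_p(2p-1)$ decays toward $1$; keep in mind the constants hidden by the $O$ grow with $p$:

```python
import math

# Exponent log_p(2p - 1) from cycling through 1/p, 1/(p-1), ..., 1/2;
# it decays toward 1, consistent with the O(n^(1+eps)) claim above.
for p in (2, 3, 5, 10, 100, 1000):
    print(p, math.log(2 * p - 1, p))
```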

I've looked into how the branching plays out. I know that you end up summing terms like ${x \choose i} 2^i c^i (1-c)^{x-i} n$, where $i$ counts the $c$-branches taken among $x$ levels. But you need to iterate over the pairs of $x$ and $i$ that correspond to the final expansion before hitting size $1$, and even if I knew how to generate those analytically, I doubt I'd be able to evaluate the resulting sum.
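The expansion can at least be summed numerically. Treating each node as determined by its counts $(i, j)$ of $c$-steps and $(1-c)$-steps, it has multiplicity $\binom{i+j}{i} 2^i$ and size $c^i (1-c)^j n$, and it makes recursive calls exactly when its size exceeds $1$. A sketch (my own illustration, not from the post) that counts internal calls this way and estimates the growth exponent:

```python
import math

def total_calls(n, c):
    """Sum recursion-tree multiplicities: a node reached by i steps with
    size factor c and j steps with factor (1-c) has size c**i * (1-c)**j * n
    and multiplicity C(i+j, i) * 2**i.  Nodes of size > 1 recurse further,
    so summing their multiplicities counts the internal calls."""
    total = 0
    i = 0
    while c ** i * n > 1:
        j = 0
        while c ** i * (1 - c) ** j * n > 1:
            total += math.comb(i + j, i) * 2 ** i
            j += 1
        i += 1
    return total

# Estimate the exponent from the ratio total_calls(2n) / total_calls(n);
# for c = 1/2 this lands near log_2(3) ~ 1.585.
for c in (0.3, 0.5, 0.7):
    n = 10 ** 5
    p_est = math.log(total_calls(2 * n, c) / total_calls(n, c), 2)
    print(c, round(p_est, 3))
```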

If it's the multiplying that's causing problems, substituting $n = 2^p$ to transform the problem into $T(p) = 2 T(p - \lg(1/c)) + T(p - \lg(1/(1-c))) + O(1)$ might make things easier. I haven't found it helpful yet, though.

Any ideas?

1 Answer

Use Leighton's version of the master theorem. Consider a recurrence of the form: $$ T(z) = g(z) + \sum_{1 \le k \le n} a_k T(b_k z + h_k(z)) $$ for $z \ge z_0$, with $a_k$ and $b_k$ constants, subject to the restrictions:

  1. There are enough base cases
  2. For all $k$, $a_k > 0$ and $0 < b_k < 1$
  3. There is a constant $d$ such that $g(z) = O(z^d)$ as $z \to \infty$
  4. For all $k$ it is $\lvert h_k(z) \rvert = O(z / (\log z)^2)$

Then if $p$ is such that: $$ \sum_{1 \le k \le n} a_k b_k^p = 1 $$ the solution to the recurrence satisfies: $$ T(z) = \Theta\left( z^p \left( 1 + \int_1^z \frac{g(u)}{u^{p + 1}} \, \mathrm{d} u \right)\right) $$

Here the hypotheses are satisfied with $g(z) = O(1)$, so the integral converges and $T(z) = \Theta(z^p)$, where $p$ is such that:

$$ 2 c^p + (1 - c)^p = 1 $$

For every $c \in (0,1)$, plugging in $p = 1$ gives $2c + (1-c) = 1 + c > 1$, and the left side is strictly decreasing in $p$, so the root always satisfies $p > 1$: no single $c$ reaches $O(n)$ or $O(n \log n)$. As $c \to 0^+$ the root decreases toward $1$, so $p = 1$ is the infimum, never attained, and the exponent can be made arbitrarily close to $1$ (at the cost of ever larger hidden constants).
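To see this numerically, the characteristic equation can be solved by bisection (a sketch; `solve_p` is my own helper name, and the upper bound `hi=64` is an assumption that holds for the sampled values of $c$, not for $c$ near $1$):

```python
import math

def solve_p(c, lo=1.0, hi=64.0, iters=200):
    """Bisection for the root p of f(p) = 2*c**p + (1-c)**p - 1.
    f is strictly decreasing in p and f(1) = c > 0, so the root exceeds 1."""
    f = lambda p: 2 * c ** p + (1 - c) ** p - 1
    for _ in range(iters):
        mid = (lo + hi) / 2
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# p(c) decreases toward 1 as c -> 0+, but never reaches it.
for c in (0.5, 0.1, 0.01, 0.001):
    print(c, round(solve_p(c), 4))
```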