How can the $j$th level of a binary tree with $n$ nodes have subproblems of size $\frac{n}{2^j}$?


I read in a book that the $j$th level (starting from $j=0$ at the root) of a binary tree with $n$ nodes divides a problem into $2^j$ subproblems, each of size $\frac{n}{2^j}$. I understand where $2^j$ comes from, but where does $\frac{n}{2^j}$ come from?

$n$ includes the nodes above level $j$, right? So how can the subproblem sizes at that level still sum to $n$? Shouldn't the total be less than $n$?

Best answer:

The idea is that if one node of a binary tree representing a divide-and-conquer algorithm has associated problem size $n$, then the two nodes beneath it have problem sizes $x$ and $y$ with $x+y=n$: you split the problem of size $n$ into two subproblems whose sizes add up to $n$. In the best case these two subproblems are the same size, so each is (roughly) $n/2$.

Iterating this downward, each problem two levels beneath the root has size roughly $n/4$, each problem on the level below that has size roughly $n/8$, and so on. The key point is that the total problem size does not shrink as you descend: there are $2^j$ nodes at level $j$, but the sum of the subproblem sizes across that level is still $n$. So each of those subproblems must have size $n/2^j$, making their sum $2^j\cdot(n/2^j)=n$.
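To see this bookkeeping concretely, here is a short Python sketch (the helper name `level_sizes` is just for illustration) that builds the subproblem sizes level by level under a perfectly even split, and checks that every level has $2^j$ subproblems of size $n/2^j$ summing back to $n$:

```python
def level_sizes(n, depth):
    """Subproblem sizes at levels 0..depth of the recursion tree,
    assuming each problem of size s splits evenly into two of size s/2."""
    levels = []
    current = [n]  # level 0: one problem of size n at the root
    for _ in range(depth + 1):
        levels.append(current)
        # each subproblem spawns two children of half its size
        current = [s / 2 for s in current for _ in (0, 1)]
    return levels

for j, sizes in enumerate(level_sizes(64, 3)):
    # 2**j subproblems at level j, and their sizes still sum to n = 64
    assert len(sizes) == 2 ** j
    assert sum(sizes) == 64
    print(f"level {j}: {len(sizes)} subproblems of size {sizes[0]}")
```

Even though the nodes above level $j$ "used up" work of their own, the *sizes* carried at level $j$ partition the original input, which is why each level's sizes sum to $n$.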