Complexity of computing $N!$


Question:
Complexity of computing $N!$, considering that each multiplication costs about $O(\log^2 n)$.

Attempt:
There are $n-1$ multiplications. Each multiplication involves a larger number than the last, so the $n-1$ multiplications cost

$$\log^2 n + \log^2\big(n(n-1)\big) + \dots + \log^2(n!) \le \log^2\!\big(n^n \cdot (n-1)^{n-1} \cdot (n-2)^{n-2} \cdots 1\big)$$

I'm just wondering if I'm on the right path; it seems to be blowing up pretty fast.

Answer:

There are two things wrong. First, for large $n$, two numbers $x \le n$ can be multiplied a lot faster than $O(\log^2 n)$; with fast multiplication it is more like $O(\log n \log\log n)$ bit operations.

Second, to calculate $N!$ quickly, you wouldn't start with $1$, multiply by $2$, then by $3$, then by $4$, and so on. Instead, to calculate the product of $N$ numbers, you multiply the numbers in pairs, multiply those results in pairs, and so on, so that the two factors in each multiplication stay roughly equal in size.
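As a concrete illustration, here is a minimal Python sketch of this pairwise (product-tree) strategy; the function names are my own, not from the original post:

```python
# Product-tree factorial: multiply the range in balanced halves so that
# each multiplication combines two factors of roughly equal bit length.

def product_tree(lo: int, hi: int) -> int:
    """Product of the integers lo, lo+1, ..., hi."""
    if lo > hi:
        return 1
    if lo == hi:
        return lo
    mid = (lo + hi) // 2
    # Recurse on the two halves, then do one balanced multiplication.
    return product_tree(lo, mid) * product_tree(mid + 1, hi)

def factorial(n: int) -> int:
    return product_tree(1, n)
```

The recursion depth is $O(\log N)$, and at each level the partial products being multiplied have comparable sizes, which is exactly what makes fast multiplication pay off.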

And to keep the analysis sane, instead of a formula for the complexity of multiplying numbers $\le n$, use a formula for the complexity of multiplying $k$-bit numbers.

Assuming that $k$-bit numbers can be multiplied in $O(k \log k)$ steps: to calculate $N!$, the final multiplication combines two products of $N/2$ numbers each, i.e. two $O\big((N/2)\log N\big)$-bit numbers, taking $O\big((N/2)\log^2 N\big) = O(N \log^2 N)$ steps. Before that we multiply two pairs of $O\big((N/4)\log N\big)$-bit numbers, taking $O\big(2 \cdot (N/4)\log^2 N\big) = O(N \log^2 N)$ steps. Before that, four pairs of $O\big((N/8)\log N\big)$-bit numbers, and so on. With $O(\log N)$ levels of $O(N \log^2 N)$ steps each, this all adds up to $O(N \log^3 N)$ steps.
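To sanity-check the bit sizes claimed above, a quick experiment (standard library only; the variable names are mine) confirms that each half-product at the top level of the tree fits in $(N/2)\log_2 N$ bits:

```python
import math

# For N = 4096 (so log2 N = 12), split 1..N into two halves and check
# the bit lengths of the two half-products that the final multiplication
# would combine. Both should be bounded by (N/2) * log2(N) bits.
N = 4096
left = math.prod(range(1, N // 2 + 1))        # 1 * 2 * ... * (N/2)
right = math.prod(range(N // 2 + 1, N + 1))   # (N/2 + 1) * ... * N

bound = (N // 2) * int(math.log2(N))          # (N/2) * 12 = 24576 bits
print(left.bit_length(), right.bit_length(), bound)
assert left.bit_length() <= bound and right.bit_length() <= bound
```

Multiplying `left` by `right` is the single $O(N \log^2 N)$-step operation at the root of the tree.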

If we use a slower multiplication algorithm, such as quadratic schoolbook multiplication, most of the work is done in the last step alone, so the total time is about the same as for the single product of two $O\big((N/2)\log N\big)$-bit numbers.
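The dominance of the last step can be seen numerically. A small sketch under an assumed quadratic cost model $M(k) = k^2$ (an illustrative model, not a measurement): each level of the tree costs half as much as the one above it, so the geometric sum is about twice the cost of the top level alone.

```python
import math

# Under a quadratic cost model M(k) = k^2 for multiplying two k-bit
# numbers, estimate the cost of each level of the product tree.
# Level i has 2**i multiplications of two ~((N / 2**(i+1)) * log2 N)-bit
# numbers; level 0 is the final (top) multiplication.
N = 1 << 16
log_n = math.log2(N)
levels = [(1 << i) * ((N >> (i + 1)) * log_n) ** 2
          for i in range(int(log_n))]

total, top = sum(levels), levels[0]
print(total / top)  # close to 2: the last step dominates
```

Each successive term is half the previous one, so the whole tree costs at most twice the root multiplication, matching the claim above.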