Solving a recursive equation without the Master Theorem


I am asked to find the time complexity of a recursive algorithm that splits the starting problem into $\sqrt{n}$ subproblems of size $\sqrt{n}$ each, and then combines the solutions in linear time.

My attempt at the recurrence is $T(n) = \sqrt{n}\,T(\sqrt{n}) + n$, but I wasn't sure whether it should instead be $T(n) = \sqrt{n}\,T(\frac{n}{\sqrt{n}}) + n$. (Note that since $\frac{n}{\sqrt{n}} = \sqrt{n}$, these are the same equation.) Beyond that, I don't know how to proceed: the Master Theorem can't be applied here, and I am having trouble with other methods, such as substitution. Can anyone provide some help? Thanks a lot.



$$T(n) = n ^ {\frac{1}{2}}T(n ^ {\frac{1}{2}}) + n$$ $$T(n ^ {\frac{1}{2}}) = n ^ {\frac{1}{4}}T(n ^ {\frac{1}{4}}) + n ^ {\frac{1}{2}}$$

Substitute: $$T(n) = n ^ {\frac{1}{2}}n ^ {\frac{1}{4}}T(n ^ {\frac{1}{4}}) + n + n$$

Now, $$T(n ^ {\frac{1}{4}}) = n ^ {\frac{1}{8}}T(n ^ {\frac{1}{8}}) + n ^ {\frac{1}{4}}$$

Substitute again, $$T(n) = n ^ {\frac{1}{2}}n ^ {\frac{1}{4}}n ^ {\frac{1}{8}}T(n ^ {\frac{1}{8}}) + n + n + n$$

Notice the pattern: $$T(n) = n ^ {\frac{1}{2}}n ^ {\frac{1}{4}}n ^ {\frac{1}{8}}\cdots T(1) + n + n + \cdots + n$$

Since $T(1) = 1$ and this repeats about $\log_2(\log_2 n)$ times:

$$T(n) \approx n^{\frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \cdots + \frac{1}{2^{\log_2(\log_2 n)}}} + n \log_2(\log_2 n)$$

$$T(n) \approx n + n \log \log n$$

Thus, your algorithm works in $O(n \log \log n)$ time.
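As a quick numerical sanity check (a sketch, not part of the original answer): using an integer-square-root version of the recurrence with the assumed base case $T(1) = 1$, the ratio $T(n) / (n \log_2 \log_2 n)$ should settle toward a constant as $n$ grows.

```python
import math

# Sketch of the recurrence T(n) = floor(sqrt(n)) * T(floor(sqrt(n))) + n,
# with the assumed base case T(n) = 1 for n < 2.
def T(n: int) -> int:
    if n < 2:
        return 1
    r = math.isqrt(n)  # integer square root keeps the recursion exact
    return r * T(r) + n

# For n = 2^(2^k) we have log2(log2(n)) = k exactly, so print the ratio
# T(n) / (n * k); it should decrease toward a constant.
for k in range(1, 6):
    n = 2 ** (2 ** k)
    print(f"n = 2^{2**k}: T(n)/(n*k) = {T(n) / (n * k):.4f}")
```

Small values can be checked by hand: $T(2) = 1 \cdot T(1) + 2 = 3$, $T(4) = 2 \cdot 3 + 4 = 10$, $T(16) = 4 \cdot 10 + 16 = 56$.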


Divide your equation by $n$ to get $$ \frac{T(n)}{n}=\frac{T(\sqrt{n})}{\sqrt{n}}+1 $$ which is easy to iterate.
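Spelled out, the iteration telescopes (a sketch, writing $k$ for the number of square-root steps taken):

$$\frac{T(n)}{n} = \frac{T(n^{1/2})}{n^{1/2}} + 1 = \frac{T(n^{1/4})}{n^{1/4}} + 2 = \cdots = \frac{T\!\left(n^{1/2^k}\right)}{n^{1/2^k}} + k.$$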

So what remains is to count how many square-root steps it takes to reduce the argument below $2$, that is, to determine $k$ with $2^{2^{k-1}} < n \le 2^{2^k}$.
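Combining the two pieces (a sketch of the final step): taking $\log_2$ twice in the bracketing inequality pins down $k$, and the iterated equation then gives the bound.

$$2^{2^{k-1}} < n \le 2^{2^k} \;\Longrightarrow\; k - 1 < \log_2 \log_2 n \le k \;\Longrightarrow\; k = \lceil \log_2 \log_2 n \rceil,$$
$$\frac{T(n)}{n} = k + O(1) \;\Longrightarrow\; T(n) = \Theta(n \log \log n).$$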