I tried substitution for $2^n$ or $2^{\log{n}}$ or even $2^{2^n}$ and it didn't work.
Thanks! :)
My solution is somewhat lengthy but straightforward. We have $$T(n)=\log n\, T\left(\frac{n}{\log n}\right)+\sqrt{n}$$$$\implies T(n)=\log n\log \left(\frac{n}{\log n}\right) T\left(\frac{n}{(\log n)^2}\right)+\sqrt{n}+\sqrt{\frac{n}{\log n}}$$ and so on, giving $$\implies T(n)=\log n\log\left(\frac{n}{\log n}\right)\cdots \log\left(\frac{n}{(\log n)^{k-1}}\right)T(1)+\sqrt{n}+\sqrt{\frac{n}{\log n}}+\cdots+\sqrt{\frac{n}{(\log n)^{k-1}}}$$ where the recursion bottoms out when $$n=(\log n)^k\implies k=\frac{\log n}{\log\log n}.\tag{1}$$ Now since $$\log \left(\frac{n}{(\log n)^a}\right)=\log n-a\log\log n=O(\log n),$$ the product of the $k$ factors is $O((\log n)^k)=O(n)$ by $(1)$, so $$T(n)=O(n)+O(\sqrt{n})=O(n).$$ The rest is on you....
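To fill in one of the remaining steps: the trailing sum is dominated by a geometric series with ratio $1/\sqrt{\log n}<1$ (for $n$ large enough that $\log n>1$), so
$$\sum_{i=0}^{k-1}\sqrt{\frac{n}{(\log n)^i}}=\sqrt{n}\sum_{i=0}^{k-1}\left(\frac{1}{\sqrt{\log n}}\right)^{i}\le \sqrt{n}\cdot\frac{1}{1-1/\sqrt{\log n}}=O(\sqrt{n}).$$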
I'm assuming that this is an algorithmic complexity analysis, such that you have an appropriate base case for $n < 4$, and it's enough to get the asymptotic behavior.
We can then see at least informally that $T(n)$ is $\Theta(n)$ by bounding above and below:
Lower bound: The bottom layer of the recursion runs $\Theta(n)$ times, and at least constant time is spent on each of those calls.
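One way to see this: the total subproblem size is conserved at every level of the recursion, since each problem of size $m$ splits into $\log m$ subproblems of size $m/\log m$, and
$$\log m\cdot\frac{m}{\log m}=m.$$
So every level accounts for total size $n$, and the bottom level therefore consists of $\Theta(n)$ constant-size subproblems.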
Upper bound: Imagine changing the divide-and-conquer such that the algorithm splits into only $2$ subcases at each level rather than $\log n$ subcases. This can only reasonably make it slower, since we already know that $T$ grows at least linearly. But even so we still would have $$ T^*(n) = 2 T^*(n/2) + \sqrt{n} $$ and the Master Theorem then tells us that $T^*(n)$ is $\Theta(n)$. Since $T^*(n)\ge T(n)$, we can conclude that $T(n)$ is $O(n)$ too.
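Checking the Master Theorem application explicitly: here $a=2$, $b=2$, so $n^{\log_b a}=n^{\log_2 2}=n$, and the work term satisfies
$$f(n)=\sqrt{n}=O\!\left(n^{\log_2 2-\varepsilon}\right)\quad\text{with }\varepsilon=\tfrac12,$$
which is Case 1, giving $T^*(n)=\Theta\!\left(n^{\log_2 2}\right)=\Theta(n)$.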
Making this argument rigorous is left as an exercise for the reader :-)
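As an informal numerical sanity check (not a proof), one can evaluate the recurrence directly and watch $T(n)/n$ stay bounded. The base case $T(n)=1$ for $n<4$ and the choice of base-$2$ logarithms are assumptions here:

```python
import math

def T(n):
    """Evaluate T(n) = log(n) * T(n / log(n)) + sqrt(n) numerically.

    Assumed base case: T(n) = 1 for n < 4; logs are taken base 2.
    """
    if n < 4:
        return 1.0
    lg = math.log2(n)
    return lg * T(n / lg) + math.sqrt(n)

# The ratio T(n)/n should stay within a constant band if T(n) = Theta(n).
for n in (2**10, 2**20, 2**30):
    print(n, T(n) / n)
```

The recursion depth is only about $\log n/\log\log n$, so even $n=2^{30}$ evaluates instantly.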