Solve the following recurrence $T(n) = 2T(\frac{n}{4}) + \log n$ without resorting to the Master Theorem. I've tried the substitution method, but it didn't work. I don't know whether there is a method using calculus for solving problems like this. According to Wolfram Alpha, the answer is $T(n)\in \Theta(\sqrt n)$.
After substitution, I've found that $$T(n) = 2^iT(\frac{n}{2^{2i}}) + \sum_{k=0}^{i-1} 2^k\log(\frac{n}{2^{2k}})$$
and I don't know how to proceed further.
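As a quick numerical sanity check (not part of the question itself), the unrolled formula can be compared against the recurrence directly, taking $\log$ to mean $\log_2$, setting $T(1) = 0$, and restricting $n$ to powers of $4$ so the recursion is exact (all of these are assumptions for the sketch, not given in the problem):

```python
import math

def T(n):
    # Direct recursion T(n) = 2 T(n/4) + log2(n), with base case T(1) = 0.
    if n <= 1:
        return 0
    return 2 * T(n // 4) + math.log2(n)

def unrolled(n, i):
    # The formula after i substitution steps:
    # 2^i * T(n / 2^(2i)) + sum_{k=0}^{i-1} 2^k * log2(n / 2^(2k))
    return 2**i * T(n // 4**i) + sum(2**k * math.log2(n / 4**k) for k in range(i))

n = 4**10
for i in range(1, 11):
    assert abs(T(n) - unrolled(n, i)) < 1e-9
```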
Let's write $s = \frac{1}{2}\log_2 n$, so that $n = 2^{2s}$ (for simplicity, assume $n$ is a power of $4$, so $s$ is an integer). Plugging $i=s$ into the formula you obtained, the first term $2^sT(n/2^{2s}) = 2^s T(1)$ is $O(2^s) = O(\sqrt{n})$, so to conclude it suffices to bound the sum $$ \sum_{k=0}^{s-1} 2^k \log_2\frac{2^{2s}}{2^{2k}}= 2\sum_{k=0}^{s-1} (s-k)2^k.\tag{1} $$ Substituting $\ell = s-k$, we then have, from (1), $$ \sum_{k=0}^{s-1} (s-k)2^k = \sum_{\ell=1}^s \ell \cdot 2^{s-\ell} = 2^s \sum_{\ell=1}^s \ell \cdot 2^{-\ell} \leq 2^s \sum_{\ell=1}^\infty \ell \cdot 2^{-\ell} = O(2^s) = O(\sqrt{n}), \tag{2} $$ where the second-to-last equality holds because the series $\sum_{\ell=1}^\infty \ell \cdot 2^{-\ell}$ converges (to the absolute constant $2$), and the last holds by definition of $s$. (Assuming $T(1) > 0$, the first term alone also gives the matching lower bound $T(n) = \Omega(\sqrt{n})$, hence $T(n) = \Theta(\sqrt{n})$.)
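To see the bound in (2) concretely, here is a small numeric check (the helper name `S` is mine, and the exact closed form $2^{s+1}-s-2$ is an extra claim I verified separately, not something needed for the argument):

```python
# S(s) = sum_{k=0}^{s-1} (s-k) * 2^k, the sum bounded in (2).
def S(s):
    return sum((s - k) * 2**k for k in range(s))

for s in range(1, 30):
    # Exact closed form: S(s) = 2^(s+1) - s - 2.
    assert S(s) == 2**(s + 1) - s - 2
    # The bound from (2): S(s) <= 2^s * sum_{l>=1} l * 2^-l = 2 * 2^s.
    assert S(s) <= 2 * 2**s
```

Since $S(s) \leq 2 \cdot 2^s$ for every $s$, the sum is indeed $O(2^s) = O(\sqrt{n})$, as claimed.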