I'm trying to work out the big-O complexity of a search algorithm that works by recursively dividing lists until the value is found. The question is as follows:
How many times can you divide a list of length $n$ into sublists of length $\sqrt{n}$, then divide those sublists into lists of length $\sqrt[4]{n}$, then those into lists of length $\sqrt[8]{n}$, and so on?
I think it should be $$ \sum_{k=1}^\infty \left[n^{2^{-k}} - 1\right], $$ where $n$ is the length of the list.
I've run it through Wolfram Alpha, but it couldn't work out a limit, only that the series converges. Can anyone with knowledge of how these kinds of series behave work out a simpler expression for it?
Edit: the idea was basically born out of quarantine boredom; it would work on an ordered list. Essentially it mixes the ideas of a jump search and a binary search: divide the list into sublists of length equal to the square root of its own length, then work recursively, dividing those sublists further the same way, until it has either found the item it is looking for or is left with a list of length 1 (where the item isn't present) or 0. Also, thank you to Omnomnomnom for sorting out the maths formatting shizzle.
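To make the idea above concrete, here is a minimal sketch of such a recursive √n search on a sorted list. The function name, the choice of `math.isqrt` for rounding, and the jump-then-recurse structure are my assumptions, not the asker's actual code:

```python
import math

def sqrt_jump_search(lst, target):
    """Hypothetical sketch of the search described in the question:
    split the sorted list into blocks of length ~sqrt(n), jump to the
    block that could contain `target`, then recurse into that block.
    Returns the index of `target` in `lst`, or -1 if it is absent."""
    def search(lo, hi):  # search the slice lst[lo:hi]
        n = hi - lo
        if n <= 0:                       # length-0 base case
            return -1
        if n == 1:                       # length-1 base case
            return lo if lst[lo] == target else -1
        step = max(1, math.isqrt(n))     # sublist length ~ sqrt(n) (rounded down)
        start = lo
        while start < hi:
            end = min(start + step, hi)
            # In a sorted list, the first block whose last element is
            # >= target is the only block that can contain it.
            if lst[end - 1] >= target:
                return search(start, end)
            start = end
        return -1                        # target exceeds every element
    return search(0, len(lst))
```

For example, `sqrt_jump_search(list(range(0, 30, 3)), 12)` returns `4`, while searching for a value not in the list returns `-1`. Each recursive call shrinks the slice from length $n$ to roughly $\sqrt{n}$, which is exactly the shrinkage the series in the question counts.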
Assuming we handle lists of length $2$ without a recursion step, lists of length $2^{2^k}$ would be handled in $k$ recursion steps: by taking square roots, we'd go $$ 2^{2^k} \to 2^{2^{k-1}} \to 2^{2^{k-2}} \to \dots \to 2^{2^1} \to 2^{2^0}. $$ If $n = 2^{2^k}$, then $k = \log_2 \log_2 n$ is the approximate number of recursive steps we'd have to take (though we need to consider carefully how we round $\sqrt n$ when it's not an integer).