The answer key states this algorithm is O(log n). I was expecting the recurrence relation to be T(n) = 2T(n/2) + 2, which solves to O(n), so the answer key's result renders my hypothesis false.
Question: How is this algorithm O(log n)?
Edit: Possibly answered? By storing the recursive result in a temporary variable (e.g. call = recExpo(x, n/2)), the function makes only one recursive call per level, which brings the time complexity down to O(log n).
I have developed an algorithm in pseudocode, invoked through a helper function. If the exponent (denoted by the variable n) is even, the function calls itself twice; if the exponent is odd, the function calls itself twice and multiplies the result by the base (denoted by the variable x).
Here is the pseudocode:
recExpo(x, n) {
    if (n == 0) { return 1; }
    if (n == 1) { return x; }
    if (n % 2 == 0) {
        return recExpo(x, n/2) * recExpo(x, n/2);
    } else {
        return recExpo(x, n/2) * recExpo(x, n/2) * x;
    }
}

recExpoHelp(x, n) { return recExpo(x, n); }
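To make the cost of the two-call version concrete, here is a runnable Python translation of the pseudocode above (the names rec_expo and naive_calls are mine, for illustration), with a counter showing how many calls are made:

```python
naive_calls = 0

def rec_expo(x, n):
    # Two recursive calls per level, matching the pseudocode above.
    global naive_calls
    naive_calls += 1
    if n == 0:
        return 1
    if n == 1:
        return x
    half_product = rec_expo(x, n // 2) * rec_expo(x, n // 2)
    return half_product if n % 2 == 0 else half_product * x

result = rec_expo(2, 16)
print(result, naive_calls)  # 65536 computed with 31 calls
```

Since the call count satisfies C(n) = 1 + 2*C(n/2), doubling n roughly doubles the number of calls, so this version does O(n) work, not O(log n).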
In a method such as BinarySearch, the problem is divided in two and only the appropriate half of size n/2 (left or right) is searched further. Each division performs one constant-time comparison, giving the recurrence T(n) = T(n/2) + 1. Hence, the time complexity is O(log n).
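For comparison, a minimal binary-search sketch (hypothetical, not from the question) that makes one comparison and then a single recursive call on half the range per level:

```python
def binary_search(a, target, lo=0, hi=None):
    """Return an index of target in sorted list a, or -1 if absent.
    One recursive call on half the range: T(n) = T(n/2) + 1 = O(log n)."""
    if hi is None:
        hi = len(a)
    if lo >= hi:
        return -1
    mid = (lo + hi) // 2
    if a[mid] == target:
        return mid
    if a[mid] < target:
        return binary_search(a, target, mid + 1, hi)
    return binary_search(a, target, lo, mid)
```

The key difference from the exponentiation pseudocode is that only one of the two halves is ever visited.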
In a method such as MergeSort, the problem of size n is split into two subproblems, so there are two recursive calls of size n/2, plus n comparisons each time we combine. This gives the recurrence T(n) = 2T(n/2) + n. Hence, the time complexity is O(n log n).
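Likewise, a minimal MergeSort sketch (again illustrative only), where the merge loop supplies the O(n) combining term in the recurrence:

```python
def merge_sort(a):
    """Two recursive calls on halves plus a linear merge: T(n) = 2T(n/2) + n."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    # Merge step: at most one comparison per element moved, linear in len(a).
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```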
In this algorithm, we divide the problem (n) into size n/2 twice (two recursive calls) and do constant work per call. My expectation would be the recurrence T(n) = 2T(n/2) + 2, which solves to O(n). The answer key states the time complexity is O(log n), which would render my recurrence false; however, I cannot come up with an explanation that would validate O(log n).
As mentioned by ConMan, the trick is a form of dynamic programming: the result of the half-size subproblem is stored in a variable, so essentially only one recursive call is made per level. The recurrence becomes T(n) = T(n/2) + O(1), and the time complexity is O(log n).
Hence, the solution would look like this:
recExpo(x, n) {
    if (n == 0) { return 1; }
    call = recExpo(x, n/2);
    if (n % 2 == 0) { return call * call; }
    else { return call * call * x; }
}

recExpoHelp(x, n) { return recExpo(x, n); }
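Translating that solution into runnable Python (the names rec_expo_fast and fast_calls are mine) confirms the logarithmic call count:

```python
fast_calls = 0

def rec_expo_fast(x, n):
    # The half-power is computed once and stored, so there is only one
    # recursive call per level: T(n) = T(n/2) + O(1) = O(log n).
    global fast_calls
    fast_calls += 1
    if n == 0:
        return 1
    call = rec_expo_fast(x, n // 2)  # temporary variable, as in the pseudocode
    return call * call if n % 2 == 0 else call * call * x

result = rec_expo_fast(2, 16)
print(result, fast_calls)  # 65536 computed with only 6 calls
```

Note that the n == 1 base case is no longer needed: for n = 1, the recursive call returns 1 and the odd branch yields 1 * 1 * x = x.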