How would you show that the worst-case running time of this algorithm is at least logarithmic, i.e. $\Omega(\log n)$, where $n$ is the length of the relevant list?
Suppose each time we go round the while-loop counts as a single step, and that all other operations can be completed in negligible time.
The algorithm is used for determining whether or not a value x is present in a sorted list of values, v = [v(1), . . . , v(n)].
let m = 1;
while (1 <= m <= n) {
    if (x == v(m)) {
        exit with "x is present"
    } else if (x < v(m)) {
        let v = [v(1), ..., v(m-1)]; let n = m - 1;
    } else {
        let v = [v(m+1), ..., v(n)]; let n = n - m;
    };
    let m = the integer part of (n+1)/2;
};
exit with "x is not present".
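For reference, here is a direct, runnable translation of the pseudocode into Python (a sketch; the `steps` counter, which tallies trips round the while loop, is my addition):

```python
# A direct translation of the pseudocode; one "step" = one trip round the loop.
def search(v, x):
    m, n = 1, len(v)          # m is a 1-based index, as in the pseudocode
    steps = 0
    while 1 <= m <= n:
        steps += 1
        if x == v[m - 1]:
            return True, steps
        elif x < v[m - 1]:
            v, n = v[:m - 1], m - 1      # keep v(1), ..., v(m-1)
        else:
            v, n = v[m:], n - m          # keep v(m+1), ..., v(n)
        m = (n + 1) // 2                 # integer part of (n+1)/2
    return False, steps


# On the worst-case family v = [1, ..., n] with x = n,
# the step count grows like log2(n):
for k in (3, 4, 10):
    n = 2 ** k
    found, steps = search(list(range(1, n + 1)), n)
    print(n, found, steps)
```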
Thanks!
Suppose that $v=[1,2,\dots,n]$ and $x=n$. Then every comparison finds $x > v(m)$, so the loop always continues into the right half and the length $n$ of the remaining list is (roughly) halved on each iteration. The loop therefore runs until $n$ reaches $1$, which takes about $\log_2 n$ iterations. Since each trip round the loop counts as one step, this family of inputs forces at least $\log_2 n$ steps, so the worst case is $\Omega(\log n)$.
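The halving can be written out as a recurrence (a sketch; here $T(n)$ counts loop iterations on this input family, ignoring the initial probe at $m=1$). Since $x > v(m)$ at every probe, each iteration replaces $n$ by $n - \lfloor (n+1)/2 \rfloor = \lfloor n/2 \rfloor$, so
$$T(n) = T(\lfloor n/2 \rfloor) + 1, \qquad T(1) = 1 \quad\Longrightarrow\quad T(n) = \lfloor \log_2 n \rfloor + 1,$$
which is $\Omega(\log n)$.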