function Do-something(a, b)
    if b = 0 then
        return 1
    else if b is even then
        x <- Do-something(a, b/2)
        return x * x
    else if b is odd then
        x <- Do-something(a, (b-1)/2)
        return x * x * a
    end if
end function
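For concreteness, here is a direct Python translation of the pseudocode above (the name `power` is my own; the original calls it `Do-something`):

```python
def power(a, b):
    # Recursive exponentiation by squaring: returns a**b for integer b >= 0.
    if b == 0:
        return 1
    elif b % 2 == 0:
        x = power(a, b // 2)   # b is even: a**b = (a**(b/2))**2
        return x * x
    else:
        x = power(a, (b - 1) // 2)  # b is odd: a**b = (a**((b-1)/2))**2 * a
        return x * x * a
```

For example, `power(2, 10)` makes one recursive call per halving of `b`, not one per unit of `b`.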
This is a recursive function that eventually returns $a^b$. Each recursive call halves $b$ (when $b$ is odd, the $-1$ effectively halves it and rounds down). So I believe the function is called $\left\lfloor \log_2 b \right\rfloor + 2$ times. In big-O notation, would that be $O\!\left(\left\lfloor \log_2 b \right\rfloor + 2\right)$, or is my thinking off base?
Yes, this algorithm for calculating powers runs in time $O(\log b)$.
There is no need to add the $+2$, specify the base of the logarithm, or take the floor: constant offsets, constant factors (a change of base is just a constant factor), and rounding down all leave the big-$O$ class unchanged, so $O\!\left(\lfloor \log_2 b \rfloor + 2\right) = O(\log b)$.
@Danielv correctly notes that for arbitrary-precision integer arithmetic, the cost of the multiplications themselves may be significant, since the intermediate results grow large. This algorithm is a standard one for modular arithmetic, where every intermediate value is reduced modulo $m$ and multiplication cost is not the governing bottleneck.
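A minimal sketch of the modular variant (my own illustration, not code from the question): reducing mod $m$ after every multiplication keeps every intermediate value below $m^2$, so each multiplication costs roughly the same regardless of $b$, and the $O(\log b)$ count of multiplications is the whole story.

```python
def mod_power(a, b, m):
    # Exponentiation by squaring with a reduction mod m at every step,
    # so intermediate values never exceed m**2. Returns (a**b) % m.
    if b == 0:
        return 1 % m
    x = mod_power(a, b // 2, m)
    x = (x * x) % m
    if b % 2 == 1:          # odd exponent: one extra factor of a
        x = (x * a) % m
    return x
```

This is the same recursion as above, just with `% m` applied after each multiplication; Python's built-in three-argument `pow(a, b, m)` does the same job.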