When I ask this question I'm usually told that $O(\log n)$ means the problem is continually halved.
I want to understand why this is the case. Maybe add some explanation of the properties of the logarithm and how they relate to algorithms and asymptotics.
The logarithm has these properties:
$$\log(a^k)=k\log(a)$$ $$\log(a^{k+1})=\log(a)+k\log(a)$$ and, more generally, $$\log(a\cdot n)=\log(a)+\log(n).$$
This means that for an $O(\log(n))$ algorithm, if you multiply the input size $n$ by any constant $a$, the algorithm needs only $\log(a)$ more time (or some other resource) to execute, because $\log(an)-\log(n)=\log(a)$.
If your array is three times longer, the algorithm will need at worst $\log(3)$ more time units to execute.
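You can check this by counting halvings. Here is a minimal sketch in Python (the language and the name `halvings` are my choice, not anything from the question): it counts how many times $n$ can be halved before reaching 1, which is the worst-case step count of a halving algorithm such as binary search.

```python
def halvings(n):
    """Count how many times n can be halved (integer division)
    before reaching 1 -- the worst-case step count of an
    O(log n) halving algorithm such as binary search."""
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

print(halvings(1000))  # 9 steps
print(halvings(3000))  # 11 steps
```

Tripling the input adds only 2 steps here, close to $\log_2(3)\approx 1.58$ (rounded up because of integer halving): the growth is additive, not multiplicative.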
(Notice that we are not talking about a specific time unit here, be it a minute, an hour, or a day, but about some fixed unit. Which unit it is depends on the hardware the algorithm runs on, but for one and the same environment it is fixed.)
The same goes for space that an algorithm might need, or any other resource.
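The space case can be made concrete with the call stack of a recursive halving algorithm. A sketch, again in Python (the function name and the `depth` bookkeeping are mine, added only for illustration): a recursive binary search whose call depth, and therefore stack space, grows as $\log_2(n)$.

```python
def rec_search(arr, target, lo=0, hi=None, depth=1):
    """Recursive binary search over a sorted list.
    Returns (index of target or -1, maximum call depth reached).
    The call-stack depth is the space this version uses: O(log n)."""
    if hi is None:
        hi = len(arr) - 1
    if lo > hi:
        return -1, depth          # target not present
    mid = (lo + hi) // 2
    if arr[mid] == target:
        return mid, depth
    if arr[mid] < target:         # search the right half
        return rec_search(arr, target, mid + 1, hi, depth + 1)
    return rec_search(arr, target, lo, mid - 1, depth + 1)

idx, depth = rec_search(list(range(1024)), 0)
print(idx, depth)  # index 0, depth 10 = log2(1024)
```

Doubling the list length adds one level to the recursion, i.e. one constant-sized stack frame, which is exactly the additive behaviour described above.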