I generally visualize $\log_{2}$ of a number as an inverted binary tree. For example, to see how many times 8 must be halved to become 1, I imagine an inverted tree with 8 leaves, then the level above it with 4 leaves, and so on up to the root; counting the edges along that path gives the correct answer, in this case 3.
This works fine when the number of nodes is a power of two, but the picture blurs when it is not, and I am unable to visualize how the process proceeds. Since I am a visual learner, I feel more comfortable when I have some visual clue to follow.
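One way to extend the picture to non-powers of two: at each level, split the count into a larger half $\lceil n/2 \rceil$ and a smaller half $\lfloor n/2 \rfloor$, and follow the larger half down to 1. A minimal sketch of that halving process (the function name `halving_steps` is my own, for illustration):

```python
def halving_steps(n):
    """Count how many ceiling-halvings it takes for n to reach 1."""
    steps = 0
    while n > 1:
        n = (n + 1) // 2  # ceiling division: follow the larger half
        steps += 1
    return steps

# e.g. n = 5: the counts go 5 -> 3 -> 2 -> 1, which is 3 steps,
# matching ceil(log2(5)) = 3
```

The uneven tree for $n=5$ thus has the same height as the perfect tree for $n=8$, which is exactly what the ceiling captures.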
I came across this question while reading an algorithms book, which stated:
Since mergesort works by dividing the elements in half at each level until the count becomes 1, the total number of times we need to do this is $\lceil\log_{2}{n}\rceil$.
Could someone give an intuitive explanation, or point out where I am going wrong? Also, I always get confused about when to take the ceiling versus the floor.
If the number $n$ satisfies $2^{k-1}<n<2^{k}$, then $k-1$ steps cannot suffice, because with $k-1$ steps we can only sort $2^{k-1}$ elements.
But you can pad the $n$ entries with $2^{k}-n$ entries greater than every element of the list, to get $2^{k}$ elements in total.
Mergesort can sort this extended array in $k$ steps, and in doing so it also sorts the original $n$ elements. So $k$ steps suffice when $2^{k-1}<n<2^{k}$.
This means $(k-1)\log 2<\log n<k\log 2$, implying $$k-1<\log_2(n)<k.$$
The expression you mentioned, $\lceil\log_2(n)\rceil$, equals $k$ in this case, and this is the number of steps we need.
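As for the ceiling-versus-floor confusion: $\lceil\log_2 n\rceil$ counts halving steps needed to get down to 1, while $\lfloor\log_2 n\rfloor$ counts doublings that stay at or below $n$; the two coincide exactly when $n$ is a power of two. A quick numeric check:

```python
import math

# For n strictly between 4 and 8, the ceiling jumps to 3 while the
# floor stays at 2; they agree again at the power of two n = 8.
for n in [4, 5, 6, 7, 8]:
    up = math.ceil(math.log2(n))     # halving steps: 2, 3, 3, 3, 3
    down = math.floor(math.log2(n))  # doublings within n: 2, 2, 2, 2, 3
```

So for mergesort's "halve until 1" recursion, the ceiling is the right choice, since a leftover uneven half still costs a full extra level.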