With $O(\log n)$ we have $f(n) = f(n/2) + O(1)$, which makes sense: the problem is halved each time, which is what the $n/2$ represents, and it can only be halved $\log n$ times.
With $O(n)$ we have $f(n) = 2f(n/2) + O(1)$, which again makes sense: the two subproblems of size $n/2$ add up to $n$.
However, with $O(n \log n)$ we have $f(n) = 2f(n/2) + O(n)$, which is strange: by the reasoning above, $2f(n/2)$ alone should already give $n$, so with the extra $O(n)$ it seems we would get $2n$.
Either my thinking is completely flawed, in which case I would like to be corrected, or there is an explanation that hasn't crossed my mind.
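For concreteness, here is a quick numeric check of the three recurrences (a sketch; the base case $f(1) = 0$ and constant $1$ for the $O(1)$ term are assumptions for illustration):

```python
import math

def f_log(n):    # f(n) = f(n/2) + O(1): one subproblem, constant work
    return 0 if n <= 1 else f_log(n // 2) + 1

def f_lin(n):    # f(n) = 2 f(n/2) + O(1): two subproblems, constant work
    return 0 if n <= 1 else 2 * f_lin(n // 2) + 1

def f_nlogn(n):  # f(n) = 2 f(n/2) + O(n): two subproblems, linear work
    return 0 if n <= 1 else 2 * f_nlogn(n // 2) + n

n = 1 << 20
print(f_log(n), math.log2(n))        # 20 vs log2(n) = 20
print(f_lin(n), n)                   # n - 1 vs n
print(f_nlogn(n), n * math.log2(n))  # 20 * n vs n log2(n) = 20 * n
```

For powers of two the exact solutions are $f(2^k) = k$, $2^k - 1$, and $k \cdot 2^k$ respectively, matching $\log n$, $n$, and $n \log n$.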
Consider $g(n) = \frac{f(n)}{n}$. Dividing the recurrence $f(n) = 2f(n/2) + O(n)$ through by $n$ gives $\frac{f(n)}{n} = \frac{f(n/2)}{n/2} + O(1)$, i.e. $g(n) = g(n/2) + O(1)$. Hence $g(n) = O(\log n)$, which gives $f(n) = O(n \log n)$.
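You can see the substitution at work numerically (a sketch assuming the concrete instance $f(n) = 2f(n/2) + n$ with $f(1) = 0$):

```python
def f(n):
    # exact instance of f(n) = 2 f(n/2) + n with f(1) = 0
    return 0 if n <= 1 else 2 * f(n // 2) + n

# g(n) = f(n)/n increases by exactly 1 each time n doubles,
# i.e. g satisfies g(n) = g(n/2) + 1, so g(n) ~ log2(n)
for k in range(1, 6):
    n = 1 << k
    print(n, f(n) / n)  # 2 1.0, 4 2.0, 8 3.0, 16 4.0, 32 5.0
```

The ratio $f(n)/n$ behaves exactly like the $O(\log n)$ recurrence from the question, which is the whole point of the substitution.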
An intuitive way to look at the recurrence: if you wrote an algorithm that touched every element of the array and then recursed on the two halves, you would end up touching each element $\approx \log n$ times, for $O(n \log n)$ touches in total.
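The touch-counting argument above can be instrumented directly (a sketch; the array of distinct values and the `counts` dictionary are illustrative choices):

```python
def touch_and_recurse(arr, counts):
    # touch every element at this level, then recurse on the two halves
    for x in arr:
        counts[x] += 1
    if len(arr) > 1:
        mid = len(arr) // 2
        touch_and_recurse(arr[:mid], counts)
        touch_and_recurse(arr[mid:], counts)

n = 16
counts = {i: 0 for i in range(n)}
touch_and_recurse(list(range(n)), counts)
print(counts[0])             # 5: each element is touched log2(16) + 1 times
print(sum(counts.values()))  # 80: about n log n touches in total
```

There are $\log_2 n + 1$ levels of recursion (sizes $n, n/2, \dots, 1$), and every element is touched exactly once per level, which is where the $n \log n$ total comes from.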