So apparently:
$T(n) = 2T(n/2) + n / \log n$ doesn't work with the Master Theorem because of the log term.
But then:
$T(n) = 4T(n/2) + n / \log n$ is $\Theta(n^2)$ even though it's still the same log term.
Why is this the case?
The Master Theorem lets one handle (many) recurrence relations of the form $$ T(n) = aT\left(\frac{n}{b}\right) + f(n) $$ where $a\geq 1$ and $b > 1$. The key is to compare the function $f(n)$ to the quantity $\gamma\stackrel{\rm def}{=} \log_b a$.
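A quick sanity check of the comparison quantity $\gamma$ for the two recurrences in the question (plain Python, nothing assumed beyond the definitions above):

```python
import math

# gamma = log_b a for each recurrence
gamma_first = math.log2(2)   # a = 2, b = 2  ->  gamma = 1.0
gamma_second = math.log2(4)  # a = 4, b = 2  ->  gamma = 2.0
print(gamma_first, gamma_second)  # 1.0 2.0
```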
In your two examples, $f(n)=\frac{n}{\log n}$. In the first, $a=b=2$, so $\gamma=1$. Now, $f(n)$ is not $O(n^c)$ for any $c< \gamma$, not $\Theta(n^\gamma\log^k n)$ for any $k\geq 0$, and not $\Omega(n^c)$ for any $c>\gamma$: none of the three cases of the Master Theorem applies (although we are "pretty close" to the first). We cannot conclude, because the $\log n$ factor blurs the line between which part of the recurrence dominates: the overhead $f(n)$ at each step, or the branching factor (from one step to $a$ smaller substeps).
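One way to see that the first recurrence is not simply $\Theta(n)$ is to evaluate it numerically. A minimal sketch, assuming a base case $T(n)=1$ for $n\leq 2$ and integer halving (neither of which affects the asymptotics): the ratio $T(n)/n$ keeps growing, slowly but without bound.

```python
import math
from functools import lru_cache

# First recurrence: T(n) = 2*T(n/2) + n/log(n), base case assumed T(n)=1 for n<=2
@lru_cache(maxsize=None)
def T(n):
    if n <= 2:
        return 1.0
    return 2 * T(n // 2) + n / math.log(n)

# T(n)/n grows without bound (slowly), so T(n) is not Theta(n)
for k in (10, 15, 20):
    n = 2 ** k
    print(n, T(n) / n)
```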
But in the second case, $a=4$, $b=2$, and thus $\gamma=2$. Here we do have $f(n) = O(n^c)$ for some $c < \gamma$ (taking $c=1$ works), and the first case of the Master Theorem applies: we get $T(n) = \Theta(n^\gamma) = \Theta(n^2)$.
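The same numerical sketch for the second recurrence (again assuming $T(n)=1$ for $n\leq 2$ and integer halving) shows the ratio $T(n)/n^2$ settling toward a constant, consistent with $\Theta(n^2)$:

```python
import math
from functools import lru_cache

# Second recurrence: T(n) = 4*T(n/2) + n/log(n), base case assumed T(n)=1 for n<=2
@lru_cache(maxsize=None)
def T(n):
    if n <= 2:
        return 1.0
    return 4 * T(n // 2) + n / math.log(n)

# T(n)/n^2 converges to a constant, consistent with Theta(n^2)
for k in (10, 15, 20):
    n = 2 ** k
    print(n, T(n) / n ** 2)
```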
In a nutshell: the difference between your two cases lies not in $f(n)$ but in the rest of the recurrence (the constant $a$), which lets the branching in the recurrence dominate the complexity: each step gives rise to 4 substeps instead of 2, so they add up much more quickly.