I think I finally managed to fully understand the master theorem but there's one thing left in the second clause (I'm following here: http://www.eecis.udel.edu/~saunders/courses/621/11s/notes/notes/Master-theorem.html)
$$ T(n) = O(n^d \log n), \quad \text{if } e = d $$
where does the $\log(n)$ term come from? It seems like an "adjustment" factor, since the passage below says:

Paraphrase: The exponent of $n$ is whichever dominates: the non-recursive work ($n^d$) or the total work at the deepest level of recursion, $a^{\log_b(n)}$. But when these two are equal there is a $\log(n)$ factor.
So my question is: why was that log term put there?
Your lecture notes imply that $T(n)$ can be described by the formula $$n^d \sum_{i=1}^{\log_b n} \left( \frac{a}{b^d} \right)^i.$$ But consider this: $$\log(a/b^d) = \log a - d \log b =\left( \frac{\log a}{\log b} - d\right) \log b = ( e - d) \log b .$$ This is positive if $e > d$, negative if $e < d$, and zero if $e = d$. So if $e = d$, then $a/b^d = 1$, and the sum is just $\log_b n$. That is, when $e = d$, $$n^d \sum_{i=1}^{\log_b n} \left( \frac{a}{b^d} \right)^i = n^d \log_b n = \frac{1}{\log b} n^d \log n = O(n^d \log n).$$

So why doesn't this term show up when $e \ne d$? In the case where $e < d$, we can see that $\frac{a}{b^d} < 1$, so the sum asymptotically approaches a constant and does not figure in the $O()$ notation. On the other hand, if $e > d$, the sum grows geometrically; in fact it is $O(n^{e-d})$, and multiplied by $n^d$ this gives us $O(n^e)$.
In other words, the reason the $\log n$ term shows up only when $e = d$ is that this is the only case in which the ratio between consecutive terms of the sum is $1$, and therefore the only case in which the sum grows at the same rate as the number of terms. In the other cases it grows either much faster or much slower than the number of terms.
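If it helps to see this numerically, here is a minimal sketch (in Python, with parameters $a$, $b$, $d$ chosen by me purely for illustration) that evaluates the level-by-level sum from the notes and checks which benchmark it tracks in each of the three regimes:

```python
import math

def recurrence_sum(n, a, b, d):
    """Evaluate n^d * sum_{i=1}^{log_b n} (a/b^d)^i, the total work
    summed over the levels of the recursion tree (as in the notes).
    Assumes n is an exact power of b; round() guards float error."""
    levels = round(math.log(n, b))
    return n**d * sum((a / b**d) ** i for i in range(1, levels + 1))

n = 2**20

# Case e = d: a=2, b=2, d=1, so e = log_2 2 = 1 = d.  The ratio
# a/b^d is 1, the sum equals the number of levels, and the total
# tracks n^d log n.
equal = recurrence_sum(n, 2, 2, 1)

# Case e < d: a=2, b=2, d=2, so e = 1 < 2 = d.  The ratio is 1/2,
# the geometric sum is bounded by a constant, total tracks n^d.
lt = recurrence_sum(n, 2, 2, 2)

# Case e > d: a=4, b=2, d=1, so e = 2 > 1 = d.  The ratio is 2, the
# sum itself grows like n^{e-d}, and the total tracks n^e.
gt = recurrence_sum(n, 4, 2, 1)

print(equal / (n * math.log2(n)))  # ~1: matches n^d log n
print(lt / n**2)                   # ~1: matches n^d
print(gt / n**2)                   # ~2: matches n^e (up to a constant)
```

Running this for a few values of $n$ shows the ratios stay essentially constant, which is exactly the claim: the extra $\log n$ appears only in the balanced case, where the sum counts levels rather than converging or exploding geometrically.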