http://www.stat.cmu.edu/~cshalizi/uADA/13/lectures/app-b.pdf
The corresponding little-o means "is ultimately smaller than": $f(n) = o(1)$ means that $f(n)/c \to 0$ for any constant $c$.
Recursively, $g(n) = o(f(n))$ means $g(n)/f(n) = o(1)$, or $g(n)/f(n) \to 0$.
Notice that the first definition, $f(n) = o(1)$, requires $f(n)/c \to 0$ for any constant $c$.
But in the second definition there's no $c$! Where did it go? Shouldn't it instead read:
$g(n) = o(f(n))$ means $g(n)/f(n) = o(1)$, i.e. $g(n)/(c \cdot f(n)) \to 0$ for any constant $c$?
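A quick numerical sketch of what's being asked (using $g(n) = n$ and $f(n) = n^2$ as my own illustrative choices, not functions from the PDF): dividing by $c \cdot f(n)$ merely rescales a ratio that already tends to 0.

```python
# Illustrative example functions (not from the PDF): g(n) = n is o(f(n))
# for f(n) = n^2, since g(n)/f(n) = 1/n -> 0.
def g(n):
    return n

def f(n):
    return n ** 2

# The plain ratio g(n)/f(n) tends to 0 ...
plain = [g(n) / f(n) for n in (10, 100, 1000)]
print(plain)  # the ratios shrink toward 0

# ... and dividing by an extra constant c merely rescales the ratio,
# so g(n)/(c*f(n)) tends to 0 for every fixed c != 0 as well.
for c in (0.5, 1.0, 10.0):
    scaled = [g(n) / (c * f(n)) for n in (10, 100, 1000)]
    print(c, scaled)  # still shrinks toward 0 for each c
```

So numerically the constant makes no visible difference, which is what the answer below spells out.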
There is a lot that is wrong in that PDF. The definition of big-$O$ is flat out wrong, for example.
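For reference, here is the usual textbook statement of big-$O$ (what the PDF's definition ought to have been):

```latex
% Standard definition of big-O (the usual textbook statement,
% given here for reference; it is not the PDF's version):
% f(n) = O(g(n)) iff there are constants C > 0 and N such that
% |f(n)| <= C |g(n)| for all n >= N.
\[
  f(n) = O(g(n))
  \iff
  \exists\, C > 0,\ \exists\, N
  \ \text{such that}\ 
  |f(n)| \le C\,|g(n)| \quad \text{for all } n \ge N .
\]
```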
The definition of little-$o$ is not wrong, per se, but it is certainly pointless. It seems like the writer is trying to create a parallel with the (broken) definition of big-$O$, but the constant $c$ is completely unnecessary for little-$o$.
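To see why the constant is unnecessary: for any fixed $c \neq 0$ the two conditions are equivalent, because limits scale. One line suffices:

```latex
% Why the constant c is redundant in little-o: for any fixed c != 0,
% g(n)/f(n) -> 0 iff g(n)/(c f(n)) -> 0, since the second ratio
% is just (1/c) times the first.
\[
  \frac{g(n)}{c\,f(n)}
  \;=\;
  \frac{1}{c}\cdot\frac{g(n)}{f(n)}
  \;\longrightarrow\;
  \frac{1}{c}\cdot 0
  \;=\; 0 .
\]
```

So requiring the condition "for any constant $c$" adds nothing over requiring it for $c = 1$.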
I see from Wikipedia, though, that under "Used in computer science" it describes an informal usage in which big-$O$ is read as a tight bound.
The PDF is slightly closer to this informal "tighter" usage of big-$O$, although even here, the PDF is giving a stronger definition.