I am currently studying kernel regression. My problem is understanding the relationship between big-O and little-o notation. The following steps occur (in two different papers):
$ \frac{h^{2(j-1)}}{N} \int_0^{\infty}(K(x))^2 \cdot x^{2j} f(hx)\,dx = O\left( \frac{h^{2(j-1)}}{N}\right) = o\left( \left( \frac{h^j}{hN^{1/2}}\right)^2\right) = o\left((h^j)^2\right) $
and
$ Bias(\hat{f}(x)) = \mathbb{E}[\hat{f}(x)] - f(x) = \frac{1}{v!}f^{(v)}(x)h^v k_v(K) + o(h^v)$
Then taking $v=2$ we have:
$ Bias(\hat{f}(x)) = \frac{1}{2!}f^{(2)}(x)h^2 k_2(K) + O(h^4)$
The last two equations follow from a Taylor expansion; here $f^{(v)}$ denotes the $v$-th derivative of $f$. My problem is that I do not see how little-o becomes big-O and how big-O becomes little-o.
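For reference, here is a sketch of the Taylor step behind the bias equations for $v=2$, assuming (my notation) a symmetric kernel $K$ integrating to one, moments $k_m(K) = \int u^m K(u)\,du$, and $f$ four times differentiable:

$$ \mathbb{E}[\hat{f}(x)] = \int K(u)\, f(x - hu)\, du = f(x) + \frac{f^{(2)}(x)}{2!}h^2 k_2(K) + \frac{f^{(4)}(x)}{4!}h^4 k_4(K) + o(h^4), $$

since the odd-order terms vanish by symmetry of $K$. So the $o(h^2)$ remainder, once the expansion is carried two orders further, is in fact $O(h^4)$.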
Sorry that I did not dive into the Taylor part, but for $O, o$ we have the following general properties: $$o(g) \subset O(g)$$ $$O(x^{n+1}) \subset o(x^{n}), \quad x \to 0.$$ Some authors prefer to write equality, whose exact meaning, though, is set inclusion. In your example the second property gives $O(h^4) \subset o(h^2)$ as $h \to 0$, so writing $O(h^4)$ in place of $o(h^2)$ is a sharper statement that is consistent with the first one.
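To see these inclusions numerically, here is a small check of my own (not from the papers), using a Gaussian kernel and a standard normal $f$: in that case $\mathbb{E}[\hat{f}(0)]$ is available in closed form as the $N(0, 1+h^2)$ density at zero, so the exact bias can be compared with its leading term.

```python
import numpy as np

# For a Gaussian kernel and standard-normal f, E[f_hat(x)] is the
# N(0, 1 + h^2) density, so the exact bias at x = 0 has a closed form.
# The expansion predicts
#     Bias = (1/2) f''(0) h^2 k_2(K) + O(h^4),
# with k_2(K) = 1 for the Gaussian kernel and f''(0) = -1/sqrt(2*pi).
f0 = 1 / np.sqrt(2 * np.pi)      # f(0)
f2_0 = -f0                       # f''(0)

for h in (0.4, 0.2, 0.1, 0.05):
    exact_bias = 1 / np.sqrt(2 * np.pi * (1 + h**2)) - f0
    leading = 0.5 * f2_0 * h**2              # (1/2) f''(0) h^2 k_2(K)
    remainder = exact_bias - leading         # the o(h^2) = O(h^4) term
    # remainder / h^2 -> 0 (little-o), while remainder / h^4 stays
    # bounded (big-O):
    print(f"h={h:4.2f}  rem/h^2={remainder / h**2:+.5f}  "
          f"rem/h^4={remainder / h**4:+.5f}")
```

The printed ratios show both facts at once: the remainder divided by $h^2$ shrinks toward zero, while divided by $h^4$ it settles near a constant, which is exactly $O(h^4) \subset o(h^2)$.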