Big-O division in $\frac{f(x)}{g(x)}$ of Taylor polynomials


Here, at the bottom of p. 4, there is a proof of a property of $\frac{f(x)}{g(x)}$, the quotient of the Taylor expansions of $f$ and $g$.

Given the two Taylor expansions $f(x) = \frac{f^{(k)}(x_0)x^k}{k!}+o(x^k)$ and $g(x) = \frac{g^{(k)}(x_0)x^k}{k!}+o(x^k)$, the claim is that:

$$\frac{f(x)}{g(x)} = \frac{f^{(k)}(x_0)+o(1)}{g^{(k)}(x_0)+o(1)}$$

What I'm trying to understand is how the $o(x^k)$ terms work out to $o(1)$ after cancellation.

1 Answer

By definition, if $F(x) = o(x^k)$ as $x \to 0$, then
$$ \frac{F(x)}{x^k} \to 0 \quad \text{ as } x \to 0. $$
But this is the same as saying
$$ \frac{x^{-k}F(x)}{1} \to 0 \quad \text{ as } x \to 0, $$
i.e. $ x^{-k}F(x) = o(1) $ as $x \to 0$. Therefore, as sets,
$$ x^{-k}o(x^k) = o(1). $$
This means that dividing the numerator and denominator of the fraction by $x^k/k!$ gives the expression on the right: the common factor $k!$ cancels, and multiplying an $o(1)$ term by the constant $k!$ still gives $o(1)$.
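Spelling out the division step explicitly (taking $x_0 = 0$, to match the limits $x \to 0$ used above):

$$
\frac{f(x)}{g(x)}
= \frac{\frac{f^{(k)}(0)}{k!}x^k + o(x^k)}{\frac{g^{(k)}(0)}{k!}x^k + o(x^k)}
= \frac{\frac{f^{(k)}(0)}{k!} + x^{-k}o(x^k)}{\frac{g^{(k)}(0)}{k!} + x^{-k}o(x^k)}
= \frac{f^{(k)}(0) + k!\,o(1)}{g^{(k)}(0) + k!\,o(1)}
= \frac{f^{(k)}(0) + o(1)}{g^{(k)}(0) + o(1)},
$$

where in the last step the constant multiple $k!$ is absorbed into $o(1)$.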
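As a quick numerical sanity check (a minimal sketch with hypothetical choices $f(x) = 1-\cos x$, $g(x) = \log(1+x^2)$, $x_0 = 0$, $k = 2$, so that $f(0) = f'(0) = g(0) = g'(0) = 0$, $f''(0) = 1$, and $g''(0) = 2$), the quotient should approach $f''(0)/g''(0) = 1/2$ as $x \to 0$:

```python
import math

# Hypothetical example functions, chosen so the first nonvanishing
# derivative at 0 is of order k = 2:
#   f(x) = 1 - cos(x)     -> f''(0) = 1
#   g(x) = log(1 + x^2)   -> g''(0) = 2
f = lambda x: 1 - math.cos(x)
g = lambda x: math.log(1 + x * x)

# As x -> 0 the quotient should tend to f''(0)/g''(0) = 1/2,
# since the o(1) terms in numerator and denominator vanish.
for x in (1e-1, 1e-2, 1e-3):
    print(x, f(x) / g(x))
```

The printed ratios settle near $0.5$ as $x$ shrinks, matching the limit predicted by the $o(1)$ form of the quotient.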