O-notation: composing functions


Big-oh and little-oh notation make asymptotic arguments much simpler, and there are convenient rules for combining functions, for example the ones mentioned here.

One rule conspicuously missing from the above list is a rule for function composition. If $f = O(\mu(x))$ at $a$, and $g=O(\nu(y))$ at $f(a)$, then what do we know about $g \circ f$ at $a$?

Some things which lead me to believe such a rule might exist, under certain conditions:

  • Suppose $f$ and $g$ are $n$ times differentiable functions. It is a theorem in calculus that if $p$ is the Taylor polynomial of degree $n$ for $f$ at $a$, and $q$ is the Taylor polynomial of degree $n$ for $g$ at $f(a)$, then $q \circ p$, truncated at degree $n$, is the Taylor polynomial for $g \circ f$ at $a$. In other words, if $$f= p + o((x-a)^n)\quad \text{and} \quad g = q + o((y-f(a))^n),\\ \text{then}\quad g \circ f = q \circ p + o((x-a)^n).$$
  • In the simple case of polynomials, there is a very convenient composition rule: Suppose $p(x)-p(a)=O((x-a)^n)$ at $a$, and $q(y) = O((y-p(a))^m)$ at $p(a)$. Then $q \circ p$ is $O((x-a)^{mn})$ at $a$.
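The polynomial rule above is easy to sanity-check numerically. The following is a sketch of my own, with the assumed example data $p(x)=x^2$ (so $n=2$), $q(y)=y^3$ (so $m=3$), and $a=0$; the rule predicts $q\circ p = O(x^{mn}) = O(x^6)$ at $0$, so the ratio $|q(p(x))|/|x|^6$ should stay bounded as $x\to 0$:

```python
# Sanity check (my own sketch) of the polynomial composition rule, with the
# assumed examples p(x) = x^2 (n = 2), q(y) = y^3 (m = 3), and a = 0.
# The rule predicts q(p(x)) = O((x - a)^(mn)) = O(x^6) as x -> 0.

def p(x):
    return x ** 2

def q(y):
    return y ** 3

def sup_ratio(h, comparison, xs):
    """Largest value of |h(x)| / |comparison(x)| over the sample points xs."""
    return max(abs(h(x)) / abs(comparison(x)) for x in xs)

xs = [10.0 ** (-k) for k in range(1, 8)]  # sample points approaching a = 0

# The ratio against the predicted bound x^6 stays bounded (here it equals 1,
# since q(p(x)) = x^6 exactly for this choice of p and q).
ratio = sup_ratio(lambda x: q(p(x)), lambda x: x ** 6, xs)
print(ratio)
```

Of course, boundedness at finitely many sample points proves nothing; this only illustrates the claim for one concrete pair of polynomials.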

2 Answers

Accepted Answer

By the assumptions made,

$$\limsup_{x\to a} \frac{|f(x)|}{|\mu(x)|}<\infty\quad \text{ and }\quad \limsup_{x\to a} \frac{|g(f(x))|}{|\nu(f(x))|}<\infty \tag1$$

If the function $\nu$ is doubling, meaning that there is a constant $C$ such that
$$ |s|\le 2|t|\implies |\nu(s)|\le C|\nu(t)| \tag2$$
then we can conclude from the first part of (1) that
$$\limsup_{x\to a} \frac{|\nu(f(x))|}{|\nu(\mu(x))|}<\infty,$$
and combining this with the second part of (1),
$$\limsup_{x\to a} \frac{|g(f(x))|}{|\nu(\mu(x))|}<\infty \tag3$$
which is what one might have expected: $g\circ f=O(\nu\circ \mu)$.
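The doubling step can be illustrated numerically. The following is my own sketch with assumed example data: $\nu(t)=t^2$ (which is doubling, with $C=4$), $\mu(x)=x$, and $f(x)=1.5x$, so that $f=O(\mu)$ at $0$ with implicit constant $1.5$. The argument predicts that $\nu(f(x))/\nu(\mu(x))$ stays bounded as $x\to 0$:

```python
# Sketch of the doubling step with assumed examples nu(t) = t^2 (doubling,
# with C = 4 in condition (2)), mu(x) = x, and f(x) = 1.5 * x, so that
# f = O(mu) at 0 with implicit constant 1.5. The answer's argument predicts
# that nu(f(x)) / nu(mu(x)) stays bounded as x -> 0.

def nu(t):
    return t * t

def mu(x):
    return x

def f(x):
    return 1.5 * x

xs = [10.0 ** (-k) for k in range(1, 10)]
ratios = [nu(f(x)) / nu(mu(x)) for x in xs]
print(max(ratios))  # bounded: equal to 1.5^2 = 2.25 for this choice
```

The point of the doubling condition is exactly that the implicit constant $1.5$ can be absorbed into the bound on $\nu$; the next section shows what goes wrong without it.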

Non-doubling case

Without (2) we don't get an explicit asymptotic like (3), because the constant implicit in the first part of (1) actually matters. For example, if $f(x)=O(x)$ as $x\to 0$ and $g(x)=O(\exp(-1/x^2))$ as $x\to 0$, the asymptotic behavior of $g\circ f$ depends quite a bit on the implicit constant in $f(x)=O(x)$. We can state that $$g(f(x)) = O(\exp(-C/x^2)) \quad \text{for some } \ C>0$$ which is less precise than having $O(\cdot )$ with a concrete comparison function in it.
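The dependence on the implicit constant is easy to see numerically. In this sketch (assumed example data, not from the answer itself), $g(y)=\exp(-1/y^2)$, and $f_1(x)=x$ and $f_2(x)=2x$ are both $O(x)$ at $0$, yet $g(f_2(x))/g(f_1(x)) = \exp(3/(4x^2))$ is unbounded, so no single comparison function $\exp(-C/x^2)$ serves for both:

```python
import math

# Sketch of the non-doubling phenomenon: g(y) = exp(-1/y^2), and two
# functions that are both O(x) at 0, f1(x) = x and f2(x) = 2x (assumed
# example data). The ratio g(f2(x)) / g(f1(x)) = exp(3 / (4 x^2)) blows up
# as x -> 0, so the implicit constant in f = O(x) changes the asymptotics
# of g o f.

def g(y):
    return math.exp(-1.0 / y ** 2)

for x in [0.5, 0.3, 0.2, 0.1]:
    print(x, g(2 * x) / g(x))  # grows without bound as x shrinks
```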

Remark

The constant $2$ in the doubling condition (2) could be replaced by any constant greater than $1$ without changing the meaning of the condition. The doubling condition holds for all polynomials (as in your example), but fails for functions growing or decaying at a super-polynomial rate.

Second Answer

The argument about the Taylor polynomial is sound...but not every function's Taylor series converges to the function, so it may have no impact on the functions themselves. For instance, for $$ f(x) = \begin{cases} e^{-\frac{1}{x^2}} & x \ne 0 \\ 0 & x = 0\end{cases} $$ the Taylor series at $0$ converges everywhere (it's the zero series!), but converges to $f$ only at $x = 0$. So knowing the Taylor polynomials of $f$ doesn't tell you much about $f$ itself.
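The flatness of the standard example $f(x)=e^{-1/x^2}$ (with $f(0)=0$) is easy to see numerically; this is my own sketch, not part of the answer. Since $f(x)/x^n \to 0$ as $x\to 0$ for every $n$, every Taylor coefficient of $f$ at $0$ vanishes, which is why its Taylor series is identically zero:

```python
import math

# Numeric illustration (my sketch) of why the flat function
# f(x) = exp(-1/x^2), f(0) = 0, has the zero Taylor series at 0:
# f(x) / x^n -> 0 as x -> 0 for every n, so every derivative of f at 0 is 0.

def f(x):
    return math.exp(-1.0 / x ** 2) if x != 0 else 0.0

x = 0.05
for n in [1, 5, 10]:
    print(n, f(x) / x ** n)  # vanishingly small for every n
```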

For nice enough functions, with series representations that converge everywhere, there might be something reasonable to say...but if $f$ and $g$ have everywhere convergent series, is it obvious that $f\circ g$ does as well? (It's not obvious to me!)