Sum of regularly varying functions


I need help to prove that the sum of a regularly varying function $f$ of index $p$ and a regularly varying function $g$ of index $q$ is regularly varying of index $\max(p,q)$. A function $f$ is called regularly varying of index $p$ if $\lim\limits_{x\to\infty}\dfrac{f(tx)}{f(x)}=t^p$ for all $t>0$. In the literature this is said to be easy, but I get nowhere. There is a proof for the slowly varying case: Sum of slowly varying functions


The proof is straightforward if $p=q$; if $p \ne q$, we may assume by symmetry that $p>q$, and the argument hinges on proving the claim that $$\lim_{x \to \infty} \frac{g(x)}{f(x)} = 0.\tag{$\star$}$$ I'll save this proof for later, since it's the part where we actually need to get our hands dirty with limit calculations; it is also a pretty important basic fact about these functions.

Once we have that, we also know that $\frac{f(x) + g(x)}{f(x)} \to 1$, so $\frac{f(x)}{f(x)+g(x)} \to 1$, so $\frac{g(x)}{f(x) + g(x)} \to 0$. This lets us split up the limits in the calculation below, now that we know they all exist: \begin{align} \lim_{x \to \infty} \frac{f(tx)+g(tx)}{f(x)+g(x)} &= \lim_{x \to \infty} \left(\frac{f(x)}{f(x)+g(x)} \cdot \frac{f(tx)}{f(x)} + \frac{g(x)}{f(x)+g(x)} \cdot \frac{g(tx)}{g(x)}\right) \\ &= \lim_{x \to \infty} \frac{f(x)}{f(x)+g(x)} \lim_{x \to \infty} \frac{f(tx)}{f(x)} + \lim_{x \to \infty} \frac{g(x)}{f(x)+g(x)} \lim_{x\to\infty} \frac{g(tx)}{g(x)} \\ &= 1 \cdot t^p + 0 \cdot t^q = t^p. \end{align}
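As a numerical sanity check of this limit (a sketch with the concrete, assumed choices $f(x)=x^2$ and $g(x)=x\log x$, which are regularly varying of indices $p=2$ and $q=1$), the ratio $\frac{f(tx)+g(tx)}{f(x)+g(x)}$ should approach $t^p$:

```python
import math

# Assumed example functions: f has index p = 2,
# g has index q = 1 (times a slowly varying factor log x).
f = lambda x: x**2
g = lambda x: x * math.log(x)

t, p = 3.0, 2
for x in (1e3, 1e6, 1e9):
    ratio = (f(t * x) + g(t * x)) / (f(x) + g(x))
    # The ratio approaches t**p = 9 as x grows,
    # matching the limit computed above.
    print(f"x = {x:.0e}: ratio = {ratio:.6f}")
```

The dominant term $f$ drags the sum to its own index, exactly as the weighted-average decomposition above predicts.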


To prove $(\star)$, observe that $\frac{g(x)}{f(x)}$ is a regularly varying function of index $q-p$, so we need to prove an even more basic property: that if $h(x)$ is regularly varying of index $r < 0$, then $h(x) \to 0$ as $x \to \infty$.

We have $\lim_{x \to \infty} \frac{h(2x)}{h(x)} = 2^r$, where $2^r < 1$ since $r<0$; choose some $c \in (2^r, 1)$. There must be some $x_0$ such that, for all $x \ge x_0$, $\frac{h(2x)}{h(x)} < c$, so $h(2x) < c h(x)$.

We can iterate this to get $h(2^k x) < c^k h(x)$ for $x \ge x_0$, or to put it another way, $h(x) < c^k h(x/2^k)$ for $x \ge 2^k x_0$. We can always do $\lfloor \log_2 \frac{x}{x_0} \rfloor$ iterations to get $x_0 \le x/2^k \le 2x_0$. Let $H = \max_{x_0 \le x \le 2x_0} h(x)$; then $$h(x) < H c^{\lfloor \log_2 \frac{x}{x_0} \rfloor} \text{ for } x \ge x_0$$ and since the right-hand side goes to $0$ as $x \to \infty$, we also get $\lim_{x \to \infty} h(x) = 0$ by the squeeze theorem, as desired.
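The geometric-decay bound can also be checked numerically (a sketch with the assumed choice $h(x) = 1/x$, regularly varying of index $r = -1$, where $\frac{h(2x)}{h(x)} = \frac{1}{2}$ exactly, so any $c \in (\frac{1}{2}, 1)$ and $x_0 = 1$ work):

```python
import math

# Assumed example: h(x) = 1/x has index r = -1, so 2**r = 0.5.
h = lambda x: 1.0 / x
c = 0.6            # any c in (2**r, 1) = (0.5, 1)
x0 = 1.0           # h(2x)/h(x) = 0.5 < c holds for all x >= x0
H = max(h(x0), h(2 * x0))   # h is decreasing, so its max on [x0, 2*x0] is h(x0)

for x in (4.0, 64.0, 4096.0):
    k = math.floor(math.log2(x / x0))
    bound = H * c**k
    # h(x) stays below the geometrically decaying bound H * c**k,
    # so h(x) -> 0 by the squeeze theorem.
    print(f"x = {x:7.0f}: h(x) = {h(x):.6f} < bound = {bound:.6f}")
```

Since $c^{\lfloor \log_2(x/x_0) \rfloor} \to 0$, the bound forces $h(x) \to 0$, which is the content of the squeeze-theorem step above.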