Is the existence of a doubling time characteristic of exponential growth?


In some article about the coronavirus crisis I read that, since the number of known infections now grows linearly, the doubling time is no longer that important.
Clearly, speaking of a doubling time is nonsense in the case of linear growth because there is no such thing. However, this led me to the question of whether doubling at a fixed rate is in some sense characteristic of exponential growth.
Is there some condition that ensures that a function which doubles at a fixed rate is actually an exponential function? Put mathematically:
Suppose $f$ is a function from $\mathbb{R}$ to $\mathbb{R}$, and suppose $t > 0$ is such that for every $x \in \mathbb{R}$ we have $$f(x + t) = 2 \cdot f(x).$$ Under which additional condition is it true that there are $A, \alpha$ such that $f(x) = A \cdot e^{\alpha x}$?
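For orientation, in the genuinely exponential case the doubling time is determined by the growth rate: $f(x) = A e^{\alpha x}$ doubles after exactly $t = \log 2 / \alpha$, independently of $x$. A small numerical sketch (the concrete values of $A$ and $\alpha$ are my own illustration):

```python
import math

# Illustration: f(x) = A * exp(alpha * x) doubles after exactly
# t = ln(2) / alpha, regardless of the starting point x.
A, alpha = 3.0, 0.25
t = math.log(2) / alpha          # the doubling time

def f(x):
    return A * math.exp(alpha * x)

# f(x + t) == 2 * f(x) holds for any x
for x in [-5.0, 0.0, 1.7, 42.0]:
    assert abs(f(x + t) - 2 * f(x)) < 1e-9 * f(x)
```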

My thoughts:
Continuity or differentiability is not sufficient: e.g. any function $f$ which is continuous on $\left[0, t\right]$ and satisfies $f(t) = 2 \cdot f(0)$ extends to a continuous function satisfying $f(x + t) = 2 \cdot f(x)$ on all of $\mathbb{R}$, by simply putting $f(b) = 2^n f(a)$ whenever $a \in \left[0, t\right]$ and $b = a + n \cdot t$.
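This counterexample can be checked numerically. A minimal sketch (the particular perturbation $p$ is my own choice, not from the question): take $f(x) = 2^{x/t}\, p(x)$ with $p$ positive and $t$-periodic. Then $f$ is smooth and doubles every $t$, but $\log f$ is not affine, so $f$ is not exponential.

```python
import math

# A doubling-but-not-exponential function: f(x) = 2**(x/t) * p(x),
# where p is a positive, t-periodic perturbation (illustrative choice).
t = 1.0

def f(x):
    p = 1 + 0.5 * math.sin(2 * math.pi * x / t)   # periodic, positive
    return 2 ** (x / t) * p

# The doubling property holds everywhere ...
for x in [-2.3, 0.0, 0.4, 7.1]:
    assert abs(f(x + t) - 2 * f(x)) < 1e-9 * abs(f(x))

# ... yet f is not exponential: the slopes of log f are not constant.
xs = [0.0, 0.25, 0.5]
slopes = [(math.log(f(xs[i + 1])) - math.log(f(xs[i]))) / (xs[i + 1] - xs[i])
          for i in range(2)]
assert abs(slopes[0] - slopes[1]) > 0.1
```

Note that this $f$ is even infinitely differentiable, so no amount of smoothness alone can rule it out.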
Maybe something like convexity or logarithmic convexity is sufficient, but I don't see how to prove it.
Thanks

I finally found an (at least partial) answer to my question. It turns out that it is indeed sufficient that $f$ is logarithmically convex. I'm going to provide a proof in case someone is interested.

So the statement is as follows:
If the function $f : \mathbb{R} \to \mathbb{R}$ has the following properties:

  1. there is some $T > 0$ such that $f(t+T) = 2 \cdot f(t)$ for every $t \in \mathbb{R}$,
  2. $f$ is logarithmically convex,

then $$ f(t) = A \cdot e^{\alpha t}$$ for some $A > 0$ and $\alpha \in \mathbb{R}$.

To prove this, one can mimic the proof of the Bohr–Mollerup Theorem given in Conway's "Functions of One Complex Variable".
Proof:
First note that the functional equation $f(t + T) = 2 \cdot f(t)$ implies that $f(t + nT) = 2^n \cdot f(t)$ for every $t \in \mathbb{R}$ and every $n \in \mathbb{N}$. In particular, $f$ is uniquely determined by its values on the "fundamental interval" $\left[0, T\right)$.
Now if $t \in \left(0, T\right)$ and $n \in \mathbb{N}$, then the convexity of $\log \circ f$ implies $$ \frac{\log(f(nT)) - \log(f((n-1)T))}{nT - (n-1)T} \leq \frac{\log(f(t + nT)) - \log(f(nT))}{t + nT - nT} \leq \frac{\log(f((n+1)T)) - \log(f(nT))}{(n+1)T - nT}. $$

The left and the right side of the above inequalities actually do not depend on $n$. Indeed, we have $$ \frac{\log(f(nT)) - \log(f((n-1)T))}{nT - (n-1)T} = \frac{\log(f(0) \cdot 2^n) - \log(f(0) \cdot 2^{n-1})}{T} = \frac{1}{T} \left[ \log(f(0)) + n \log 2 - \log(f(0)) - (n-1) \log 2 \right] = \frac{\log 2}{T} $$ and likewise $$ \frac{\log(f((n+1)T)) - \log(f(nT))}{(n+1)T - nT} = \frac{\log 2}{T}. $$

It follows that $$ \frac{\log(f(t + nT)) - \log(f(nT))}{t + nT - nT} = \frac{\log 2}{T}. $$ Since $\log(f(t + nT)) = n \log 2 + \log(f(t))$ and $\log(f(nT)) = n \log 2 + \log(f(0))$, this simplifies to $$\log(f(t)) = \frac{\log 2}{T}\, t + \log(f(0)),$$ which also holds trivially for $t = 0$. Now exponentiation yields $$f(t) = A \cdot e^{\alpha t}$$ with $A = f(0)$ and $\alpha = \frac{\log 2}{T}$, first on $\left[0, T\right)$ and hence, by the functional equation, on all of $\mathbb{R}$.
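As a numerical sanity check of the statement (not part of the proof; the concrete values of $A$ and $T$ and the perturbed comparison function $g$ are my own illustration): the exponential the theorem produces satisfies both hypotheses, with $\log f$ exactly affine, while a perturbed doubling function fails log-convexity, consistent with the theorem.

```python
import math

# Sanity check of the theorem with illustrative values.
T = 2.0
A = 0.7
alpha = math.log(2) / T

def f(x):                        # the exponential the theorem produces
    return A * math.exp(alpha * x)

def g(x):                        # doubles every T, but is NOT log-convex
    return f(x) * (1 + 0.3 * math.sin(2 * math.pi * x / T))

h = 0.1
def second_diff_log(func, x):    # discrete convexity test for log(func)
    return math.log(func(x - h)) - 2 * math.log(func(x)) + math.log(func(x + h))

# Both satisfy the doubling functional equation.
for x in [-1.0, 0.3, 5.0]:
    assert abs(f(x + T) - 2 * f(x)) < 1e-9
    assert abs(g(x + T) - 2 * g(x)) < 1e-9

# log f is affine (second differences vanish); log g is not convex.
assert all(abs(second_diff_log(f, x)) < 1e-12 for x in [0.0, 1.0, 3.3])
assert any(second_diff_log(g, x) < -1e-4 for x in [0.0, 0.5, 1.0, 1.5])
```

The check of `g` illustrates why log-convexity is the right extra hypothesis here: the periodic perturbation that defeats continuity and smoothness necessarily makes $\log g$ bend downward somewhere.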