For which logarithm bases is this sum finite?


Let us have a recursive sequence $a_0 = 1$, $a_{n+1}=\log_{b} (1+a_n)$

and the corresponding sum $\sum_{n=0}^\infty a_n$. For which logarithm bases $b>2$ (possibly for $b>e$) is the sum finite, if any?

I did some calculations comparing this sequence to the harmonic sequence for $b=e$, and my result was that for the natural logarithm the sum is infinite. I used a trick: I expressed the harmonic sequence recursively and compared which transformation lowers the terms more rapidly. For the natural logarithm the harmonic sequence wins, so the individual terms of this sequence should be somewhat larger, and therefore the sum should exceed the (divergent) harmonic sum, making both infinite. For larger bases, however, the harmonic sequence loses, so I can't tell whether the sum is infinite or not, and no other tricks come to mind. Can anybody help?
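As a quick numerical sanity check of this intuition (a sketch, not part of the argument; the base $b=4$ is an arbitrary example of a base above $e$):

```python
import math

def partial_sum(b, n_terms):
    """Partial sum a_0 + ... + a_{n_terms-1} of the sequence
    a_0 = 1, a_{k+1} = log_b(1 + a_k)."""
    a, total = 1.0, 0.0
    for _ in range(n_terms):
        total += a
        a = math.log(1.0 + a) / math.log(b)  # log base b via natural log
    return total

# For b = e the partial sums keep climbing (very slowly);
# for b = 4 > e they level off after a few dozen terms.
print(partial_sum(math.e, 10**5))
print(partial_sum(4.0, 200))
```

Running this shows the base-$e$ partial sums still growing at $10^5$ terms while the base-$4$ sums stabilize, consistent with the answers below.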


BEST ANSWER

For base $2$, the sequence is constant: $a_1 = \log_2(1+1) = 1$, and inductively $a_n = 1$ for all $n$, so you have a sum of $1$'s, which diverges. For smaller bases $1<b<2$, use that at every recursion step (with $x = 1+a_n > 1$)

$$ \log_b(x) = \log_2(x) \cdot \log_b(2) > \log_2(x), $$

since $\log_b(2) > 1$, so the terms in your sum get even larger than in the base-$2$ case.

So for $1 < b \leq 2$, the sum diverges.

If it is known (as stated by the author) that the series diverges for $b=e$, then by the same argument it will also diverge for $b<e$.

It remains to prove that for $b> e$ the sum converges. We will use the well-known inequality (for base $e$) $\log(1+x) \leq x$ for $x > 0$. We have

$$ a_{n+1} = \log_b(1 + a_n) = \log_e(1 + a_n) \cdot \log_b(e) \leq a_n \cdot \log_b(e) $$

so by induction $a_n \leq (\log_b e)^n \, a_0 = (\log_b e)^n$, and hence

$$ \sum_{n=0}^\infty a_n \leq \sum_{n=0}^\infty (\log_b e )^n, $$

which converges (geometric series) exactly when $\log_b e <1$, i.e. for $b > e$.

Hence, in total, the series converges for $b > e$.
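The term-by-term bound $a_n \leq (\log_b e)^n$ and the resulting geometric bound can be verified numerically (a sketch; $b=4$ is an arbitrary base above $e$):

```python
import math

b = 4.0                        # any base > e
r = 1.0 / math.log(b)          # r = log_b(e) < 1 because b > e
a, total = 1.0, 0.0
for n in range(200):
    # each term is dominated by the geometric term, as derived above
    assert a <= r**n + 1e-12
    total += a
    a = math.log(1.0 + a) / math.log(b)

# the full sum stays below the geometric-series bound 1/(1 - log_b e)
print(total, 1.0 / (1.0 - r))
```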


This is a simple proof that $\sum a_n$ diverges for $b=e.$ For $0<x\leq 1$ we have $0<x-x^2/2< \ln (1+x)< x\leq 1$, because $\ln(1+x)=x-x^2/2+x^3/3-\cdots$ is an alternating series with strictly decreasing terms.

So $1\geq a_n>a_{n+1}>a_n-a_n^2/2=a_n(1-a_n/2).$ We have $$a_{n+m}>a_n(1-a_n/2)^m$$ for all $m>0$ and all $n$ by induction: if $a_{n+m}>a_n(1-a_n/2)^m$, then since $a_{n+m}<a_n$ implies $1-a_{n+m}/2>1-a_n/2$, $$a_{n+m+1}>a_{n+m}(1-a_{n+m}/2)>a_n(1-a_n/2)^m(1-a_{n+m}/2)>$$ $$>a_n(1-a_n/2)^m(1-a_n/2)=a_n(1-a_n/2)^{m+1}.$$ Therefore, since $a_n>0,$ we have $$\sum_{m=0}^{\infty}a_{n+m}>\sum_{m=0}^{\infty}a_n(1-a_n/2)^m=a_n\cdot\frac{1}{a_n/2}=2.$$ Since this holds for every $n,$ the Cauchy criterion is not met.
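The conclusion that every tail sum exceeds $2$ can be illustrated numerically (a sketch; the starting indices and truncation length are arbitrary choices):

```python
import math

# Generate a_0, ..., a_{N-1} for b = e.
N = 10**5
a = [1.0]
for _ in range(N - 1):
    a.append(math.log(1.0 + a[-1]))

# Even a *truncated* tail sum from each starting index n already
# exceeds 2, matching the bound sum_{m>=0} a_{n+m} > 2 above,
# so the Cauchy criterion indeed fails.
for n in (0, 10, 100, 1000):
    print(n, sum(a[n:]))
```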