Ways to determine how fast a sequence diverges or converges


I have recently read about divergent and convergent sequences, and I wonder what methods are normally used to show how fast a sequence grows. Can I simply look at the difference between two adjacent terms? And how can I do this for something like the harmonic series? Thank you!

Here is an example of a sequence defined by an iterated function: $$ x_{n+1}=\frac{1}{\sqrt{x_n^2+1}} $$ How can I determine how quickly this sequence converges or diverges?


You might want to Google "rate of convergence" and "asymptotic growth". Many books on numerical analysis also cover these ideas when deciding how useful certain algorithms are. Two basic definitions are:

Linear convergence $\iff \displaystyle \lim_{n \rightarrow \infty} \frac{|x_{n+1} - L |}{|x_n - L|} \in (0,1)$

Super-linear convergence $\iff \displaystyle \lim_{n \rightarrow \infty} \frac{|x_{n+1} - L |}{|x_n - L|} = 0$, and $k$-th order convergence $\iff \displaystyle 0 < \lim_{n \rightarrow \infty} \frac{|x_{n+1} - L |}{|x_n - L|^k} < \infty$
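To see these definitions in action on the iteration from the question, you can estimate the ratio $|x_{n+1}-L|/|x_n-L|$ numerically. A minimal Python sketch, assuming the fixed point $L$ can be approximated simply by iterating many times, and using the hypothetical starting value $x_0 = 0.5$:

```python
import math

def f(x):
    # The iteration from the question: x_{n+1} = 1 / sqrt(x_n^2 + 1)
    return 1.0 / math.sqrt(x * x + 1.0)

# Approximate the fixed point L by iterating until it stabilizes.
L = 0.5
for _ in range(200):
    L = f(L)

# Estimate the convergence ratio |x_{n+1} - L| / |x_n - L|.
x = 0.5
ratio = None
for n in range(10):
    ratio = abs(f(x) - L) / abs(x - L)
    x = f(x)

print(ratio)
```

The ratio settles at a constant strictly between $0$ and $1$ (for a smooth iteration $x_{n+1} = f(x_n)$ it tends to $|f'(L)|$, which here works out to $L^4 = (3-\sqrt{5})/2 \approx 0.382$), so by the first definition this sequence converges linearly.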

You are interested in the growth rate of the harmonic series, which does not converge (there is a very elementary proof of this), so there is no limit $L$ against which to measure a speed of convergence, as $\infty$ is not allowed here. Assuming that by "speed of divergence" you mean the rate of growth (how fast it blows up), there is the famous result that $$\sum_{k=1}^{n} \frac{1}{k} \approx \log n + \gamma, \qquad \gamma \approx 0.5772,$$ where $\gamma$ is the Euler–Mascheroni constant.
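As a quick numerical sanity check of this approximation (a sketch; the value of $\gamma$ below is truncated):

```python
import math

GAMMA = 0.5772156649  # Euler–Mascheroni constant, truncated

# Partial sum of the harmonic series up to n = 100000.
H = 0.0
for k in range(1, 100001):
    H += 1.0 / k

approx = math.log(100000) + GAMMA
print(H, approx)  # the two values agree closely
```

The error of the approximation is roughly $1/(2n)$, so for $n = 10^5$ the two numbers match to about five decimal places.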

And so the series grows as fast as the natural logarithm, since we don't really care about constants. This can be proved in several ways; one is a direct application of Euler's summation formula (page 480 of Knuth's excellent book), but it is not easy. A more basic way, less accurate but good enough for comparing growth rates, is the standard calculus argument comparing the sum to the integral $\int_1^n \frac{dx}{x} = \log n$.

Any function's growth rate can be expressed in "O notation". The simplest is big O, which states that $f(x) \in \mathcal{O}(g(x)) \iff \exists\, C > 0$ and $x_0$ such that $|f(x)| \leq C|g(x)|$ $\forall x > x_0$, which is equivalent to $\displaystyle \limsup_{x \rightarrow \infty} \frac{|f(x)|}{|g(x)|} < \infty$

For the harmonic series example, if we pretend our approximation is exact, then $|\log n + \gamma | < 2|\log n |$ $\forall n > e^{\gamma}$, and $\displaystyle \lim_{n \rightarrow \infty} \frac{\log n + \gamma}{\log n} = 1 < \infty$, so $\sum_{k=1}^{n} \frac{1}{k} \in \mathcal{O}(\log n)$. This quantifies your "rate of divergence" as at most logarithmic. In fact the two growth rates are equal: the symbol $\Theta$ is used when $f(x) \in \mathcal{O}(g(x))$ and $g(x) \in \mathcal{O}(f(x))$; try to show the other direction for the example above.
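The limit in the big-O condition can also be checked numerically. A small sketch comparing partial sums of the harmonic series against $\log n$:

```python
import math

def harmonic(n):
    # n-th partial sum of the harmonic series
    return sum(1.0 / k for k in range(1, n + 1))

for n in (10, 1000, 100000):
    print(n, harmonic(n) / math.log(n))  # ratio drifts toward 1
```

The ratio decreases toward $1$ as $n$ grows, consistent with $\sum_{k=1}^{n} \frac{1}{k} \in \Theta(\log n)$: the $\gamma$ term becomes negligible relative to $\log n$.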