Limit of $a_n$ as $n \to \infty$, where $a_1=\sqrt{k}$, $a_n = \sqrt{k}^{a_{n-1}}$, and $0<k<1$.


I found a question on Quora about the limit of a sequence.

Generalized Case 1

When you generalize this question as \begin{align} a_1 &= \sqrt{k} \\ a_n &= \sqrt{k}^{a_{n-1}} \end{align} with $k =2$, then: $$\lim_{n \to \infty}a_n =2$$

But what if you change $k$:

  1. Well, when $k =3$, the sequence goes to $\infty$ (am I wrong about this?). So I suspect there is a number $\alpha$ such that the limit of $a_n$ is finite when $2<k<\alpha$. Is this assumption right? If so, how can I find this number; if not, why not?
  2. And what if $0<k<1$: what is the limit of $a_n$, and does it exist? I did some research on it. The sequence is bounded, but not monotone. I plotted the sequence for $k = \frac{1}{2}$ in Mathematica, and it seems to converge.
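For what it's worth, the $k=\frac{1}{2}$ experiment is easy to reproduce outside Mathematica. Here is a minimal Python sketch (the iteration count of 500 is an arbitrary choice); if the sequence converges to some $L$, then $L$ must satisfy $\sqrt{k}^{\,L}=L$, which the last line checks:

```python
# Iterate a_1 = sqrt(k), a_n = sqrt(k)**a_{n-1} for k = 1/2.
k = 0.5
s = k ** 0.5            # sqrt(k)
a = s                   # a_1
for _ in range(500):
    a = s ** a          # a_n = sqrt(k)**a_{n-1}

L = a
print(L)                # ~0.7666
print(abs(s ** L - L))  # fixed-point residual, ~0
```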

Generalized Case 2

If you treat the original Quora question as a special case of: \begin{align} a_1 &= k^{\frac{1}{k}} \\ a_n &= \left(k^{\frac{1}{k}} \right)^{a_{n-1}} \end{align} where $k=2$, then $\lim_{n \to \infty}a_n =2$ is also true.

Now, consider $k>1$. How do I calculate $$\lim_{n\to \infty}a_n\,?$$
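Not an answer, but a numeric sketch of this case in Python ($k=2,3,10$ are assumed sample values). Since $k^{1/k}\le e^{1/e}$ for every $k>0$ (the function $x^{1/x}$ peaks at $x=e$), the iteration appears to converge for every $k>1$; the limit $L$ satisfies $\left(k^{1/k}\right)^L=L$, and numerically $L=k$ only when $k\le e$:

```python
def tower_limit(k, n=5000):
    """Iterate a_1 = k**(1/k), a_j = (k**(1/k))**a_{j-1} and return a_n."""
    c = k ** (1.0 / k)
    a = c
    for _ in range(n):
        a = c ** a
    return a

for k in (2.0, 3.0, 10.0):
    L = tower_limit(k)
    c = k ** (1.0 / k)
    # print the limit and the residual of the fixed-point equation c**L = L
    print(k, L, abs(c ** L - L))
```

For $k=2$ this returns $2$, as in the original question; for $k=3$ the limit is the smaller root of $\left(3^{1/3}\right)^x=x$, which is strictly less than $3$.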

Thanks in advance.


We show that $a_{2n}$ is a decreasing sequence and $a_{2n-1}$ an increasing sequence. In fact, we show that:

$$0 < a_1<a_3< \dots < a_{2n-1} < \dots < a_{2n} < \dots < a_4 < a_2 < 1$$

Base case: Since $0<k<1$, we have that $0<k<\sqrt{k}<1$. We have that $a_2={\left(\sqrt{k}\right)}^{a_1}$, and since $0<a_1=\sqrt{k}<1$, it follows that $0<a_1<a_2<1$.

Strictly speaking, the induction step requires base cases up to $a_4$, but I hope it becomes clear a posteriori how the reasoning of the inductive step itself applies to those cases as well.

Induction step: We now divide into two cases.

First, suppose that the induction claim holds up until $a_{2n-1}$. We show that $a_{2n}$ satisfies the induction claim. Indeed, we have that $a_{2n}={\left(\sqrt{k}\right)}^{a_{2n-1}}$ and $a_{2n-1}={\left(\sqrt{k}\right)}^{a_{2n-2}}$. Since $a_{2n-1}<a_{2n-2}$ and $0<\sqrt{k}<1$, it follows that $a_{2n}>a_{2n-1}$.

Indeed, we have that \begin{aligned} a_{2n}>a_{2n-1}&\Longleftrightarrow{\left(\sqrt{k}\right)}^{a_{2n-1}}>{\left(\sqrt{k}\right)}^{a_{2n-2}}\\ &\Longleftrightarrow a_{2n-1}\cdot\ln\left(\sqrt{k}\right)>a_{2n-2}\cdot\ln\left(\sqrt{k}\right)\\ &\Longleftrightarrow a_{2n-1}<a_{2n-2} \end{aligned} where the change in inequality direction is justified by $\ln\left(\sqrt{k}\right)<0$, which holds because $0<\sqrt{k}<1$, and the last inequality is the induction hypothesis.

Similarly, since $a_{2n}={\left(\sqrt{k}\right)}^{a_{2n-1}}$ and $a_{2n-2}={\left(\sqrt{k}\right)}^{a_{2n-3}}$, it follows from $a_{2n-3}<a_{2n-1}$ and $0<\sqrt{k}<1$ that $a_{2n}<a_{2n-2}$, which completes this case.

The case when the induction holds up until $a_{2n}$ can be treated in much the same manner and will be omitted. We thus consider the claim proved.


With this in mind, it follows that both $a_{2n}$ and $a_{2n-1}$ are bounded strictly monotone sequences, and hence convergent.
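The interleaving just proved can also be sanity-checked numerically; here is a Python sketch, taking $k=1/250$ as an assumed sample value:

```python
# Check 0 < a_1 < a_3 < ... < a_{2n-1} < ... < a_{2n} < ... < a_4 < a_2 < 1
# on the first 200 terms, for k = 1/250.
k = 1.0 / 250.0
s = k ** 0.5                 # sqrt(k)
seq = [s]                    # a_1
for _ in range(199):
    seq.append(s ** seq[-1])

odds = seq[0::2]             # a_1, a_3, a_5, ...
evens = seq[1::2]            # a_2, a_4, a_6, ...
assert all(x < y for x, y in zip(odds, odds[1:]))    # odd terms increase
assert all(x > y for x, y in zip(evens, evens[1:]))  # even terms decrease
assert max(odds) < min(evens)                        # odds stay below evens
assert 0 < seq[0] and max(seq) < 1
print("interleaving holds for the first", len(seq), "terms")
```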

Now, let $a_{2n-1}\to L$ and $a_{2n} \to L'$. I spent quite some time trying to prove that $L=L'$, and some initial tests seemed to confirm it, but after trying some more I no longer suspect this is true. Indeed, we have that

\begin{aligned} &a_{2n} = {\left(\sqrt{k}\right)}^{{\left(\sqrt{k}\right)}^{a_{2n-2}}}\\ &a_{2n-1} = {\left(\sqrt{k}\right)}^{{\left(\sqrt{k}\right)}^{a_{2n-3}}} \end{aligned}

so as $n \to \infty$ we find that

\begin{aligned} &L = {\left(\sqrt{k}\right)}^{{\left(\sqrt{k}\right)}^{L}}\\ &L' = {\left(\sqrt{k}\right)}^{{\left(\sqrt{k}\right)}^{L'}} \end{aligned}

Letting $f(x)=a^{a^x}-x$ with $0<a=\sqrt{k}<1$, we have that $L,L'$ are roots of $f$ in $(0,1)$. Taking, say, $k=\frac{1}{250}$ or smaller, we can see that $f$ has multiple roots (I think three) in the interval, and running the sequence for these values it does seem that each subsequence is converging to a different one of these roots.

In fact, it appears that $L$ is the root closest to $0$ and $L'$ is the root closest to $1$; it seems no subsequence converges to the root in the middle.

Moreover, because $a_{2n} = {\left(\sqrt{k}\right)}^{a_{2n-1}}$, as $n\to \infty$ we find that $L'={\left(\sqrt{k}\right)}^{L}$, and similarly $L={\left(\sqrt{k}\right)}^{L'}$. It follows that $\ln\left(L'\right)=L\cdot \ln\left(\sqrt{k}\right)$ and $\ln\left(L\right)=L'\cdot \ln\left(\sqrt{k}\right)$, so that

$$\frac{\ln\left(L'\right)}{L}=\frac{\ln\left(L\right)}{L'},$$

or $L'\cdot\ln\left(L'\right) = L\cdot\ln\left(L\right)$, or yet ${L'}^{L'}=L^L$.
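These observations can be supported numerically. A Python sketch for $k=\frac{1}{250}$ (the bisection helper for $x_0$ is an assumed implementation detail): it estimates $L$ and $L'$ by iterating, locates $x_0$, and checks both $L<x_0<L'$ and ${L'}^{L'}=L^L$:

```python
import math

k = 1.0 / 250.0
a = math.sqrt(k)

# Run the recursion long enough for both subsequences to settle.
x = a                        # a_1
for _ in range(4000):
    x = a ** x
limit1 = x                   # limit of one subsequence
limit2 = a ** x              # limit of the other subsequence
L, Lp = sorted((limit1, limit2))   # L = lim a_{2n-1} < L' = lim a_{2n}

# x0: the unique fixed point of g(x) = a**x in (0, 1), by bisection.
lo, hi = 0.0, 1.0
for _ in range(200):
    mid = (lo + hi) / 2
    if a ** mid > mid:       # still left of the fixed point
        lo = mid
    else:
        hi = mid
x0 = (lo + hi) / 2

print(L, x0, Lp)               # L < x0 < L'
print(abs(Lp ** Lp - L ** L))  # ~0: the identity L'**L' == L**L
```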

It's late here, and although I am very intrigued I must sleep. I hope to look into this more tomorrow; it would be nice to confirm or disprove these observations. A very interesting sequence indeed.


EDIT: As discussed in the comments with Did, let $g(x)={\left(\sqrt{k}\right)}^x$. There is a unique solution to $g(x)=x$ in $(0,1)$, given by ${x_0}^{\frac{1}{x_0}}= \sqrt{k}$. The derivative of $g$ at this point is $$g'(x_0) = \ln\left(\sqrt{k}\right) \cdot {\left(\sqrt{k}\right)}^{x_0} = x_0 \cdot \ln\left(\sqrt{k}\right) = \ln(x_0)$$ where we used ${\left(\sqrt{k}\right)}^{x_0}=x_0$ and $\ln\left(\sqrt{k}\right)=\frac{\ln(x_0)}{x_0}$.

Moreover, when $\sqrt{k} = e^{-e}$ we have $x_0=e^{-1}$ by inspection, so in this case $g'(x_0)=-1$. Since $x^{\frac1x}$ is strictly increasing on $(0,1)$, it follows that $g'(x_0)>-1$ when $\sqrt{k} > e^{-e}$, and $g'(x_0)<-1$ when $\sqrt{k} < e^{-e}$. This means that $x_0$ is an attractor in the first case and a repeller in the second (use that $g$ is convex!); thus the critical value is $k_c=e^{-2e}$.
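A quick numerical check of this critical value (a Python sketch; the bisection helper and the offsets of $\pm 10\%$ are assumed choices): compute $x_0$ and compare $g'(x_0)=\ln(x_0)$ with $-1$ on either side of $\sqrt{k}=e^{-e}$:

```python
import math

def fixed_point(a, iters=200):
    """Bisection for the unique solution of a**x = x in (0, 1)."""
    lo, hi = 0.0, 1.0
    for _ in range(iters):
        mid = (lo + hi) / 2
        if a ** mid > mid:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

a_crit = math.exp(-math.e)   # critical value of a = sqrt(k), i.e. k_c = e**(-2e)
for a in (1.1 * a_crit, a_crit, 0.9 * a_crit):
    x0 = fixed_point(a)
    print(a, math.log(x0))   # g'(x0): above -1, equal to -1, below -1
```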

Indeed, letting $a=\sqrt{k}$ and writing $h(x)=a^{a^x}$, we know that $L$ and $L'$ are solutions to $h(x)=x$ in $(0,1)$. Moreover, we know there is at least one solution: the solution $x_0$ to $g(x)=x$.

Claim: When $k<k_c$, there are two other solutions $x_1,x_2$ with $0<x_1<x_0<x_2<1$; when $k>k_c$, there are no other solutions.

Consider $f(x)=h(x)-x$; we will look for its roots in $(0,1)$. Notice that $f(0)=a>0$ and $f(1)=a^a-1<0$. Differentiating with respect to $x$ yields:

\begin{equation}\tag{1}\label{f’}f'(x)=a^{a^x}\cdot a^x \cdot {\ln(a)}^2 - 1\end{equation}

First, we’ll need a lemma; it says that $f$ has no double roots, i.e., the graph of $f$ crosses the $x$-axis at each of its roots.


Lemma: Let $0<k<1$, $k \neq k_c$ and $x \in (0,1)$ be a root of $f$. Then $f’(x) \neq 0$.

Case 1: $k>k_c$

In this case, at a root of $f$ we have $a^{a^x}=x$, and hence $a^x\cdot\ln(a)=\ln(x)$; substituting into $\eqref{f’}$, it becomes

$$x\cdot \ln(x)\cdot\ln(a)-1$$

Now $k>k_c$ implies $a> e^{-e}$, so $|\ln(a)|<e$. On the other hand, $x\cdot\ln(x)$ has derivative $1+\ln(x)$ and attains its minimum at $x=e^{-1}$, with value $-e^{-1}$. Hence, $|x\cdot\ln(x)|\le e^{-1}$ on $(0,1)$, and thus $| x\cdot \ln(x)\cdot\ln(a)|<1$.

It follows that $f’(x)<0$, so this case is done.

Case 2: $k<k_c$

Setting $\eqref{f’}$ equal to zero and taking logarithms, we find that whenever $x$ is a critical point of $f$ it holds that

\begin{equation}\tag{2}\label{f’2} a^x+x=2\frac{\ln(-\ln(a))}{-\ln(a)} \end{equation} Rewrite $\eqref{f’2}$ as $ a^x=2\frac{\ln(-\ln(a))}{-\ln(a)} -x$ and raise $a$ to each of its sides. This yields

$$a^{a^x}=a^{-2\frac{\ln(-\ln(a))}{\ln(a)}}\cdot \frac{1}{a^x}=\frac{\frac{1}{{\big(\ln(a)\big)}^2}}{2\frac{\ln(-\ln(a))}{-\ln(a)}-x}$$

In the last equality, we used that $\frac{\ln(-\ln(a))}{\ln(a)}=\log_a(-\ln(a))$ (and the squaring does away with the minus sign). Now, suppose that, in addition to being a root of $f$ (that is, $a^{a^x}=x$), $x$ were also a critical point. We will derive a contradiction.

Indeed, if our supposition were true we’d have that

$$x=\frac{\frac{1}{{\big(\ln(a)\big)}^2}}{2\frac{\ln(-\ln(a))}{-\ln(a)}-x}$$

which may be rearranged to

$$x^2+2\frac{\ln(-\ln(a))}{ \ln(a)}\cdot x+\frac{1}{{\big(\ln(a)\big)}^2}=0$$

Solving for $x$ yields

$$x=-\frac{1}{\ln(a)}\cdot\left( \ln(-\ln(a))\pm \sqrt{{\ln(-\ln(a))}^2-1}\right)$$

We now substitute these values back into $\eqref{f’2}$, and analyze the results. After some rearranging we get that

$$\frac{-1}{\ln(a)}\left(e^{\mp \sqrt{{\ln(-\ln(a))}^2-1}} \pm \sqrt{{\ln(-\ln(a))}^2-1} - \ln(-\ln(a))\right)=0$$

Clearly, for the above to be true, the expression in parentheses must be $0$. Since $k<k_c$, it holds that $a< e^{-e}$ so $\ln(-\ln(a))>1$. We investigate the expression in parentheses above using the substitution $u=\sqrt{{\ln(-\ln(a))}^2-1}$ (so $u>0$). It becomes

$$e^{\mp u} \pm u - \sqrt{u^2+1}$$

We will show that neither of these two expressions can be $0$ for any $u>0$, which concludes the proof of this case. Indeed, if one of them were $0$ we’d have

\begin{align} &e^{\mp u}\pm u =\sqrt{u^2+1} \\ \Longrightarrow \,\, &e^{\mp 2u} \pm 2ue^{\mp u} + u^2 = u^2 +1\\ \Longrightarrow \, \, &e^{\mp 2u} \pm 2ue^{\mp u} -1 = 0\tag{3}\label{subsu} \end{align}

This last equation is true when $u=0$. However, the derivative of the LHS with respect to $u$ is

$$\mp 2e^{\mp 2u} \pm 2e^{\mp u} - 2ue^{\mp u} = \mp 2e^{\mp u} \cdot \left( e^{\mp u} - 1 \pm u \right)$$

The factor before the parentheses has constant sign (negative for the upper sign, positive for the lower sign).

The term inside the parentheses is $0$ when $u=0$, and has derivative $\mp e^{\mp u} \pm 1$, which is positive for all $u>0$ under either choice of signs. It follows that the term inside the parentheses is strictly increasing for $u>0$, and so is positive for all $u>0$. Thus, the expression as a whole, the derivative of the LHS of $\eqref{subsu}$, is never zero and has constant sign. In other words, the LHS of $\eqref{subsu}$ is strictly monotone in $u$ for all $u>0$, so the equation cannot be satisfied.

With this, the lemma is proved.


With the lemma out of the way, we turn back to proving the claim. We know that $f(0)>0$ and $f(1)<0$. By the lemma, $f'$ does not vanish at any root of $f$, so $f$ changes sign at each of its roots; combined with the signs at the endpoints, this forces $f$ to have an odd number of roots.

Moreover, between any two roots of $f$, there must be a critical point; in other words, if $f$ has $2n+1$ roots, it must have at least $2n$ critical points. We will study the number of critical points of $f$ for each case and employ this observation to prove the claim.

Case 1: $k>k_c$

Consider $\eqref{f’2}$. Let $y(x)=a^x+x$. Its derivative is $y’(x)=a^x\cdot\ln(a)+1$, and its only critical point, a global minimum, is at $x=\frac{\ln(-\ln(a))}{-\ln(a)}$. Hence, the minimal value of $y$ is $\frac{\ln(-\ln(a))+1}{-\ln(a)}$.

For case 1, $\ln(-\ln(a))< 1$, and hence the RHS of $\eqref{f’2}$ is less than the minimum of its LHS. Thus, no real $x$ satisfies $\eqref{f’2}$, that is, no $x$ is a critical point of $f$.

It follows that $f$ has a single root in $(0,1)$, so half of the original claim is proved.

Case 2: $k<k_c$

Once again, consider $\eqref{f’2}$. Let $v(a)$ be the function of $a$ given by the RHS, with $a \in \left(0, e^{-e}\right)$. We have that $$v’(a)=\frac{2}{a\,{\ln(a)}^2}\cdot(\ln(-\ln(a))-1)$$ which is always positive for $a$ in the given range. Hence, $v$ is strictly increasing and for all $a \in \left(0, e^{-e}\right)$ it holds that

$$0=\lim_{t\to 0^{+}}v(t) < v(a) < v\left(e^{-e}\right) = 2e^{-1}$$

In particular, observe that $v(a)$ is always less than $1$, that is, $v(a)$ is always less than both $y(0)$ and $y(1)$.

On the other hand, $y$ attains its minimum at $x=\tfrac12 v(a) \in \left(0,e^{-1}\right)$. Because in case 2 $\ln(-\ln(a))> 1$, the minimal value of $y$, given by $\frac{\ln(-\ln(a))+1}{-\ln(a)}$, is less than $v(a)$.

Graphically, it looks something like this:

[Figure: the graph of $y(x)$ in red, with horizontal lines at height $2e^{-1}$ (blue) and at the minimum of $y$ (green).]

The graph in red is $y(x)$, the blue line is at height $2e^{-1}$ and the green line is at the height of $y$’s minimum. We showed that the image above is ‘typical’, meaning that the minimum is always attained in $(0,1)$, the green line is always below the blue line, and $v(a)$ is always some number between the two lines. Notice that the values of $y$ at the endpoints $x=0$ and $x=1$ are always $1$ or greater.

It follows that in case 2 there are exactly two solutions to equation $\eqref{f’2}$, that is, $f$ has exactly two critical points. With our previous observation, this means $f$ has either one root or three.

To complete the proof of the claim, we show that the derivative of $f$ at $x_0$ is positive (remember $x_0$ is the root of $f$ given by $x_0=a^{x_0}$). In other words, for $x<x_0$ near $x_0$, $f$ is negative; and for $x>x_0$ near $x_0$, $f$ is positive. Because $f(0)$ is positive and $f(1)$ is negative, this means $f$ has a root $x_1$ before $x_0$ and a root $x_2$ after $x_0$, which completes the proof of the claim.

Indeed, using $x_0=a^{x_0}$, the expression for $f’(x_0)$ (see $\eqref{f’}$) can be simplified to ${\left(x_0\cdot \ln(a)\right)}^2-1$. Because $\ln(a)<0$, we have that \begin{align} & {\left(x_0\cdot \ln(a)\right)}^2-1>0\\ \Longleftrightarrow\,\, & {\left(x_0\cdot \ln(a)\right)}^2>1\\ \Longleftrightarrow\,\, & x_0\cdot\ln(a) < -1\\ \Longleftrightarrow\,\, & a^{x_0} < e^{-1}\\ \Longleftrightarrow\,\, & x_0 < e^{-1} \end{align}

Now, consider $r(a)=a^{\frac{1}{e}}-\frac{1}{e}$. Its derivative is positive for $a>0$, so it is strictly increasing in that range. Now, notice that $r(e^{-e})=0$, so for $0<a<e^{-e}$ (that is, when $k<k_c$) it holds that $r(a)<0$.

Finally, remember that $x_0$ is the only solution of $a^x=x$ in $(0,1)$. We have that $a^0-0=1>0$ and $a^{\frac{1}{e}}-\frac{1}{e} = r(a) < 0$. It follows that the solution $x_0$ lies between $0$ and $\frac{1}{e}$, i.e., $x_0<\frac{1}{e}$, as was to be shown.

The claim is thus fully proved.
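The final computation can be checked numerically; in this Python sketch, $a=0.02$ is an assumed sample value below $e^{-e}$, and $x_0$ is found by bisection:

```python
import math

a = 0.02                      # assumed sample value, below e**-e (about 0.0659)
lna = math.log(a)

# x0: the unique solution of a**x = x in (0, 1), by bisection.
lo, hi = 0.0, 1.0
for _ in range(200):
    mid = (lo + hi) / 2
    if a ** mid > mid:
        lo = mid
    else:
        hi = mid
x0 = (lo + hi) / 2

f_prime_direct = a ** (a ** x0) * a ** x0 * lna ** 2 - 1  # equation (1) at x0
f_prime_simplified = (x0 * lna) ** 2 - 1                  # using a**x0 = x0

print(f_prime_direct, f_prime_simplified)  # equal, and positive
print(x0 < 1.0 / math.e)                   # True: x0 < 1/e when k < k_c
```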


What the claim means is that for $k>k_c$ the typical situation looks like this:

[Figure: the graph of $h(x)=a^{a^x}$ in red and the line $y=x$ in blue, meeting at a single point.]

In red is the graph of $h(x)=a^{a^x}$ and in blue is the line $y=x$. Thus, $a_1=h(0)$ and $a_2=h(1)$; that is, the subsequence $\left(a_{2n-1}\right)$ starts from the bottommost point of the red graph, and the subsequence $\left(a_{2n}\right)$ starts from its topmost point. It’s clear that in this situation both subsequences are funneled into the only intersection between the graphs, so in this case $a_n$ converges and $L=L’$.

Conversely, for $k<k_c$ the typical situation looks like this:

[Figure: the graph of $h$ in red and the line $y=x$ in blue, now meeting at three points $x_1<x_0<x_2$.]

Now we can see that $\left(a_{2n-1}\right)$ is funneled into $x_1<x_0$ and $\left(a_{2n}\right)$ is funneled into $x_2>x_0$ and hence $L<x_0<L’$.


What about the critical case $k=k_c$?

When $k=k_c$, we have $a=e^{-e}$, so equation $\eqref{f’2}$ becomes

$$e^{-ex}+x=\frac{2}{e}$$

We use the same idea of looking at the minimum of the LHS. It occurs at $x=\frac{1}{e}$, with value $\frac{2}{e}$. Hence, $f$ has a single critical point, at $x=\frac{1}{e}$. Moreover, by inspection $f\left(\frac{1}{e}\right)=0$, so the single critical point of $f$ is a root of $f$. Because $f(0)>0$ and $f(1)<0$, it’s easy to see that this implies $f$ has no other roots in $(0,1)$. This means the critical case behaves much like the case $k>k_c$: $L=L’$ and $a_n$ converges.
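A quick Python check of the critical case (the sample points $0.2$ and $0.5$ are arbitrary choices on either side of $\frac{1}{e}$):

```python
import math

a = math.exp(-math.e)         # the critical value of a = sqrt(k_c)

def f(x):
    """f(x) = a**(a**x) - x, whose roots are the subsequence limits."""
    return a ** (a ** x) - x

print(f(1.0 / math.e))            # ~0: the single (tangential) root
print(f(0.2) > 0, f(0.5) < 0)     # positive left of 1/e, negative right
```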

For reference, a picture of the critical case:

[Figure: the critical case, where the red graph of $h$ is tangent to the line $y=x$ at $x=\frac{1}{e}$.]