Prove that a recurrence relation grows unbounded


So I am doing this problem:

Let $p_{n+1}=\sqrt{(p_n)^2+(1/p_n)^2}$.

Prove that this sequence will grow unbounded.

I am fairly sure this sequence grows unbounded; however, I can't see how to combine the formal definition of a bounded sequence with a proof by contradiction.

I have: for every $p_n$ there exists an $m$ such that $m>p_n$, and I set up $m>\sqrt{(p_n)^2+(1/m)^2}$, but I can't proceed from here. Any suggestions would be appreciated. (I'd prefer to avoid derivatives, since I hope to do this at a precalculus level or lower.)

On BEST ANSWER

There is a general fact you are using; I'm not sure you have seen a correct proof of it. Given a real number $x_0$ and a real function $f$ with $f(x) > 0$ for $x \geq x_0$ and $f$ strictly decreasing for $x \geq x_0$. We do not need the derivative of $f$; we only use the fact that it decreases.

Theorem: Given a sequence beginning at $x_0,$ with $$ x_{n+1} = x_n + f(x_n) \; , $$ it follows that $x_n$ increases without bound.

Proof: Assume the sequence is bounded above. Since $f > 0$, the sequence is increasing, so let $L > x_0$ be the least upper bound of the sequence elements. Because $f(L) > 0$, there is some $x_n > L - f(L).$ If $x_n = L,$ then $x_{n+1} = L + f(L) > L.$ If $L - f(L) < x_n < L,$ then $f(x_n) > f(L),$ so $$x_{n+1} = x_n + f(x_n) > L - f(L) + f(x_n) > L - f(L) + f(L) = L.$$ Either way $x_{n+1} > L,$ contradicting that $L$ is an upper bound.

For your problem, set $x_n = p_n^2.$ Squaring the recurrence gives $x_{n+1} = x_n + 1/x_n,$ which is the theorem with $f(x) = 1/x,$ positive and decreasing for $x > 0.$ Hence $p_n^2,$ and therefore $p_n,$ increases without bound.
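As a sanity check (not a proof), here is a quick numerical experiment with the original recurrence. The starting value $p_0 = 1$ and the step count are my own choices, not part of the problem:

```python
import math

def iterate_p(p0=1.0, steps=10000):
    """Iterate p_{n+1} = sqrt(p_n^2 + 1/p_n^2) and return the final term."""
    p = p0
    for _ in range(steps):
        p = math.sqrt(p * p + 1.0 / (p * p))
    return p
```

The growth is slow: since $p_n^2$ increases by a bit more than $1/p_n^2$ each step, $p_n$ behaves roughly like $(2n)^{1/4}$, so it takes thousands of iterations just to pass 10.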

I'm pretty sure something like this is an exercise in Rudin. Notice that we are saying that $$ x_{n+1} = x_n + e^{-x_n} $$ increases without bound. Go Figure.
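To see how counterintuitive that last example is, here is a small experiment counting how many steps $x_{n+1} = x_n + e^{-x_n}$ needs to pass a given bound; the starting value $x_0 = 0$ and the safety cap are arbitrary choices of mine:

```python
import math

def steps_to_exceed(bound, x0=0.0, max_steps=10**6):
    """Iterate x_{n+1} = x_n + exp(-x_n) until x exceeds `bound`.

    Returns (final x, number of steps taken); stops at max_steps as a cap.
    """
    x, n = x0, 0
    while x <= bound and n < max_steps:
        x += math.exp(-x)
        n += 1
    return x, n
```

Since $e^{x_{n+1}} \approx e^{x_n} + 1$, we get $x_n \approx \log n$: the sequence does pass any bound $B$, but only after on the order of $e^B$ steps.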