Correctness of proof of boundedness. 6.5.5. Abbott's Understanding Analysis 2nd ed.


The section is about power series. The question reads:

a) If $s$ satisfies $0<s<1$, show that $ns^{n-1}$ is bounded for all $n \geq 1$.

b) Given an arbitrary $x \in (-R,R)$, pick $t$ to satisfy $\lvert x \rvert < t < R$. Use this start to construct a proof for Theorem 6.5.6. [This theorem says, roughly, that if a power series converges for all $x \in (-R, R)$, then the differentiated series converges too].

My question is about part a), but I include the rest for context.

My solution:

By induction. Take $n=1$ as the base case; then $ns^{n-1}=1$, which is obviously bounded. Now assume the claim holds for $n$: there exists an $M \in \mathbb{R}$ such that $ns^{n-1} < M$. Since $0<s<1$, $sM$ is finite, and $0<s^{n}<1$, so $s^{n}$ is also finite. Multiplying the assumed inequality by $s$ and adding $s^n$ gives $(n+1)s^{n} = ns^{n-1}s + s^{n} < sM + s^{n}$. The right-hand side is clearly finite, so the claim also holds for $n+1$, proving the desired result.

Is this correct? The suggested solution in the book uses the ratio test, which I find a bit of overkill in this scenario. There is also this proof from MSE.
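As a purely numerical sanity check on part a) (an illustration I am adding, not a proof), one can tabulate $ns^{n-1}$ for a few values of $s$ and watch the sequence peak and then decay toward $0$:

```python
# Illustration only: for 0 < s < 1 the terms n * s**(n-1) rise to a
# single peak and then decay toward 0, so the sequence looks bounded.
def terms(s, n_max):
    """Return [1*s^0, 2*s^1, ..., n_max * s^(n_max - 1)]."""
    return [n * s ** (n - 1) for n in range(1, n_max + 1)]

for s in (0.5, 0.9, 0.99):
    seq = terms(s, 2000)
    print(f"s={s}: max term = {max(seq):.4f}, last term = {seq[-1]:.3e}")
```

Of course this checks nothing for $s$ arbitrarily close to $1$; the proofs below handle that.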


There are 3 best solutions below

On BEST ANSWER

The ratio test is the most natural way. For a direct proof, write $s = 1/(1 + h)$ with $h > 0$. Then by the binomial theorem $(1+h)^n \geq 1 + nh + \frac{n(n-1)}{2}h^2$, so $ns^n \leq \frac{n}{1 + nh + \frac{n(n - 1)}{2}h^2} \to 0$ as $n \to \infty$. A convergent sequence is bounded, and $ns^{n-1} = \frac{1}{s}\,ns^n$, so $ns^{n-1}$ is bounded as well.
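The binomial bound is easy to spot-check numerically (an illustration I am adding, not part of the original answer): with $s = 1/(1+h)$, the displayed inequality should hold at every sampled $n$.

```python
# Check n * s**n <= n / (1 + n*h + n*(n-1)/2 * h**2) for s = 1/(1+h), h > 0.
# The inequality follows from (1+h)**n >= 1 + n*h + n*(n-1)/2 * h**2,
# since the remaining binomial terms are positive.
def check_bound(h, n_max):
    s = 1.0 / (1.0 + h)
    for n in range(1, n_max + 1):
        lhs = n * s ** n
        rhs = n / (1.0 + n * h + n * (n - 1) / 2.0 * h ** 2)
        assert lhs <= rhs + 1e-12, (h, n, lhs, rhs)

for h in (0.1, 1.0, 5.0):
    check_bound(h, 500)
print("bound holds at all sampled points")
```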

0
On

Prove $0<s<1 \implies \forall N, Ns^{N-1}<M(s)$

Suppose $0<x<y<1$. Then $0<x^n<y^n<1$.

$y^n-x^n=(y-x)(y^{n-1}+xy^{n-2}+x^2y^{n-3}+...+x^{n-1})<(y-x)n$

So for any $\epsilon>0$, $\delta=\epsilon/n \implies (|x-y|<\delta \implies |x^n-y^n|<\epsilon)$

So $f(x)=x^n$ is uniformly continuous on $(0,1)$.

$\frac{1}{1-s}=1+s+s^2+s^3+...$

We are allowed to differentiate both sides term by term because the differentiated series $\sum ns^{n-1}$ converges uniformly on compact subsets of $(0,1)$ (uniform continuity of each $x^n$ alone would not justify this).

$\frac{1}{(1-s)^2}=1s^0+2s^1+3s^2+...$

Every $Ns^{N-1}$ appears as a term on the right, so $Ns^{N-1}<\frac{1}{(1-s)^2}$

Alternatively, one can find the maximum value of $g(x)=xs^{x-1}$ over $x>0$, attained at $x=-1/\ln s$. That gives the tighter bound $\frac{-1}{e\cdot s \cdot \ln s}$.
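Both bounds from this answer can be checked numerically (my illustration, not part of the answer): every term $Ns^{N-1}$ should lie below $\frac{1}{(1-s)^2}$, and also below the tighter $\frac{-1}{e\cdot s\cdot\ln s}$.

```python
import math

# Compare each term n * s**(n-1) against the two bounds derived above:
# the series bound 1/(1-s)^2 and the calculus bound -1/(e * s * ln s).
def check_bounds(s, n_max=1000):
    series_bound = 1.0 / (1.0 - s) ** 2
    calculus_bound = -1.0 / (math.e * s * math.log(s))
    for n in range(1, n_max + 1):
        term = n * s ** (n - 1)
        assert term < series_bound
        assert term <= calculus_bound + 1e-12  # integer n cannot beat the real max
    return series_bound, calculus_bound

for s in (0.3, 0.5, 0.9):
    sb, cb = check_bounds(s)
    print(f"s={s}: series bound {sb:.3f}, calculus bound {cb:.3f}")
```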

On

Let us observe that the sequence $a_n=ns^{n-1}$ eventually starts to decrease. We have $a_n>a_{n+1}$ if $$ns^{n-1}>(n+1)s^n,$$ i.e. $$\frac{n}{n+1}>s,$$ i.e. $$\frac{1}{n+1}<1-s,$$ i.e. $$n>\frac{1}{1-s}-1.$$ Thus if $$N=\left\lfloor\frac{1}{1-s}\right\rfloor$$ then we have $a_n>a_{n+1}$ for all $n\geq N$. Let $$M=\max(a_1,a_2,\dots,a_N).$$ Then $a_n\leq M$ for all $n$, and the sequence is bounded above. It is obviously bounded below, as its terms are positive.
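This cutoff can also be verified numerically (an added illustration, not part of the answer): with $N=\left\lfloor\frac{1}{1-s}\right\rfloor$, the terms should be strictly decreasing from index $N$ on, so $M=\max(a_1,\dots,a_N)$ bounds the whole sequence.

```python
import math

# Verify: with N = floor(1/(1-s)), the sequence a_n = n * s**(n-1) is
# strictly decreasing for n >= N, so M = max(a_1, ..., a_N) bounds every term.
def check_cutoff(s, n_max=800):
    N = math.floor(1.0 / (1.0 - s))
    a = [n * s ** (n - 1) for n in range(1, n_max + 1)]
    # a[i-1] is a_i, so this checks a_n > a_(n+1) for N <= n < n_max.
    assert all(a[i - 1] > a[i] for i in range(N, n_max)), "not decreasing past N"
    M = max(a[:N])
    assert all(term <= M for term in a)
    return N, M

for s in (0.5, 0.75, 0.95):
    N, M = check_cutoff(s)
    print(f"s={s}: N={N}, M={M:.4f}")
```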