Show that $\lim\limits_{n\rightarrow\infty}(nb^n)=0$ for $0<b<1$.
I have seen a couple of different proofs of this using logarithms, the Binomial Theorem, and Bernoulli's Inequality. Although there are duplicates of this question on this site, I'd like to check my method of proof for rigor:
Proof:
Since $b\in(0,1)$, we can write $\frac{1}{\sqrt{b}}=1+d$ for some $d>0$. By Bernoulli's Inequality, we know $(1+d)^n\geq 1+nd\space\space\forall n\in\mathbb{N}$. Rearranging our first equation for $b$ we get: $$\frac{1}{b}=(1+d)^2\implies b^n=\frac{1}{((1+d)^2)^n}\implies nb^n=\frac{n}{((1+d)^2)^n}$$
To show $\lim\limits_{n\rightarrow\infty}(nb^n)=0,$ note that since $n^2d^2+2nd+1>n^2d^2>0$, $$|nb^n-0|=nb^n=\frac{n}{((1+d)^2)^n}\leq\frac{n}{(1+nd)^2}=\frac{n}{n^2d^2+2nd+1}<\bigg(\frac{1}{d^2}\bigg)\bigg(\frac{1}{n}\bigg)$$
Now I will apply the theorem that if $(a_n)$ is a sequence of positive real numbers with $\lim\limits_{n\rightarrow\infty}(a_n)=0$ and $C>0$ and $|x_n-x|\leq Ca_n$ , then $\lim\limits_{n\rightarrow\infty}x_n=x.$ So taking $C=\frac{1}{d^2}$ since $d>0$ and $(a_n)=\frac{1}{n}$, we have that $$|nb^n-0|<\bigg(\frac{1}{d^2}\bigg)\bigg(\frac{1}{n}\bigg)$$
Conclude that indeed $\lim\limits_{n\rightarrow\infty}(nb^n)=0$. $\blacksquare$
Thanks in advance for the help!
Your proof is correct. But why do you write $d_n$? In $\frac{1}{\sqrt{b}}=1+d_n$, the number $d_n$ is independent of $n$.
A simpler proof: by the root test the series $ \sum nb^n$ is convergent, hence $nb^n \to 0$.
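Not a proof, but a quick numerical sanity check of the claim (and of the bound $nb^n<\frac{1}{d^2 n}$ derived above) for the sample value $b=0.9$; any $b\in(0,1)$ behaves the same way:

```python
import math

b = 0.9
d = 1 / math.sqrt(b) - 1  # so that 1/sqrt(b) = 1 + d, with d > 0

for n in (10, 100, 1000):
    actual = n * b**n          # the sequence term n * b^n
    bound = 1 / (d**2 * n)     # the upper bound (1/d^2)(1/n) from the proof
    print(n, actual, bound)    # actual shrinks toward 0 and stays below the bound
```

The printed values show $nb^n$ decreasing toward $0$ while remaining below the bound, which itself tends to $0$, exactly as the squeeze argument requires.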