So because $x$ is between $0$ and $1$, the limit of $x^n$ is $0$ and the limit of $n^2$ is $\infty$, so the product $n^2 x^n$ is of the indeterminate form $0 \cdot \infty$.
I can't figure out how to solve it. I would like to not use l'Hopital's rule when solving this.
If you can use series, then the ratio test for $\sum n^2 x^n$ shows that the series converges when $x \in (0,1)$, and therefore its terms satisfy $n^2 x^n \to 0$.
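As a hedged numerical illustration (not part of the proof), one can check that the ratio $a_{n+1}/a_n = \left(\frac{n+1}{n}\right)^2 x$ settles near $x < 1$, and that the terms $n^2 x^n$ themselves are already tiny for moderate $n$. The choice $x = 0.9$ below is just an illustrative value:

```python
# Sanity check for the ratio-test argument: a_n = n^2 * x^n.
def a(n, x):
    return n**2 * x**n

x = 0.9  # illustrative value in (0, 1)

# The term ratio tends to ((n+1)/n)^2 * x, which is close to x for large n.
ratio = a(1001, x) / a(1000, x)
print(ratio)      # close to 0.9

# The terms themselves collapse toward 0 well before n = 500.
print(a(500, x))
```

Since the limiting ratio is $x < 1$, the series converges and its terms must vanish, which is exactly the claim.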
Note that if I take the binomial expansion of $$1^{n+3}=\left(x+(1-x)\right)^{n+3}= x^{n+3}+ \dots +\frac {(n+3)(n+2)(n+1)}{6}x^n(1-x)^3+\dots$$ I have (all terms are positive) $$1\gt\frac {(n+3)(n+2)(n+1)}{6}x^n(1-x)^3\gt \frac {n^3}6x^n(1-x)^3$$ so that $$\frac 6{n(1-x)^3}\gt n^2x^n$$
You should be able to conclude from there.
This proof, which picks out just one term from the binomial expansion, may look too wasteful to be effective, but it should be clear how to adapt it to $n^r x^n$ for an arbitrary positive integer $r$. It runs surprisingly smoothly and uses no sophisticated machinery, so it is worth noting even if other methods are preferred. The same trick appears elsewhere too.
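The derived bound $n^2 x^n < \frac{6}{n(1-x)^3}$ can be spot-checked numerically; this is only a sanity check over a few sampled values of $n$ and $x$, not a substitute for the argument above:

```python
# Check the bound n^2 * x^n < 6 / (n * (1 - x)^3) from the binomial argument.
def lhs(n, x):
    return n**2 * x**n

def rhs(n, x):
    return 6 / (n * (1 - x)**3)

for x in (0.3, 0.5, 0.9):
    for n in (1, 10, 100, 1000):
        assert lhs(n, x) < rhs(n, x)

print("bound holds for all sampled (n, x)")
```

Since the right-hand side tends to $0$ as $n \to \infty$ for each fixed $x \in (0,1)$, the squeeze is immediate.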
Consider writing $$ a_n = n^2 \cdot x^n = e^{\log(n^2)} \cdot e^{\log(x^n)} = \exp( 2 \log(n) + n \log(x) ). $$ Recall that $\exp$ is a continuous, strictly increasing function. So it suffices to consider the limit of the exponent. Note that $n \gg \log n$ when $n$ is sufficiently large. Now think about the sign of $\log(x)$ for $x \in (0,1)$. This can give you the limit of the exponent, so now you just need to convert that into the limit of the original expression.
For this final step, we see that $\log(x) < 0$ for all $x \in (0,1)$, and so $$ b_n = 2 \log(n) + n \log(x) \to -\infty \quad \text{as $n \to \infty$}. $$ This means that for all $K > 0$ there exists an $n_0$ (depending on $K$) such that $$ b_n \le -K \quad \forall \ n \ge n_0. $$ This, in turn, tells us that $$ a_n = \exp(b_n) \le e^{-K} \quad \forall \ n \ge n_0,$$ since $\exp(\cdot)$ is an increasing function.
Now, clearly $a_n \ge 0$ for all $n$. Set $\epsilon = e^{-K}$, i.e. $K = \log(1/\epsilon)$, and we see the above is equivalent to the following: $$ \forall \epsilon > 0 \ \exists n_0 \text{ s.t. } 0 \le a_n \le \epsilon \ \forall n \ge n_0. $$ This precisely says that $a_n \to 0$ as $n \to \infty$.
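The exp/log argument is easy to illustrate numerically: the exponent $b_n = 2\log(n) + n\log(x)$ drops toward $-\infty$, dragging $a_n = \exp(b_n) = n^2 x^n$ to $0$. The value $x = 0.8$ below is just an illustrative choice in $(0,1)$:

```python
import math

# b_n = 2*log(n) + n*log(x) is the exponent from a_n = exp(b_n).
def b(n, x):
    return 2 * math.log(n) + n * math.log(x)

x = 0.8  # illustrative value in (0, 1)
for n in (10, 100, 1000):
    # exp(b_n) agrees with n^2 * x^n and shrinks toward 0.
    print(n, b(n, x), math.exp(b(n, x)))
```

Because the linear term $n\log(x)$ (with $\log(x) < 0$) eventually dominates the logarithmic term $2\log(n)$, the exponent is eventually below any $-K$, matching the $\epsilon$-$n_0$ argument above.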