Here is the full version of the question: Given $y\in\mathbb{R}$, $n\in\mathbb{N}$, and $\epsilon>0$, show that for some $\delta > 0$, if $u\in\mathbb{R}$ and $|u-y| < \delta$ then $|u^n-y^n| < \epsilon$. [Hint: Prove the inequality when $n = 1$, $n = 2$, and then do induction on $n$ using the identity $$u^n-y^n=(u-y)(u^{n-1}+u^{n-2}y+\cdots+y^{n-1}).]$$
I am trying to solve this question from Pugh. Although the hint recommends induction, my attempt doesn't actually use the induction hypothesis. Here is my attempt:
For the base case $n=1$, taking $\delta=\epsilon$ works trivially: $|u-y|<\delta$ gives $|u^1-y^1|<\epsilon$ immediately. Then for any $n\in\mathbb{N}$, choose $\delta=\frac{\epsilon}{|u^{n-1}+u^{n-2}y+\cdots+y^{n-1}|}$. By the given hint, with this choice of $\delta$: $$|u^n-y^n|\leq |u-y|\,|u^{n-1}+u^{n-2}y+\cdots+y^{n-1}|<\delta\, |u^{n-1}+u^{n-2}y+\cdots+y^{n-1}|=\epsilon.$$
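As a quick numeric sanity check (not a proof) that the hint's identity and the bound in my attempt hold, here is a small script; the particular values of $u$, $y$, $n$ are arbitrary choices of mine, not from the problem:

```python
# Sanity-check the identity u^n - y^n = (u - y)(u^{n-1} + u^{n-2} y + ... + y^{n-1})
# and the absolute-value bound used in the attempt. Test values are arbitrary.
u, y, n = 1.3, 1.25, 5

lhs = u**n - y**n
# The telescoping factor from the hint: sum over u^{n-1-k} * y^k for k = 0..n-1.
factor = sum(u**(n - 1 - k) * y**k for k in range(n))
rhs = (u - y) * factor

assert abs(lhs - rhs) < 1e-12          # the identity holds (up to float error)
assert abs(lhs) <= abs(u - y) * abs(factor) + 1e-12  # the bound in the attempt
```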
As you can see, I never use the induction hypothesis $|u^{n-1}-y^{n-1}|<\epsilon$. Is this solution valid? Do I have the freedom to define the $\delta$ in the question however I wish?