I am struggling to understand how to prove that a function is discontinuous using the sequential definition. Here is a particular example from my textbook where some clarification might help.
Let f(x) = $\frac{1}{x}\sin\frac{1}{x^2}$ for $x\neq0$ and $f(0)=0.$ Show that f is discontinuous at 0, i.e., not continuous at 0.
Part of the solution from book:
It suffices for us to find a sequence $(x_{n})$ converging to 0 such that $f(x_{n})$ does not converge to $f(0)=0$.
So we will arrange for $\frac{1}{x_{n}}\sin\frac{1}{x_{n}^2} = \frac{1}{x_{n}}$.
The solution then goes on to solve for $x_{n}$. My question is about this second step: why do we set our function equal to $\frac{1}{x_{n}}$? I'm confused by the choice. I'm trying to understand the logic behind it, and whether a more general "rule" can be applied when proving discontinuity using the sequential definition.
The general idea for finding such "counterexamples" to continuity is to look at the function and (roughly) understand how it behaves.
In this case, we have the factor $\frac1x$, which blows up as $x\to 0$. On the other hand, we have the factor $\sin(\frac1{x^2})$, which oscillates faster and faster between $-1$ and $1$ as $x\to 0$.
Since we want their product not to tend to $0$, and $\frac1x$ blows up, an 'obvious' choice is to make sure the other factor doesn't go to $0$ along our sequence but stays above a fixed threshold, which ensures that the product tends to $\infty$.
One easy way to ensure this is to simply make $\sin(\frac1{x_n^2})=1$.
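Concretely, $\sin(\frac1{x_n^2})=1$ holds when $\frac1{x_n^2}=\frac\pi2+2\pi n$, i.e. $x_n=\frac{1}{\sqrt{\pi/2+2\pi n}}$. Then $x_n\to 0$ but $f(x_n)=\frac1{x_n}\to\infty\neq f(0)=0$, so $f$ is not continuous at $0$. If you want to see this numerically, here's a quick sanity-check sketch (the helper names `x` and `f` are mine, not from the book):

```python
import math

def x(n):
    # x_n chosen so that 1/x_n^2 = pi/2 + 2*pi*n, hence sin(1/x_n^2) = 1
    return 1.0 / math.sqrt(math.pi / 2 + 2 * math.pi * n)

def f(t):
    # f(t) = (1/t) * sin(1/t^2) for t != 0
    return math.sin(1.0 / t**2) / t

for n in (1, 10, 100, 1000):
    xn = x(n)
    # x_n shrinks toward 0 while f(x_n) agrees with 1/x_n and grows without bound
    print(f"n={n:5d}  x_n={xn:.6f}  f(x_n)={f(xn):.4f}  1/x_n={1.0/xn:.4f}")
```

The printout shows $f(x_n)$ tracking $\frac1{x_n}$ exactly, so the sequence $f(x_n)$ diverges rather than converging to $f(0)=0$.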