I am given the function: $$ f(x)=\begin{cases} \frac{1}{x}\sin(\frac{1}{x}) & \text{if $x\ne0$} \\ a &\text{if $x=0$} \end{cases} $$
and asked to show that $f$ is discontinuous at $0$, no matter what $a$ is. My thought is to find a monotone decreasing sequence $(x_n)$ tending to $0$ with $\sin\left(\frac{1}{x_n}\right)\approx1$ for all terms of the sequence. Then $f(x_n)\approx\frac{1}{x_n}$, which increases without bound as $x_n$ decreases to $0$. Thus $(f(x_n))$ diverges to infinity, so it has no limit, and hence $f$ is discontinuous at $0$.
Is this a good approach to the problem? If I found a sequence that satisfies the above, would I indeed be done?
How would I go about constructing this sequence in practice? Is there a general way to do something like that for these problems, or does it vary completely from problem to problem?
Thank you for any help! I'm pretty new to real analysis and to thinking about these objects in such a rigorous manner, so I'm still trying to wrap my head around the general procedures.
Your idea of approaching the problem with sequences is correct.
Take the sequence $x_n=\frac{1}{2n\pi+\frac{\pi}{2}}$.
Then $x_n\to0$, and since $\sin\left(\frac{1}{x_n}\right)=\sin\left(2n\pi+\frac{\pi}{2}\right)=1$, we get $f(x_n)=\frac{1}{x_n}=2n\pi+\frac{\pi}{2}\to+\infty$.
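To spell out why this finishes the proof, here is one way the sequential-criterion argument might be written up (a sketch; the phrasing is mine, but the criterion is the standard one):

If $f$ were continuous at $0$, then $x_n\to0$ would force
$$\lim_{n\to\infty}f(x_n)=f(0)=a.$$
But $f(x_n)=2n\pi+\frac{\pi}{2}\to+\infty$, so $(f(x_n))$ converges to no real number, and in particular not to $a$. Hence $f$ is discontinuous at $0$ regardless of the value of $a$.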
So for this particular problem, the idea is to construct a sequence $x_n\to0$ such that $f(x_n)\to+\infty$ or $-\infty$, or such that $(f(x_n))$ has no limit at all (for instance, it may have two subsequences that converge to different numbers).
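To illustrate that last possibility with a related example (using $g(x)=\sin\left(\frac{1}{x}\right)$ rather than the $f$ above): take
$$x_n=\frac{1}{2n\pi+\frac{\pi}{2}},\qquad y_n=\frac{1}{2n\pi-\frac{\pi}{2}}.$$
Both sequences tend to $0$, yet $g(x_n)=1\to1$ while $g(y_n)=-1\to-1$, so $g(x)$ has no limit as $x\to0$, and no assignment of a value at $0$ makes $g$ continuous there.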