Just to clarify: the limit of $1/x$ as $x \nearrow 0$ (from the left) would be $-\infty$, and the limit of $1/x$ as $x \searrow 0$ (from the right) would be $+\infty$, right?
Is this only true for $1/x$, and not for any other number over $x$? Sorry if this is confusing; are there certain formulas for knowing when a limit equals infinity?
Like the limit of $x \to 0$ of $1/x^2 = +\infty$
Thanks for any help, and again, sorry if this is confusing. I'm just trying to understand how to know when a limit equals infinity rather than not existing at all.
To say that $\lim_{x \to a} f(x)$ exists, you have to check that both
$$\lim_{x \to a^-} f(x) \quad \textrm{ and } \quad \lim_{x \to a^+} f(x)$$
exist, and that they are equal.
To get a feel for these, it's perfectly fine to imagine plugging in a sequence of smaller and smaller values and seeing the result, i.e. building a table of values.
For $\lim_{x \to 0} \frac{1}{x^2}$, let's start with $\lim_{x \to 0^+} \frac{1}{x^2}$. What do you get if you plug in $x = 0.1$, then $x = 0.01$, then $x = 0.001$? What does this indicate about the limit?
Now, let's work with $\lim_{x \to 0^-} \frac{1}{x^2}$. Plug in $x = -0.1$, then $x = -0.01$, then $x = -0.001$. What limit do you get this time?
Do these two limits agree? If so, then what can we say?
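If you want to see the table of values without doing the arithmetic by hand, a short script can tabulate it for you. This is just a sketch of the plug-in-values idea above (the function name `f` is my own choice, not standard notation):

```python
# Tabulate 1/x^2 as x approaches 0 from the right and from the left.
def f(x):
    return 1 / x**2

print("From the right (x -> 0+):")
for x in [0.1, 0.01, 0.001]:
    print(f"  f({x}) = {f(x):g}")

print("From the left (x -> 0-):")
for x in [-0.1, -0.01, -0.001]:
    # Squaring removes the sign, so these values match the ones above.
    print(f"  f({x}) = {f(x):g}")
```

Both columns blow up through the same positive values, which is the numerical hint that both one-sided limits are $+\infty$. (Contrast this with $1/x$, where the left-hand values would be large and negative instead.)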
(Note: we haven't rigorously proved that these limits are what we claim, but I'm guessing you're in a calculus course where that level of rigor isn't expected.)