To my understanding, the Taylor series is a type of power series that provides an approximation of a function around some particular point $x=a$. But under what circumstances is this approximation perfect, and under what circumstances is it "off" even at infinity?
I realize this is a little hazy, so I'll rephrase: by "perfect" I am referring to how a regular limit doesn't ever actually reach something but instead provides a sort of error term that you can make as small as you want, so for all practical purposes we treat it as zero error. Whereas for an imperfect approximation, maybe that arbitrarily small error term doesn't exist, or maybe the approximation is only correct at that particular point and nowhere else, etc.
So maybe what I am asking is: when does the Taylor series provide an equivalent representation of the function over all $x$ in $f$'s domain, and when does it not? And when it doesn't, how do we even know?
Limits are exact
You have a misunderstanding about limits! A limit, when it exists, is just a value. An exact value.
It doesn't make sense to talk about the limit reaching some value, or there being some error. $\lim_{x \to 1} x^2$ is just a number, and that number is exactly one.
What you are describing — these ideas about "reaching" a value with some "error" — are descriptions of the behavior of the expression $x^2$ as $x \to 1$. Among the features of this behavior is that $x^2$ is "reaching" one.
By its very definition, the limit is the exact value that its expression is "reaching". $x^2$ may be "approximately" one, but $\lim_{x \to 1} x^2$ is exactly one.
Taylor polynomials
In this light, nearly everything you've said in your post is not about Taylor series, but instead about Taylor polynomials. When a Taylor series exists, the Taylor polynomial is given simply by truncating the series to finitely many terms. (Taylor polynomials can exist in situations where Taylor series don't.)
In general, the definition of the $n$-th order Taylor polynomial, centered at $x = a$, for an $n$-times differentiable function $f$ is the sum
$$ \sum_{k=0}^n f^{(k)}(a) \frac{(x-a)^k}{k!} $$
Taylor polynomials, generally, are not exactly equal to the original function. The only time that happens is when the original function is a polynomial of degree less than or equal to $n$.
The sequence of Taylor polynomials, as $n \to \infty$, may converge to something. The Taylor series is exactly the value that the Taylor polynomials converge to.
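As a quick numerical illustration of that convergence, here is a small Python sketch (the helper `taylor_poly_exp` is my own name, not a standard function) that evaluates the Taylor polynomials of $\exp$ centered at $a = 0$, where every derivative at zero equals $1$:

```python
import math

def taylor_poly_exp(x, n):
    # n-th order Taylor polynomial of exp centered at a = 0:
    # sum_{k=0}^{n} f^{(k)}(0) * x**k / k!, and f^{(k)}(0) = 1 for every k.
    return sum(x**k / math.factorial(k) for k in range(n + 1))

# The partial sums approach exp(1) = e as n grows: the error shrinks.
for n in (2, 5, 10):
    approx = taylor_poly_exp(1.0, n)
    print(n, approx, abs(approx - math.e))
```

Each fixed-$n$ polynomial is only an approximation, but the sequence of approximations converges to the exact value, which is what the Taylor series denotes.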
The error in the approximation of a function by a Taylor polynomial is something people study. One often speaks of the "remainder term" or the "Taylor remainder", which is precisely the error term. There are a number of theorems that put constraints on how big the error term can be.
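One such theorem is the Lagrange form of the remainder: $|R_n(x)| \le M \, |x-a|^{n+1}/(n+1)!$, where $M$ bounds $|f^{(n+1)}|$ between $a$ and $x$. A sketch checking this bound for $f = \exp$ at $a = 0$ (taking $M = e$ on $[0, 1]$):

```python
import math

def taylor_exp(x, n):
    # n-th order Taylor polynomial of exp centered at a = 0.
    return sum(x**k / math.factorial(k) for k in range(n + 1))

x, n = 1.0, 6
actual_error = abs(math.exp(x) - taylor_exp(x, n))
# Lagrange bound: |R_n(x)| <= M * |x|**(n+1) / (n+1)!, with M = e on [0, 1].
bound = math.e * x**(n + 1) / math.factorial(n + 1)
print(actual_error, bound)  # the actual error stays below the bound
```

The factorial in the denominator is what makes the remainder shrink so fast for functions like $\exp$, whose derivatives stay bounded on compact intervals.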
Taylor series can have errors!
Despite all of the above, one of the big surprises of real analysis is that a function might not be equal to its Taylor series! There is a notorious example:
$$ f(x) = \begin{cases} 0 & x = 0 \\ \exp(-1/x^2) & x \neq 0 \end{cases} $$
You can prove that $f$ is infinitely differentiable everywhere. However, all of its derivatives vanish at the origin, $f^{(k)}(0) = 0$ for every $k$, so its Taylor series around zero is simply the zero function.
However, $f(x) > 0$ for every $x \neq 0$, so the Taylor series converges everywhere but agrees with $f$ only at the single point $x = 0$. We say a function is analytic at a point when it equals its Taylor series on some neighborhood of that point.
"Most" functions mathematicians actually work with are analytic functions (e.g. all of the trigonometric functions are analytic on their domain), or analytic except for obvious exceptions (e.g. $|x|$ is not analytic at zero, but it is analytic everywhere else).