Let's say I have a Taylor series approximation, $p(x)$, of a function $f(x)$ at $a$:
$$ p(x)=\sum_{n=0}^\infty{\frac{f^{(n)}(a)}{n!}(x-a)^n} $$
And that this Taylor series has a radius of convergence of $r$.
Does the radius of convergence mean that for every $x\in(a-r,a+r)$, $p(x)$ approximates $f(x)$ perfectly (assuming the series is taken with infinitely many terms), so that as long as $x$ is in the interval of convergence, $p(x)=f(x)$? Also, is there some other meaning to the radius/interval of convergence (in this context)?
No, convergence does not mean that the series is exact there. For example, take the function which is $0$ at $x = 0$ and $e^{-1/x^2}$ for $x \neq 0$. It is infinitely differentiable, and every one of its derivatives at $0$ is $0$, so its Taylor series centered at $x=0$ is identically zero. That series has infinite radius of convergence, and yet it agrees with the function only at the single point $x = 0$.
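A quick numerical sketch of this (the function and its zero Taylor series are taken from the example above; the sample point $x=0.5$ is just an illustration):

```python
import math

def f(x):
    # Smooth function whose derivatives at 0 all vanish
    return 0.0 if x == 0 else math.exp(-1.0 / x**2)

# Its Taylor series at 0 is therefore identically zero:
def p(x):
    return 0.0

x = 0.5
print(f(x))  # e^{-4}, clearly nonzero
print(p(x))  # 0.0 -- the "convergent" Taylor series misses the function
```

The series converges everywhere (trivially, since every term is $0$), but it only equals $f$ at the origin.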
Sort of conversely, the standard Taylor series for $\ln(1+x)$, centered at $0$, diverges once $|x| > 1$, but we may center our Taylor series elsewhere and it will converge (and equal $\ln(1+x)$) on a new interval. A function that is locally equal to its Taylor series at every point of its domain is called analytic.
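To see both behaviors numerically, here is a sketch of the Taylor series of $\ln(1+x)$ centered at a general point $a$ (using the identity $\ln(1+x) = \ln(1+a) + \ln\bigl(1 + \frac{x-a}{1+a}\bigr)$, which gives radius of convergence $1+a$; the sample points and the choice $a=1.5$ are my own for illustration):

```python
import math

def log1p_series(x, a=0.0, terms=50):
    # Partial sum of the Taylor series of ln(1+x) centered at a:
    # ln(1+a) + sum_{n>=1} (-1)^(n+1) ((x-a)/(1+a))^n / n
    u = (x - a) / (1 + a)
    s = math.log(1 + a)
    for n in range(1, terms + 1):
        s += (-1) ** (n + 1) * u**n / n
    return s

print(log1p_series(0.5))          # inside the radius: close to ln(1.5)
print(log1p_series(2.0))          # outside the radius at a=0: blows up
print(log1p_series(2.0, a=1.5))   # recentered: converges to ln(3)
```

Centered at $0$ the partial sums at $x=2$ grow without bound, while recentering at $a=1.5$ puts $x=2$ well inside the new interval of convergence.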
But if the series converges at $x$ and the remainder $R_n(x) = f(x) - \sum_{k=0}^{n} \frac{f^{(k)}(a)}{k!}(x-a)^k$ tends to $0$ as $n \to \infty$, then the Taylor series is equal to the original function there.
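For a case where the remainders do go to $0$, consider $e^x$ (a standard example, not from the text above); its remainder at $x$ is bounded by $e^{|x|}|x|^{n+1}/(n+1)!$, which vanishes as $n$ grows:

```python
import math

def taylor_exp(x, terms):
    # Partial sum of the Taylor series of e^x centered at 0
    return sum(x**n / math.factorial(n) for n in range(terms))

x = 3.0
for n in (5, 10, 20, 30):
    # The remainder |e^x - p_n(x)| shrinks toward 0 as n increases
    print(n, abs(taylor_exp(x, n) - math.exp(x)))
```

Here the remainder estimate forces the series to equal $e^x$ for every real $x$, which is exactly the situation described above.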
I wrote a blog post for some of my calculus students about Taylor series before. In section $3$ I introduce a different function whose Taylor series doesn't approximate the function itself, and discuss it a bit.