In the real world, do we ever need to worry about convergence and the like? I am not talking about whether recursive functions and such terminate, but convergence in analysis. It seems like the finitude of the universe makes questions like that meaningless. I ask because it often seems like physicists and statisticians are very lax about convergence. I know physicists might seem to care about it every once in a while (wave functions must be normalizable, i.e. in $L^2$), but it doesn't appear to be truly important.
So what are some real world reasons for concerning ourselves with convergence?
Whenever you use a numerical method to approximate something, you'd like to know that your numerical answer will be close to the actual value. A common situation is that the numerical approximation is $A(n)$ where $n$ is a parameter (e.g. the number of steps that are used). If the true answer is $T$, you'd like to know that $\lim_{n \to \infty} A(n) = T$, which says that you can ensure that your approximation is as close as desired to the true answer by taking $n$ large enough.
Of course you'd really like to have more detailed information (i.e. for a given tolerance $\epsilon$, how large to take $n$ in order to have $|A(n) - T| < \epsilon$), but the fact that the limit is $T$ is a good start - if it were not true, it would mean that if you want really good approximations, you should look for a different method.
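To make this concrete, here is a small sketch (my own illustrative example, not from any particular library) where $A(n)$ is the midpoint-rule approximation of $T = \int_0^1 \frac{4}{1+x^2}\,dx = \pi$ with $n$ steps, so you can watch $|A(n) - T|$ shrink as $n$ grows:

```python
import math

def A(n):
    # Midpoint-rule approximation of T = ∫₀¹ 4/(1+x²) dx = π using n steps.
    h = 1.0 / n
    return h * sum(4.0 / (1.0 + ((i + 0.5) * h) ** 2) for i in range(n))

T = math.pi
for n in (10, 100, 1000):
    # The error |A(n) - T| should decrease as n increases,
    # which is exactly what convergence A(n) → T promises.
    print(n, abs(A(n) - T))
```

If the method did not converge, no amount of extra computation (larger $n$) would buy you a better answer, which is the practical content of the limit statement above.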