As we all know, $\pi$ is the ratio of a circle's circumference to its diameter: divide the circumference by the diameter and the result is $\pi$. But here's my question:
When you enter the numbers into a PC, mainframe, or whatever, what numbers do you use? It seems to me that if you do not enter them PRECISELY, won't the calculated value of $\pi$ be incorrect?
Please explain.
Speaking as a computer scientist: a computer never divides a measured circumference by a measured diameter. The standard approach is to approximate $\pi$ with an infinite series, which lets you control the precision to an arbitrary number of digits (or bits) simply by summing more terms. For applications that do not require high mathematical precision (the majority of computer science), the programming language (e.g. Java, with `Math.PI`) already ships with predefined constants. Note that the same holds for any irrational number: to guarantee arbitrary precision in a computational application, you need some kind of 'generator' for the digits. The most common is an infinite series, but iterative methods such as those used for computing square roots work too.
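As a concrete sketch of the series approach, here is a short Python example (my own illustration, not from any particular library) that computes $\pi$ to a requested number of decimal digits using Machin's formula, $\pi/4 = 4\arctan(1/5) - \arctan(1/239)$, with each arctangent evaluated by its Taylor series in arbitrary-precision `decimal` arithmetic:

```python
from decimal import Decimal, getcontext

def arctan_recip(x, digits):
    """arctan(1/x) via its Taylor series:
    sum over k >= 0 of (-1)^k / ((2k+1) * x^(2k+1))."""
    getcontext().prec = digits + 10          # extra guard digits
    x = Decimal(x)
    x2 = x * x
    power = Decimal(1) / x                   # current value of 1/x^(2k+1)
    total = Decimal(0)
    eps = Decimal(10) ** -(digits + 5)       # stop once terms are negligible
    k = 0
    while power > eps:
        term = power / (2 * k + 1)
        total += term if k % 2 == 0 else -term
        power /= x2
        k += 1
    return total

def machin_pi(digits):
    """pi to `digits` significant digits via Machin's formula."""
    getcontext().prec = digits + 10
    pi = 4 * (4 * arctan_recip(5, digits) - arctan_recip(239, digits))
    return +pi                               # unary + rounds to current precision

print(machin_pi(50))
```

The key point is that precision is a parameter: asking for more digits just means summing more terms of the series, with no physical measurement involved anywhere.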