Why are mathematicians still trying to calculate digits of pi?


Why do mathematicians still try to calculate the digits of $\pi$?

In 2019, Haruka Iwao calculated the world's most accurate value of $\pi$, comprising $31.4$ trillion digits and far surpassing the previous record of $22$ trillion.


We all know that the decimal expansion of $\pi$ never ends, so what is the point of the struggle?


There are 2 answers below.


Here are a couple of reasons I know of.

Firstly, there is a big open question about the normality of $\pi$. Essentially, mathematicians are trying to prove that the digits of $\pi$ are "random" in a certain sense: as you go further into the decimal expansion, any given string of digits should occur with approximately the same frequency as any other string of the same length. For example, the digit string $015$ should occur roughly as often as $657$ as you look further into the expansion (assuming $\pi$ is normal).

As a corollary, if $\pi$ is indeed normal, every finite sequence of digits should occur in $\pi$'s decimal expansion. Some people outside the mathematics community have taken a keen interest in this fact, judging by the following meme:

(meme image not reproduced here)

(Disclaimer: this is still based on conjecture, and even if $\pi$ proves to be normal, it doesn't mean we can find and extract any of this information from it! Also, none of this follows from being an "infinite, non-repeating decimal".)
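To get a feel for what "searching the digits" looks like in practice, here is a small sketch in Python. It computes digits via Machin's formula, $\pi = 16\arctan(1/5) - 4\arctan(1/239)$, using arbitrary-precision integers; the record computations rely on much faster series, such as Chudnovsky's, so this is only an illustration.

```python
def pi_digits(n):
    """First n decimal digits of pi after the point, as a string.

    Machin's formula, pi = 16*arctan(1/5) - 4*arctan(1/239),
    evaluated with Python's arbitrary-precision integers.
    """
    guard = 10                      # extra digits to absorb truncation error
    scale = 10 ** (n + guard)

    def arctan_inv(x):              # arctan(1/x) * scale, integers only
        total, term, k, sign = 0, scale // x, 1, 1
        while term:
            total += sign * (term // k)
            term //= x * x
            k += 2
            sign = -sign
        return total

    pi_scaled = 4 * (4 * arctan_inv(5) - arctan_inv(239))
    return str(pi_scaled // 10 ** guard)[1:]   # drop the leading "3"

digits = pi_digits(1000)
# Search for any string you like; a run of six 9s (the "Feynman point")
# already turns up within the first thousand decimal places.
print(digits.find("999999") + 1)   # 1-based position in the expansion
```

Of course, finding a short string by brute-force search is very different from the (conjectural) claim that *every* finite string eventually appears.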

By computing large numbers of digits, mathematicians can search through them for various digit strings and count the occurrences, giving empirical evidence for normality. If we started to see that, say, $1962$ occurs ten times more often than $6668$, this would be a reasonable starting point for disproving normality (and would open up the fascinating question of which strings are more likely than others, and why).
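A crude empirical check of this kind can be sketched in a few lines of Python (again using Machin's formula with big integers; serious surveys of course use billions or trillions of digits, not ten thousand):

```python
from collections import Counter

def pi_digits(n):
    """First n decimal digits of pi after the point, as a string,
    via Machin's formula with arbitrary-precision integers."""
    guard = 10
    scale = 10 ** (n + guard)

    def arctan_inv(x):              # arctan(1/x) * scale, integers only
        total, term, k, sign = 0, scale // x, 1, 1
        while term:
            total += sign * (term // k)
            term //= x * x
            k += 2
            sign = -sign
        return total

    pi_scaled = 4 * (4 * arctan_inv(5) - arctan_inv(239))
    return str(pi_scaled // 10 ** guard)[1:]

digits = pi_digits(10_000)

# Count every overlapping 3-digit window. If pi is normal, each of the
# 1000 possible strings should appear roughly 10 times in 10,000 digits.
counts = Counter(digits[i:i + 3] for i in range(len(digits) - 2))
print(counts["015"], counts["657"])
```

With only $10{,}000$ digits the counts fluctuate quite a bit around the expected value; a longer string like $1962$ would need far more digits before its count is statistically meaningful.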

The second reason I know of is that large-scale computations are, in themselves, intellectual achievements. It's not nearly as simple as feeding an algorithm into a powerful computer and letting it run for a long time. You need extreme precision, far beyond the hardware's native floating-point types, to guarantee the accuracy of the digits, and such high-precision numbers become increasingly expensive to work with. At this scale, even a rare error thrown by the hardware is significant, so you need some kind of redundancy in place to minimise the chance of errors creeping in. There are actual papers published on the methodology of these computations (usually with links to the digits themselves online).
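To see why native machine precision is nowhere near enough, compare a 64-bit float with an arbitrary-precision computation. This sketch uses Machin's formula with Python's big integers purely as an illustration; the actual records use faster series and heavily optimised software (the 2019 record was set with the y-cruncher program).

```python
import math

# A 64-bit double carries only about 15-17 significant decimal digits;
# everything printed beyond that is noise.
print(f"{math.pi:.30f}")

def pi_to(n):
    """pi to n decimal places as a string, via Machin's formula
    with Python's arbitrary-precision integers."""
    guard = 10                      # extra digits to absorb truncation error
    scale = 10 ** (n + guard)

    def arctan_inv(x):              # arctan(1/x) * scale, integers only
        total, term, k, sign = 0, scale // x, 1, 1
        while term:
            total += sign * (term // k)
            term //= x * x
            k += 2
            sign = -sign
        return total

    s = str(4 * (4 * arctan_inv(5) - arctan_inv(239)) // 10 ** guard)
    return s[0] + "." + s[1:]

print(pi_to(50))
# 3.14159265358979323846264338327950288419716939937510
```

Every digit past the float's ~16 significant figures has to come from software-level arithmetic like this, and the cost of each operation grows with the precision, which is part of what makes trillion-digit runs genuinely hard.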

The point is, $\pi$ itself isn't always the object of interest; it's the way we tackle the problem of computing it. The techniques developed can then be applied to other large-scale computations of other numbers when the need arises.


To find a record-breaking number of digits of $\pi$, you have to

  1. know something about $\pi$ that no one else knows, or

  2. know something about programming that no one else knows, or

  3. know something about utilizing computational resources that no one else knows.

In any of these events, what you know might be useful in other contexts, so it's worth knowing.