My friend includes the decimal point as a digit of pi. Is this right? He says the first 5 digits of pi are 3.141 because he counts the decimal point as a digit. I told him that decimal point does not count as a digit. Who is correct, me or my friend? Does the decimal point count as a digit?
Does decimal point count as a digit of pi?
985 views, asked by user626848 (https://math.techqa.club/user/user626848/detail)
A digit is defined as:
The whole numbers from 0 to 9 and the Arabic numerals representing them, which are combined to represent base 10 numbers.
The decimal point is not a whole number, and thus the first 5 digits of $\pi$ are $3.1415$.
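Under that definition, counting digits means counting only the characters $0$ through $9$ and skipping the point. A minimal Python sketch of this (my own illustration, not part of the original answer):

```python
# Extract the digits (characters 0-9) from a numeral written as a string.
# The decimal point is skipped: it is notation, not a digit.
def digits_of(numeral):
    return [ch for ch in numeral if ch.isdigit()]

print(digits_of("3.1415"))       # ['3', '1', '4', '1', '5']
print(len(digits_of("3.1415")))  # 5 digits, even though the string has 6 characters
```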
Mostly touching on D.B.'s answer: the decimal point is notation, not a digit. It holds no inherent numeric value.
You might be familiar with how, in Europe and some other places, a number may be written with a comma in lieu of a decimal point, i.e. $\pi = 3,14159...$. This is again notation. The key point is that it doesn't change the value of the number whatsoever.
In a way, we can even argue that, semantically, removing the decimal point altogether wouldn't change the value, so long as we made a way to determine place value.
To elaborate further, in the base $10$ system, if a number is given by, for example,
$$...d_2d_1d_0.d_{-1}d_{-2}...$$
we can represent the number as
$$\sum_{k\in\mathbb{Z}} d_k 10^k$$
i.e. the sum over all integers $k$, where $d_k \in \{0, 1, 2, ..., 9\}$. We can even generalize to other bases $b$ by replacing $10$ with $b$ and limiting $d_k$ to be any of the digits allowed in base $b$.
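That formula can be checked directly. Here is a small Python sketch of my own (the function name and pair representation are just illustrative choices), using exact fractions so that negative powers of the base stay exact:

```python
from fractions import Fraction

# Evaluate sum_k d_k * base**k from (exponent, digit) pairs.
# No decimal point appears anywhere: each digit carries its place value k.
def from_digits(digit_places, base=10):
    return sum(Fraction(d) * Fraction(base) ** k for k, d in digit_places)

# 3.14 written as d_0 = 3, d_{-1} = 1, d_{-2} = 4
x = from_digits([(0, 3), (-1, 1), (-2, 4)])
print(x, float(x))  # 157/50 3.14

# The same digit list in base 8: 3 + 1/8 + 4/64
y = from_digits([(0, 3), (-1, 1), (-2, 4)], base=8)
print(float(y))  # 3.1875
```

Note that the order of the pairs doesn't matter, exactly as the sigma notation suggests: the exponent, not the position in a written string, determines each digit's place value.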
Notice how, in this sense, there is no "decimal point"; this is an equivalent way to look at the digits of a number. Pedantic, and not quite as easy to parse, but it's valid. The decimal point arises when we write a number in the first form because, without it, how would you tell where the exponent $k$ becomes negative? We could represent it differently as
$$\sum_{k\in\mathbb{N}_0} d_k 10^k + \sum_{k\in\mathbb{Z}^-} d_k 10^k $$
i.e. splitting it up by $k \geq 0$ and $k < 0$, again abandoning the decimal point altogether. It's just that the decimal point is easier to work with in common applications, so that we don't see sigmas literally everywhere. We can abandon the decimal point, but we don't want to, either.
The decimal point definitely has value - just not value in the sense of being a number. It is not "less than," "greater than," or "equal to" anything; it holds no place value, and we could abandon it altogether if we wanted to make math really, really inaccessible.
But since we can abandon it altogether and not change the value of a number, and of course since the decimal point is not itself a number, it is definitely not a digit of anything.
So $3.0$? Two digits. $3.14159$? Six digits.
(P.S. Though the distinction between "character" and "digit" may need to be introduced. "3.0" takes three characters to write, after all: a 3, a decimal point, and a 0. Being a digit comes with the implication of having value, however - specifically, being a number in the set $\{0, 1, ..., b-1\}$ in base $b$ (assuming $b$ is a positive integer). And since a decimal point is not that, it is not a digit.)
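The character/digit distinction is easy to make concrete (again just an illustration of mine, not part of the answer):

```python
numeral = "3.0"
characters = len(numeral)                     # every symbol counts: '3', '.', '0'
digits = sum(ch.isdigit() for ch in numeral)  # only '3' and '0' count
print(characters, digits)  # 3 2
```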
(P.P.S. Though you could even argue that how we denote numbers is arbitrary: why do we choose the symbol $1$, rather than some other mark, to represent what it does? So you could invent a system of writing in which the decimal point does have a numeric or place value. But the moral of this entire rant is: the decimal point, in our way of writing in the Arabic number system, is a notation, not a digit.)