I remember reading that the largest prime found so far has about 13 million digits, so I'm assuming there are computers that can handle such a number. But how high a number can a computer hold in its memory and perform calculations with?
How large a number can a computer handle?
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
It depends on what you mean by "hold in memory and perform calculations." It is easy to use (almost) half the memory of a computer to store one number in binary, use (almost) the other half to store another, and add them. The Titan supercomputer has about $2^{49}$ bytes of memory, or $2^{52}$ bits. You can then store two numbers of size up to $2^{2^{51}}$ and add them together.
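The idea above can be sketched at small scale in Python, whose integers are arbitrary precision and limited only by available memory. The one-million-bit size here is just an illustrative choice, far below any real memory limit:

```python
# Two numbers each occupying roughly a megabit of memory.
a = 2 ** 1_000_000
b = 2 ** 1_000_000

# Ordinary addition works regardless of size; the runtime grows only
# linearly with the number of bits.
c = a + b

# Doubling a power of two just increments the exponent.
print(c == 2 ** 1_000_001)  # → True
```

The same program works unchanged for numbers filling gigabytes of RAM; only time and memory, not the language, set the limit.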
Of course, we can represent much larger numbers symbolically and operate with them in specific ways. I can just write $2^{2^{1000}} \cdot 2^{2^{1000}}=2^{2^{1001}}$. The previous approach could not handle any number of that size. There are many notations for handling large numbers, but they can only handle special numbers of this size. $2^{2^{1000}}$ is a huge number, and there is no way to talk about most of the numbers of that size, only ones that have nice representations.
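A minimal sketch of what "symbolically" means here: rather than storing $2^{2^{1000}}$ itself (which would need $2^{1000}$ bits, impossible on any machine), we store only the exponent and push the arithmetic onto it. The class name `PowerOfTwo` is a hypothetical illustration, not a standard library type:

```python
class PowerOfTwo:
    """Represents 2**exponent without ever materializing the value."""

    def __init__(self, exponent):
        self.exponent = exponent  # itself an ordinary Python bigint

    def __mul__(self, other):
        # 2**a * 2**b = 2**(a + b): multiplication of the huge numbers
        # reduces to addition of their (much smaller) exponents.
        return PowerOfTwo(self.exponent + other.exponent)

x = PowerOfTwo(2 ** 1000)
y = x * x
print(y.exponent == 2 ** 1001)  # → True
```

The trade-off is exactly as the answer says: only numbers with such a nice closed form fit this representation, and only the operations that respect it (here, multiplication) are available.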
Apart from the amount of available memory and other characteristics of your computer, a lot also depends on the meaning of "handle": What exactly do we want to do?
Examples:
(1) If we want to perform primality testing, then millions of decimal digits is the current maximum (in "best" cases!).
(2) However, if we want to find prime factors of a number, then even 500 digits is already way too many. No computer can "handle" that in a reasonable time if we are faced with a worst-case scenario (factoring a semiprime with two roughly equal prime factors).
(3) On the other hand, if we mostly want to perform the four arithmetic operations, then in some scenarios computers can handle billions of digits; see e.g. One billion digits of pi.
All of the above assumes that "computer" means hardware + software. Indeed, we need a software implementation of arbitrary-precision arithmetic for tasks like the above.
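To illustrate why case (1) scales while case (2) does not: modular exponentiation, the core operation of probabilistic primality tests, costs only polynomial time in the number of digits (Python's three-argument `pow` provides it). A hedged sketch of the Miller-Rabin test, with two illustrative inputs chosen for this example:

```python
import random

def is_probable_prime(n, rounds=20):
    """Miller-Rabin probable-prime test. Each round is a few modular
    exponentiations, so the cost grows polynomially with the digit
    count of n, unlike factoring."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # Write n - 1 as d * 2**s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)            # fast modular exponentiation
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False            # a is a witness: n is composite
    return True

print(is_probable_prime(2 ** 127 - 1))  # a known Mersenne prime → True
print(is_probable_prime(2 ** 127 + 1))  # divisible by 3 → False
```

The test tells us *whether* $2^{127}+1$ is composite almost instantly, yet says nothing about its factors; for a hard 500-digit semiprime, finding those factors is exactly the part no computer can do in reasonable time.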
If, however, we restrict the meaning of "computer can handle" to hardware only, then we are usually left with about 15-30 significant decimal digits, depending on hardware type. (See Extended precision in Wikipedia.)
The largest number depends on the computer and operating system, and through software tricks it can be made quite large; 13 million digits seems fairly small. My Mac desktop running Mathematica reports a maximum machine number of $1.79769 \cdot 10^{308}$. Again, there are many tricks for increasing this limit, if required.
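Both hardware figures quoted above (roughly 15 significant digits, and a maximum near $1.8 \cdot 10^{308}$) are properties of IEEE 754 double precision and can be checked directly; Python exposes them via `sys.float_info`:

```python
import sys

# Decimal digits guaranteed to survive a round-trip through a double.
print(sys.float_info.dig)   # → 15

# Largest finite double: about 1.7976931348623157e+308, matching the
# "maximum machine number" Mathematica reports.
print(sys.float_info.max)

# Beyond ~16 significant digits, hardware floats round: the classic
# example is that 0.1 + 0.2 is not exactly 0.3.
print(0.1 + 0.2 == 0.3)     # → False
```

Anything past these limits requires the software arbitrary-precision layer discussed in the previous answer.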