It seems that since a classical computer has a finite number of transistors, it can only represent numbers up to some finite value. So a classical computer couldn't start at 1 and count up to an arbitrary number. Can anyone confirm this or provide reasons otherwise?
Can a classical computer count beyond a finite number?
120 Views · Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) · There are 2 best solutions below
Yes, for any actual physical computer there is a finite number beyond which it cannot count, because there are only finitely many physical states it can be in.
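As a minimal illustration (the 8-bit width here is just a stand-in for any fixed amount of hardware state): a counter confined to a fixed number of bits can only pass through finitely many values before it wraps around.

```python
# An 8-bit register has only 2**8 = 256 distinct states, so a counter
# confined to it wraps around instead of counting onward.
WIDTH = 8
counter = 2**WIDTH - 1           # 255, the largest 8-bit value
counter = (counter + 1) % 2**WIDTH
print(counter)                   # 0: the count has "overflowed" back to zero
```

Real hardware registers behave the same way; only the width differs.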
I'm assuming you're also, at least a little bit, asking a different question:
If all physical computers are finite, why do we model them with things like Turing machines, which have infinite amounts of memory?
That's because at the stage when we're using mathematical models like Turing machines, the memory size of physical computers is irrelevant. Generally:
- The first constraint on what computers can do is that some problems are undecidable: even a Turing machine will never be able to tell you the answer.
- Many decidable problems of practical interest are limited more by time than by space.
- Usually, algorithms scale: the same idea works on problems of any size, even if some instances are too large for today's physical computers.
We do sometimes consider memory limitations. (For example, PSPACE is the set of all problems for which the memory needed to solve a length-$n$ instance is bounded by some polynomial in $n$.) But even then, we can just take a model with infinite memory, and then once it's done solving a problem, ask: how much memory did it use?
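This "run first, measure afterwards" viewpoint is easy to see in Python, whose integers already behave like the idealized unbounded-memory model: we can carry out a computation and only then ask how much space its result occupied (a toy illustration, not a formal space-complexity measurement).

```python
# Python's arbitrary-precision ints mimic the unbounded-memory model:
# compute first, then ask how much storage the result needed.
n = 1
for _ in range(1000):
    n *= 2                   # n = 2**1000, far beyond any fixed-width register

print(n.bit_length())        # 1001: the result needed 1001 bits of storage
```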
The counting limitation for a classical computer comes from available memory, not from available processing power. For example, there are simple algorithms that sequentially read the digits of an arbitrarily long binary number $N$ while simultaneously writing the digits of $N+1$. Rinse and repeat and your computer can count forever (in principle). There is no need to hold the whole of $N$ in the computer's working memory at once, so running the algorithm is not limited by the 'size' or complexity of the computer itself.
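A sketch of such an increment algorithm (the list-of-bits representation is my own choice for illustration): it streams over the digits one at a time, carrying only a single bit of state, so the working memory is constant no matter how long the number grows.

```python
def increment(digits):
    """Increment a binary number given as a list of bits, least
    significant bit first. Each digit is read once and written once,
    so only O(1) working state (the carry) is ever held at a time."""
    out = []
    carry = 1
    for bit in digits:
        total = bit + carry
        out.append(total % 2)    # write the next digit of N+1
        carry = total // 2       # propagate the carry
    if carry:
        out.append(1)            # the number grew by one digit
    return out

# Count upward from zero; only the written-out digits accumulate.
n = [0]
for _ in range(5):
    n = increment(n)
print(n)                         # [1, 0, 1]: binary 101 = 5, LSB first
```

Only the output tape (the stored digits) grows; the "processor" part of the loop stays the same size forever, which is exactly the point being made above.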
The limitation lies in the space required to store the 'written' digits. With unlimited external memory the computer could, in principle, read and write (count) forever, but in practice external memory is bounded by how much data you can store in all the matter available.
So, in practice, the limitation is not in the computer itself, but in the storage capacity of the visible universe.