It's easy to prove that there are more real numbers between 0 and 1 than there are integers, so in what way is this argument wrong?
Any number between 0 and 1 can be written as $0.abcde\ldots$ (where $a, b, c, \ldots$ are digits). This number "corresponds" to $2^a3^b5^c7^d11^e\cdots$ (the primes as bases and the digits as exponents). By unique prime factorization, distinct real numbers correspond to distinct integers. So there would be at least as many integers as real numbers between 0 and 1.
Which is obviously not true. What am I doing wrong here?
Irrational numbers have an infinite decimal expansion (and so do many rationals, such as $1/3 = 0.333\ldots$), so such a number would "correspond" to an infinite product in your coding. But natural numbers all have a FINITE representation as a product of primes, so the infinite product is not a natural number at all, and your map is only defined on the terminating decimals.
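To make this concrete, here is a minimal Python sketch of the proposed encoding (the `primes` generator and the digit-list representation are my own illustration, not part of the original argument). For a terminating decimal the product is finite and you get an honest integer; for a non-terminating expansion like $0.333\ldots$, longer and longer truncations just produce ever-larger integers with no finite value to converge to.

```python
def primes():
    """Naive generator of the primes 2, 3, 5, 7, 11, ..."""
    n = 2
    while True:
        if all(n % p for p in range(2, int(n**0.5) + 1)):
            yield n
        n += 1

def encode(digits):
    """Map a FINITE digit list [a, b, c, ...] (i.e. the decimal 0.abc...)
    to 2^a * 3^b * 5^c * ... -- a finite product, hence a natural number."""
    n = 1
    for d, p in zip(digits, primes()):
        n *= p ** d
    return n

# A terminating decimal such as 0.25 encodes to a genuine integer:
print(encode([2, 5]))        # 2^2 * 3^5 = 972

# But 1/3 = 0.333... never terminates; truncating at more and more digits
# gives a sequence of integers that grows without bound:
for k in (2, 4, 6):
    print(encode([3] * k))   # 216, 9261000, ...
```

The injectivity step in the question is fine for terminating decimals (unique factorization recovers the digits from the integer); the failure is exactly that `encode` has no finite output to assign to an infinite digit sequence.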