The arguments I've seen for certain square roots being irrational (like $\sqrt2$) boil down to a proof by contradiction: if we assume $p/q$ to be fully reduced, we find that both $p$ and $q$ must be even (in the case of $\sqrt2$), and :boom: contradiction.
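For concreteness, the version of the argument I have in mind runs: suppose $\sqrt2 = p/q$ with $\gcd(p,q)=1$. Then $p^2 = 2q^2$, so $p^2$ is even, hence $p$ is even; write $p = 2k$. Substituting gives $4k^2 = 2q^2$, i.e. $q^2 = 2k^2$, so $q$ is even as well, contradicting the assumption that $p/q$ was fully reduced.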
But if I allow $p$ to have an infinite number of digits (a valid member of the integers, I assumed), then it has no last digit and cannot be classified as even or odd. Right? The proof by infinite descent, for example, seems to assume we can't descend infinitely, but why not? I must be missing something important here about the integers.
Similarly, why not define $\pi$ as the fraction $314159\ldots/100000\ldots$? I believe both the numerator and denominator are valid integers.
I've often heard people loosely describe the rational numbers as the set of numbers whose decimal representations terminate or repeat, but I don't see what limits the construction of an arbitrarily large integer by an infinite process. Let's, for example, attempt to construct the numerator of the quotient above:
Step 1: Construct the number 3 ($1+1+1 = 3$)
Step 2: Construct 31 ($3+1+1+\cdots+1$, with $28$ more $1$s)
Step 3: Construct 314 (you get the idea)
It seems I could continue the process above indefinitely and produce a valid integer at every step. Because the process is infinite, any one step yields a finite integer, but the number the process as a whole describes is unbounded.
Is the number this process describes then not an integer?
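The steps above can be sketched in a few lines of Python. Since a program can only ever hold finitely many digits, the sketch uses a hard-coded finite prefix of $\pi$'s digits; the point is that every individual step is an ordinary finite integer, and only the "limit" of the process fails to be one:

```python
# Each step of the construction yields a perfectly ordinary finite integer.
# PI_DIGITS is a hard-coded finite prefix of pi's digits, for illustration.
PI_DIGITS = "31415926535897932384"

def partial(n):
    """Integer formed from the first n digits of pi."""
    return int(PI_DIGITS[:n])

steps = [partial(n) for n in range(1, 7)]
print(steps)  # [3, 31, 314, 3141, 31415, 314159]
```

Each `partial(n)` is finite, yet the sequence grows without bound, which is exactly the tension the question is about.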
**Update:** As commenters and answerers of this question have pointed out, my question assumes that a number with an infinite number of digits can be called an integer. That underlying assumption has confused some readers of my question, so I've edited the above in an attempt to make the meat of my question clearer.
To that end, I'd like to point out that the essence of my question is either very similar to or a duplicate of *A "number" with an infinite number of digits is a natural number?*.
Neither $314159\ldots$ nor $100000\ldots$ is a valid integer. Integers are characterized by their decimal representations being finite sequences of digits; an integer cannot be infinite. That's exactly what we mean by the word "integer".
The integers are counting numbers (and their negatives); that is to say, they are numbers one might use to count. You can't count to $100000\ldots$; for one thing, there's no "previous" number to count from.

The key idea is that the integers are the numbers we can "see": I can show you $43$ sheep, and in principle I could show you $43000000000$ sheep, but I can't show you a convincing $\pi$ sheep. Rational numbers are numbers we can build using comparisons between the numbers we can "see": Jack can have $1.5$ times as many sheep as Jill, but he can't have $\sqrt{2}$ times as many sheep as Jennifer, no matter how many sheep Jennifer has.

Allowing integers to have infinitely many digits would defeat this "concrete" idea of what an integer is, so we define integers to be only finite.
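As a small illustration of this (a finite sanity check, not a proof), you can verify by brute force that no fraction with a modest denominator squares to exactly $2$, using Python's exact rational arithmetic:

```python
from fractions import Fraction
from math import isqrt

def rational_sqrt2_candidates(max_q):
    """Search for p/q with 1 <= q <= max_q whose square is exactly 2.

    For each q, only the two integers nearest q*sqrt(2) can possibly
    work as numerators, so those are the only candidates checked.
    """
    hits = []
    for q in range(1, max_q + 1):
        p = isqrt(2 * q * q)          # floor(q * sqrt(2))
        for cand in (p, p + 1):
            if Fraction(cand, q) ** 2 == 2:
                hits.append((cand, q))
    return hits

print(rational_sqrt2_candidates(1000))  # []
```

No denominator works, however far you search; the even/even contradiction in the question is what turns this empirical pattern into a theorem.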