When proving the irrationality of the square root of a prime such as $\sqrt 2$ (by showing that $(p/q)^2 = 2$ is impossible for integers $p$ and $q$), the proof ultimately came down to $p^2 = 2q^2$, and because 2 is a prime number, $p$ had to be a multiple of 2. My question is whether that is still the case when proving the irrationality of roots of nonprime numbers such as $\sqrt{27}$. Does $p^2 = 27*q^2$ imply that $p=27*r$, and would this work for all nonprime numbers that aren't squares of another number?
In most proofs I saw online, people used $p=27*r$, but I don't understand why $p$ has to be a multiple of 27. From my experimentation, I believe that $p=9*r$ would still work for the proof. Please let me know whether this proof is valid or if there are any mistakes in it.
Assume $\sqrt{27}$ is rational, so $p/q=\sqrt{27}$ for some integers $p$ and $q$ with no common factor.
$p^2/q^2=27$
$p^2=27*q^2$
$p=9*r$ (where $r$ is an integer)
$81*r^2=27*q^2$
$q^2=3*r^2$, so $q=3*s$ (where $s$ is an integer)
Because $p$ is a multiple of 9 and $q$ is a multiple of 3, both are divisible by 3, which contradicts the original assumption that $p/q$ is a fraction in lowest terms.
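Not a proof, but a quick brute-force sanity check of the $p=9*r$ step (a small script of my own, checking only small values of $p$):

```python
# Empirical check (not a proof): whenever 27 divides p^2 for small p,
# p itself turns out to be a multiple of 9, matching the p = 9*r step.
for p in range(1, 100_000):
    if (p * p) % 27 == 0:
        assert p % 9 == 0, p
print("checked: 27 | p^2 implies 9 | p for all p below 100000")
```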
Yes, a similar approach can be used whenever $n$ is not a square of an integer.
Since $n$ is not the square of an integer, some prime factor must appear an odd number of times in the factorization of $n$. But every prime factor of $p^2$ and of $q^2$ appears an even number of times, so in $p^2 = n*q^2$ that prime appears an even number of times on the left-hand side and an odd number of times on the right-hand side, contradicting unique factorization.
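To illustrate the parity argument, here is a short sketch (the helper names are my own) that factors $n$ and checks whether every prime exponent is even:

```python
# Sketch of the parity argument: n is a perfect square exactly when
# every prime in its factorization appears an even number of times.
def prime_exponents(n):
    """Return {prime: exponent} for the factorization of n by trial division."""
    exps = {}
    d = 2
    while d * d <= n:
        while n % d == 0:
            exps[d] = exps.get(d, 0) + 1
            n //= d
        d += 1
    if n > 1:
        exps[n] = exps.get(n, 0) + 1
    return exps

def is_square_by_parity(n):
    """True iff every prime exponent of n is even, i.e. n is a perfect square."""
    return all(e % 2 == 0 for e in prime_exponents(n).values())

print(prime_exponents(27))      # {3: 3} -- the prime 3 appears an odd number of times
print(is_square_by_parity(27))  # False, so sqrt(27) is irrational
print(is_square_by_parity(36))  # True: 36 = 2^2 * 3^2
```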