I'm a high school student, so please point out my mistakes nicely and in layman's terms :) Thanks!
Ok. Beal's Conjecture: If
$$a^x+b^y=c^z$$ where $a$, $b$, $c$, $x$, $y$, $z$ are whole numbers; $x, y, z > 2$; and $a$, $b$, and $c$ are natural numbers (any counterexample would presumably involve values far larger than those reached by counterexample searches so far); then $a$, $b$, and $c$ must have at least one common factor.
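As a sanity check (not part of the argument itself), here is a small brute-force search, with arbitrary bounds of my own choosing, that finds every solution of $a^x+b^y=c^z$ with all exponents at least 3 in a tiny range, and confirms each one has $\gcd(a,b,c)>1$:

```python
from math import gcd

def beal_search(max_base=20, max_exp=6):
    """Find all (a, x, b, y, c, z) with a^x + b^y = c^z, exponents >= 3,
    bases up to max_base, exponents up to max_exp (bounds are arbitrary)."""
    # Precompute c^z values so each sum is a single dictionary lookup.
    powers = {}
    for c in range(1, max_base + 1):
        for z in range(3, max_exp + 1):
            powers.setdefault(c ** z, []).append((c, z))
    solutions = []
    for a in range(1, max_base + 1):
        for x in range(3, max_exp + 1):
            for b in range(a, max_base + 1):  # b >= a avoids mirror duplicates
                for y in range(3, max_exp + 1):
                    for c, z in powers.get(a ** x + b ** y, []):
                        solutions.append((a, x, b, y, c, z))
    return solutions

sols = beal_search()
common = all(gcd(gcd(a, b), c) > 1 for a, x, b, y, c, z in sols)
print(len(sols), "solutions found; all share a common factor:", common)
```

In this range the search finds familiar examples like $2^3+2^3=2^4$ and $3^3+6^3=3^5$, and every hit does share a common factor, consistent with the conjecture.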
Let's begin!
Since $x$, $y$, and $z$ are each larger than 2, we can write $x = p+2$, $y = q+2$, $z = r+2$, and the equation becomes: $$a^2a^p+b^2b^q=c^2c^r$$ with $p$, $q$, $r$ whole numbers.
Dividing by $c^r$ and taking the square root yields: $$a^2a^p/c^r+b^2b^q/c^r=c^2$$ $$a(a^p/c^r)^{1/2} +b(b^q/c^r)^{1/2}=c$$

Now, for a bit of logical analysis. If $r$ is not even, so that $c^r$ is not a square, then $c$ would be irrational, because $c^r$ has no common factor with either $a^p$ or $b^q$; thus, even if $a^p$ and $b^q$ were squares, this would contradict the conjecture. The same reasoning applies to $a$ and $b$ themselves.

Could $a^p$, $b^q$, $c^r$ all be non-squares? Nope. Consider $$m^{1/2} + n^{1/2}= I.$$ If $m$ and $n$ are not perfect squares and have no common factor, and $I$ is an integer, this equation is impossible, since squaring both sides gives $$m + n + 2(mn)^{1/2}=I^2,$$ and $2(mn)^{1/2}$ is irrational under our prerequisites. As the same reasoning applies when $a^p$, $b^q$, and $c^r$ are non-squares, we can discard that case, and we are left with the conclusion that they are all squares.
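The key lemma here, that $m^{1/2}+n^{1/2}$ cannot be an integer unless $m$ and $n$ are both perfect squares, can be spot-checked exhaustively for small values (an illustrative sketch with an arbitrary bound; the integrality test is exact because it works with integers throughout):

```python
from math import isqrt

def is_square(n):
    """True if n is a perfect square."""
    return isqrt(n) ** 2 == n

def sqrt_sum_is_integer(m, n):
    # sqrt(m) + sqrt(n) = I implies I^2 = m + n + 2*sqrt(m*n),
    # so m*n must be a perfect square (say s^2), and m + n + 2s
    # must itself be a perfect square.
    if not is_square(m * n):
        return False
    s = isqrt(m * n)
    return is_square(m + n + 2 * s)

BOUND = 300  # arbitrary bound for the spot-check
for m in range(1, BOUND + 1):
    for n in range(1, BOUND + 1):
        if not (is_square(m) and is_square(n)):
            assert not sqrt_sum_is_integer(m, n)
print("lemma holds for all m, n up to", BOUND)
```

(The check doesn't even need the no-common-factor assumption; for small $m$, $n$ the sum of two square roots is never an integer unless both are perfect squares.)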
Meaning we can rewrite the original equation from this: $$a^2a^p + b^2b^q=c^2c^r$$ into $$a^{2(e+1)} + b^{2(f+1)}=c^{2(g+1)}$$ where
$p=2e$
$q=2f$
$r=2g$, AND at the same time the previous equation $$a(a^p/c^r)^{1/2} +b(b^q/c^r)^{1/2}=c$$ becomes $$a(a^e/c^g) + b(b^f/c^g)=c,$$ translating into $$a^{e+1} + b^{f+1}=c^{g+1}.$$ Compare the transformations of the above two original equations. If we blend the results of both of them together, that would mean: $$a^{2e+2} + b^{2f+2}=(a^{e+1} + b^{f+1})^2$$ Expanding the square: $$a^{2e+2} + b^{2f+2}=a^{2e+2}+2a^{e+1}b^{f+1}+b^{2f+2}$$ so $$2a^{e+1}b^{f+1}=0$$ $$ab=0.$$ Therefore at least one of $a$ and $b$ must be zero, which contradicts the prerequisite that they are natural numbers, proving Beal's million-dollar conjecture. I know it seems far too easy, so there must be some mistake somewhere.
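For what it's worth, the binomial expansion in that last step is ordinary algebra, and can be spot-checked numerically over sample values (this only verifies the identity $(x+y)^2=x^2+2xy+y^2$ in this notation, with arbitrary sample ranges; it says nothing about the rest of the argument):

```python
# Spot-check the expansion (a^(e+1) + b^(f+1))^2
#   = a^(2e+2) + 2*a^(e+1)*b^(f+1) + b^(2f+2)
# over small sample values (bounds are arbitrary).
for a in range(1, 6):
    for b in range(1, 6):
        for e in range(0, 4):
            for f in range(0, 4):
                lhs = (a ** (e + 1) + b ** (f + 1)) ** 2
                rhs = (a ** (2 * e + 2)
                       + 2 * a ** (e + 1) * b ** (f + 1)
                       + b ** (2 * f + 2))
                assert lhs == rhs
print("expansion verified on all sample values")
```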