Consider any three consecutive positive integers. Prove that the cube of the largest cannot be the sum of the cubes of the other two.
Work: I tried to prove this by contradiction.
I let the three integers be $k$, $k+1$, and $k+2$, and set up the equation $(k+2)^3 = (k+1)^3 + k^3$.
Expanding both sides gives $k^3+6k^2+12k+8 = 2k^3+3k^2+3k+1$. Ultimately, I reach a dead end at $k^3-3k^2+9k = 7$.
I do not know where to go from here, and it seems I took the wrong route at the beginning. I am currently learning about linear combinations, the division algorithm, and the Euclidean algorithm, but I do not see any way to use those on this problem.
You have a cubic (actually, you made an arithmetic error; it should be $k^3 - 3k^2 - 9k = 7$). Rewrite it as $k^3-3k^2-9k-7=0$, and ask whether this cubic has an integer root. By the rational root theorem, any integer root must be a factor of the constant term $7$, so there are only four possibilities: $\pm 1$ and $\pm 7$. Checking them, none is a root, so no (positive) integer $k$ satisfies the equation.
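Not part of the original answer, but a quick sketch in plain Python (my own check) running the rational-root search described above:

```python
# Candidate integer roots of k^3 - 3k^2 - 9k - 7 = 0.
# By the rational root theorem, an integer root must divide the
# constant term 7, so the only candidates are +-1 and +-7.
def p(k):
    return k**3 - 3 * k**2 - 9 * k - 7

candidates = [1, -1, 7, -7]
roots = [k for k in candidates if p(k) == 0]
print(roots)  # empty list: the cubic has no integer root
```

Since the list of roots comes back empty, $(k+2)^3 = (k+1)^3 + k^3$ has no integer solution, which is the contradiction the proof needs.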
By the way, the arithmetic is slightly easier if you use $k+1$, $k$, and $k-1$.
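To illustrate that suggestion (my own working, not spelled out in the answer): with $k-1$, $k$, and $k+1$, the equation $(k+1)^3 = k^3 + (k-1)^3$ rearranges to $k^3 - 6k^2 - 2 = 0$, so any integer root must divide $2$. A quick check:

```python
# Symmetric setup: (k+1)^3 = k^3 + (k-1)^3 rearranges to
# q(k) = k^3 - 6k^2 - 2 = 0 (my own expansion, not from the answer above).
def q(k):
    return k**3 - 6 * k**2 - 2

# Integer roots must divide the constant term 2; none of the
# four candidates works.
assert all(q(k) != 0 for k in (1, -1, 2, -2))

# Sanity check: the rearranged cubic agrees with the original equation.
for k in range(2, 1000):
    assert ((k + 1)**3 == k**3 + (k - 1)**3) == (q(k) == 0)
```

The smaller constant term ($2$ instead of $7$) is what makes the arithmetic in this version slightly lighter.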