I was able to come up with a proof for this problem; however, my argument seems to work for any field of even order, not just fields of order an odd power of $2$, so I'm convinced something is wrong. Can someone verify it or point out the error in my reasoning?
Problem: Let $F$ be a field with $2^n$ elements, with $n$ odd. Show that for $a,b \in F$ that $a^2+ab+b^2=0$ implies that $a=0$ and $b=0$.
Proof: Suppose $a,b \in F$ and $a^2+ab+b^2=0$.
$\implies a^2+2ab+b^2 = ab$
$\implies \frac{2^n}{2}(a^2+2ab+b^2) = \frac{2^n}{2}ab$
$\implies \frac{2^n}{2}a^2+ 2^nab+\frac{2^n}{2}b^2 = \frac{2^n}{2}ab$
$\implies \frac{2^n}{2}a^2+\frac{2^n}{2}b^2 = \frac{2^n}{2}ab$ (since $F$ is a group under addition, the $|F|$-fold multiple of any element is the identity, so $2^n(ab) = 0$)
$\implies \frac{2^n}{2}(a^2+b^2) = \frac{2^n}{2}ab$
$\implies a^2+b^2 = ab$
$\implies a^2-ab+b^2 = 0 = a^2+ab+b^2$
$\implies -ab = ab \implies 2ab=0 \implies ab=0$.
Thus, $a=0$ or $b=0$. However, if just one of them is zero, then so is the other ($a=0 \implies a^2+ab+b^2 = 0 \implies b^2 = 0 \implies b=0$). Thus, $a=0$ and $b=0$.
QED
Anyway, if there is something wrong with this proof, could someone give me a subtle hint? I've been stuck on this seemingly simple problem for a while now.
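Incidentally, your suspicion is well founded: the statement really does fail when $n$ is even, so any argument that works for all even-order fields must have a gap. A quick brute-force sketch over $GF(4) = GF(2)[x]/(x^2+x+1)$ (elements encoded as 2-bit integers; the representation is my own choice, not part of the problem) turns up nonzero solutions:

```python
def gf4_mul(a, b):
    """Multiply two elements of GF(4), encoded as 2-bit polynomials over GF(2)."""
    p = 0
    for i in range(2):          # carryless (XOR-based) polynomial multiplication
        if (b >> i) & 1:
            p ^= a << i
    if p & 0b100:               # reduce modulo x^2 + x + 1
        p ^= 0b111
    return p

# Addition in characteristic 2 is XOR, so a^2 + ab + b^2 = 0 becomes:
solutions = [(a, b) for a in range(4) for b in range(4)
             if gf4_mul(a, a) ^ gf4_mul(a, b) ^ gf4_mul(b, b) == 0]
print(solutions)
```

For instance $(a, b) = (1, x)$ works in $GF(4)$: $1 + x + x^2 = 1 + x + (x + 1) = 0$.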
The mistake you make is:
$$\frac{2^n}{2}(a^2+b^2) = \frac{2^n}{2}ab \Rightarrow a^2+b^2 =ab $$
Note that your field has characteristic $2$, which means that $\frac{2^n}{2} = 2^{n-1}$ acts as $0$ on every element of $F$! You divide by $0$ again in the last line: in characteristic $2$, $2ab = 0$ holds for all $a, b$, so it tells you nothing about $ab$.
Hint $$a^3-b^3=(a-b)(a^2+ab+b^2)=0$$
Thus $a^3=b^3$, and you also know what $a^7, b^7$ are....
You asked for a subtle hint, so I didn't include more details; let me know if this is helpful or if you want more.
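One ingredient behind the hint can be checked numerically (a sketch of my own, not the answerer's full argument): for odd $n$, $3$ never divides $2^n - 1$, which is what makes cubing injective on the multiplicative group of $GF(2^n)$.

```python
from math import gcd

# For odd n, 2 ≡ -1 (mod 3), so 2^n - 1 ≡ -1 - 1 ≡ 1 (mod 3),
# hence gcd(3, 2^n - 1) = 1 and x -> x^3 is a bijection on GF(2^n)*.
for n in range(1, 20, 2):
    print(n, gcd(3, 2**n - 1))  # prints gcd 1 for every odd n
```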