I had the following - apparently straightforward - question on one of my past assignments:
Show that a field has no other ideals except $\{0\}$ and the field itself.
This was the proof I gave:
Let $I \ne \{0\}$ be an ideal of a field $R$. Then there exists a nonzero $a \in I$, and for every $b \in R$ we have $a(a^{-1} b) \in I$ by the definition of an ideal, since $a \in R$ is invertible so $a^{-1} b \in R$.
Since $a(a^{-1} b) = (a a^{-1}) b = b$ by associativity, it follows that $b \in I$ for all $b \in R$, so $I = R$. Therefore $\{ 0 \}$ and $R$ are the only ideals of $R$.
But when I got it back, the part "$a \in R$ is invertible and $a^{-1} b \in R$" was underlined with the comment that it "does not work". But why not? As far as I know, every nonzero element of a field is invertible under multiplication, multiplication in a field is closed, and multiplication in a field is associative. What has gone wrong?
Can someone look over my solution and explain where I made a mistake? Thanks!
You were done the second you said $a(a^{-1}b) \in I$. Since $b$ was chosen to be arbitrary, it meant you had just shown $R \subseteq I$ at which point you could have concluded that $R = I$.
Perhaps your grader made the same mistake I made on my first read-through: I misread the underlined phrase as claiming that $a^{-1} b$ is invertible, which of course isn't true if you had chosen $b = 0$.
But that's just speculation.
It appears your grader just made a mistake. It happens.
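For whatever it's worth, the statement itself is easy to sanity-check computationally. This is a sketch of a brute-force enumeration (my own illustration, not part of any proof): it lists every ideal of $\mathbb{Z}/n\mathbb{Z}$ by testing all subsets containing $0$ for closure under addition and under multiplication by arbitrary ring elements. When $n$ is prime the ring is a field and only the two trivial ideals survive; when $n$ is composite, extra ideals appear.

```python
from itertools import combinations

def ideals(n):
    """Brute-force all ideals of Z/nZ: subsets that contain 0 and are
    closed under addition and under multiplication by any ring element."""
    nonzero = range(1, n)
    found = []
    for size in range(n):
        for combo in combinations(nonzero, size):
            I = {0, *combo}
            add_closed = all((a + b) % n in I for a in I for b in I)
            mul_closed = all((r * a) % n in I for a in I for r in range(n))
            if add_closed and mul_closed:
                found.append(sorted(I))
    return found

print(ideals(5))  # field: only [0] and the whole ring
print(ideals(6))  # not a field: {0, 3} and {0, 2, 4} also appear
```

This is exponential in $n$, of course, so it is only a toy check for small moduli; but it makes the contrast between the field case and the non-field case concrete.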