My professor told me today that while my logic was good and all my steps after the assumption were correct, my argument was invalid because I was assuming what I wanted to prove. I don't see how, and neither do any of my peers. If someone could explain it to me, I would greatly appreciate it.
Using the axioms of ring theory, prove $b*0 = 0$.
First, I proved that 0*1 = 0. Then I did the following.
*Proof:* Suppose $a*b=a$ for all $b \in \mathbb{R}$. (This is where he said I made the error.)
By Additive Identity we have $(a+0)*b = a$.
By the Distributive Property we have $a*b + 0*b = a$.
By assumption, $a*b = a$, so we have $a+ 0*b = a$.
So $(-a)+a+0*b = (-a)+a$. By Associativity, we have $(-a+a)+0*b = (-a+a)$.
Then, $0+0*b = 0$ by Additive Inverses.
So $0*b = 0$ by Additive Identity.
By Commutativity of Multiplication, $0*b = b*0$, so $b*0 = 0$.
As I have been taught, we are allowed to make a basic assumption when writing proofs and then see what follows. What I have done is prove the following statement.
If $a*b =a$, $\forall b \in \mathbb{R} $, then $a=0$.
I don't see how this is different from the hint we were given: if $a+b=a,$ then $b=0.$ In both cases, we are allowed to assume the "if" statement is true and then show that the result follows. After talking to a lot of my peers, we are all of the same mind. I think we would all benefit from a discussion of why we cannot do this.
You didn't prove the property that was asked for. All you've shown is that *if* there is an element $a$ with that property, *then* this element must be zero.
Here is a simple proof of it in terms of basic ring properties:
$$b\cdot 0 = b\cdot(0 + 0)= b\cdot 0 + b\cdot 0$$ $$\implies -(b\cdot 0) + b\cdot 0 = \bigl(-(b\cdot 0) + b\cdot 0\bigr) + b\cdot 0$$ $$\implies 0 = 0 + b\cdot 0= b\cdot 0$$
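If it helps to see the axioms tracked mechanically, here is a sketch of the same chain in Lean 4 with Mathlib (assuming the standard Mathlib lemma names: `mul_add` is distributivity, `add_zero` the additive identity, and `add_left_cancel` packages the "add $-(b\cdot 0)$ to both sides" step):

```lean
import Mathlib.Algebra.Ring.Basic

-- Sketch: R is an arbitrary ring; note no assumption on b is ever made,
-- and commutativity of multiplication is not needed.
example {R : Type*} [Ring R] (b : R) : b * 0 = 0 := by
  -- b·0 + b·0 = b·(0 + 0) = b·0 = b·0 + 0
  have h : b * 0 + b * 0 = b * 0 + 0 := by
    rw [← mul_add, add_zero, add_zero]
  -- cancel b·0 on the left, i.e. add −(b·0) to both sides
  exact add_left_cancel h
```

The point the formalization makes visible is that the proof starts from an unconditional identity ($b\cdot 0 = b\cdot(0+0)$), not from a hypothesis about $b$.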