This question had two preceding parts: first, to prove that the inverse of an element $a$ is unique, and second, to prove that if $a$ and $b$ have inverses, then so does $a*b$. I didn't have much of a problem with either of these, but the third part, whether the converse holds (if $a*b$ has an inverse, must $a$ and $b$ have inverses?), has me stumped.
At first, I believed it was true and tried to work from the definition of an inverse: $(a*b)*c=c*(a*b)=e$. Using the fact that $*$ is associative, we have $a*(b*c)=e$ and $(c*a)*b=e$, i.e. $a$ has a right inverse and $b$ has a left inverse, which is half of the way to showing $a$ and $b$ have inverses. However, I then became completely stuck trying to show that $b*c*a=e$ (brackets omitted due to associativity), which doesn't seem to be true without commutativity.
I then assumed the statement was false and went looking for counterexamples. I tried composition of functions and matrix multiplication, as both are associative but not commutative. In the case of functions, a composition $g\circ f$ must be bijective to have an inverse, and this, combined with other theorems, implies that $g$ and $f$ are bijective. Thus the statement holds here. A similar thing arises with matrix multiplication, where multiplying a non-invertible matrix $A$ by any other matrix $B$ gives a non-invertible matrix $AB$. As both counterexamples failed, I am well and truly stuck and have no idea how to continue with this problem.
Thanks for any help.
You're right, the statement is false, and looking for an example with functions is a good idea. The idea below is to find an injective function and a surjective function, neither of which is bijective, whose composition is the identity map.
Let $X:=(\mathbb{R}^\mathbb{N})^{\mathbb{R}^\mathbb{N}}$ be the set of maps from $\mathbb{R}^\mathbb{N}$ to itself, where $\mathbb{R}^\mathbb{N}$ is the set of sequences of real numbers, and let $*$ on $X$ be the composition $\circ$. $(X,*)$ is associative and has an identity element which is the identity map \begin{align} e:\mathbb{R}^\mathbb{N}&\to \mathbb{R}^\mathbb{N}\\ x&\mapsto x. \end{align}
Define $a$ and $b$ as follows: \begin{align} a:\mathbb{R}^\mathbb{N}&\to \mathbb{R}^\mathbb{N}\\ (x_1,x_2,\dots)&\mapsto (x_2,x_3,\dots), \end{align} i.e., $a$ is the map that removes the first element of the sequence, and \begin{align} b:\mathbb{R}^\mathbb{N}&\to\mathbb{R}^\mathbb{N}\\ (x_1,x_2,\dots)&\mapsto(0,x_1,x_2,\dots), \end{align} i.e., $b$ prepends $0$ to the beginning of the sequence.
We have $a*b=a\circ b=e$, thus $a*b$ has an inverse (namely itself). You can check that neither $a$ nor $b$ has an inverse: $a$ is not injective (it forgets $x_1$), and $b$ is not surjective (every sequence in its image starts with $0$).
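To make this concrete, here is a small sketch (an illustration, not part of the argument) that models a sequence in $\mathbb{R}^\mathbb{N}$ as a Python function from indices to values, with indices starting at $0$; the names `a`, `b`, and `x` mirror the maps above:

```python
def a(x):
    # drop the first term: a(x)_n = x_{n+1}
    return lambda n: x(n + 1)

def b(x):
    # prepend 0: b(x)_0 = 0 and b(x)_n = x_{n-1} for n >= 1
    return lambda n: 0 if n == 0 else x(n - 1)

x = lambda n: n + 1          # the sequence (1, 2, 3, ...)
ab = a(b(x))                 # a * b: acts as the identity
ba = b(a(x))                 # b * a: the first term is lost
print([ab(n) for n in range(5)])   # [1, 2, 3, 4, 5]
print([ba(n) for n in range(5)])   # [0, 2, 3, 4, 5]
```

Note that $a\circ b$ returns the original sequence, while $b\circ a$ replaces the first term by $0$, so it is not the identity.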
Addendum: as requested by the OP, here is a similar but simpler example.
Let $\mathbb{R}[X]$ be the set of polynomials with coefficients in $\mathbb R$, and let $X=\{\text{functions}\,\mathbb{R}[X]\to\mathbb{R}[X]\}$, so every element of $X$ is a function that maps polynomials to polynomials. Let $*$ be composition of functions in $X$, i.e., if $a,b\in X$, then $a*b=a\circ b$. Clearly $*$ is associative and \begin{align} e:\mathbb{R}[X]&\to\mathbb{R}[X]\\ P(X)&\mapsto P(X) \end{align} is the identity element of $(X,*)$.
We give two elements $a,b\in X$ such that $a*b=e$ (so $a*b$ has an inverse) yet $b*a\neq e$, which shows that neither $a$ nor $b$ has an inverse: if $a$ had an inverse, then $a*b=e$ would force $b=a^{-1}$, and hence $b*a=e$; the same argument applies to $b$.
Let \begin{align} a:\mathbb{R}[X]&\to\mathbb{R}[X]\\ \sum_{i=0}^n\alpha_i X^i&\mapsto\sum_{i=1}^n\alpha_{i}X^{i-1}. \end{align}
The map $a$ removes the constant term of a polynomial and decreases the degree of every other term by $1$. For a nonconstant polynomial $P(X)$ it is given by the formula $$a(P(X))=\dfrac{P(X)-P(0)}{X},$$ and $a(P(X))=0$ for constant polynomials $P(X)$.
Examples:
$a(5+2X+7X^2-3X^3)=2+7X-3X^2;$
$a(3-2X+9X^3)=-2+9X^2;$
$a(\sqrt{2}X-3X^3)=\sqrt{2}-3X^2.$
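These examples can be checked mechanically; here is a sketch (an assumed encoding, not part of the answer) representing $\sum_{i=0}^n\alpha_iX^i$ by its coefficient list $[\alpha_0,\alpha_1,\dots,\alpha_n]$:

```python
import math

def a(p):
    # drop the constant term alpha_0 and shift every remaining
    # coefficient down one degree
    return p[1:]

print(a([5, 2, 7, -3]))             # 5+2X+7X^2-3X^3 -> [2, 7, -3]
print(a([3, -2, 0, 9]))             # 3-2X+9X^3      -> [-2, 0, 9]
print(a([0, math.sqrt(2), 0, -3]))  # sqrt(2)X-3X^3  -> [sqrt(2), 0, -3]
```

The three outputs are the coefficient lists of the three polynomials computed above.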
Now let \begin{align} b:\mathbb{R}[X]&\to\mathbb{R}[X]\\ P(X)&\mapsto XP(X). \end{align}
$b$ simply multiplies the polynomial by $X$, which has the opposite effect on the degrees compared to $a$. For example, $b(\sqrt{2}-3X^2)=\sqrt{2}X-3X^3$.
Now I claim that $a*b=e$. Indeed, if $P(X)$ is a polynomial, then $b(P(X))=XP(X)$: $b$ has increased the degree of every term by $1$. What is $a(b(P(X)))$? The map $a$ removes the constant term and then decreases the degree of the remaining terms by $1$. Since $XP(X)$ has no constant term, all $a$ does is decrease the degree of every term by $1$, which precisely undoes what $b$ did; therefore $a(b(P(X)))=P(X)$.
We can also easily see this from the formula: for any nonzero $P(X)\in\mathbb{R}[X]$, $b(P(X))=XP(X)$ is a nonconstant polynomial, so \begin{align} a(b(P(X)))&=a(XP(X))\\ &=\dfrac{XP(X)-0\cdot P(0)}{X}\\ &=\dfrac{XP(X)}{X}\\ &=P(X). \end{align} Now I claim $b*a\neq e$. For this an example is enough: let $P(X)=1+2X+3X^2$. Then $a$ removes the constant term and decreases the degree of the other terms by $1$, so $a(P(X))=2+3X$. Although $b$ increases every degree by $1$, it cannot recover the constant term deleted by $a$, thus $$b(a(P(X)))=b(2+3X)=2X+3X^2\neq P(X).$$
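Both claims are easy to verify numerically in the same coefficient-list encoding as before (again an assumed representation, not part of the answer), where $b$ simply prepends a zero coefficient:

```python
def a(p):
    # remove the constant term and decrease every degree by 1
    return p[1:]

def b(p):
    # multiply by X: prepend a zero constant term
    return [0] + p

p = [1, 2, 3]          # P(X) = 1 + 2X + 3X^2
print(a(b(p)))         # [1, 2, 3]  -- a*b acts as the identity here
print(b(a(p)))         # [0, 2, 3]  -- 2X + 3X^2, not P(X)
```

The second output shows exactly the failure above: $b*a$ loses the constant term $1$.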