Let $A$ be an $m\times n$ matrix with coefficients in $\mathbb R$. Supposedly the fact $(A^+A)^\ast = A^+A$ is true; however, when I compute it:
\begin{align} (A^+ A)^\ast &= A^\ast (A^+)^\ast \\ &= A^\ast \bigl((A^\ast A)^{-1}A^\ast\bigr)^\ast \\ &= A^\ast A(A^\ast A)^{-1} \end{align}
Which is clearly not equal to $A^+ A$.
What's your definition of the pseudoinverse? Classically, in the more general case of an arbitrary complex matrix, $A^+$ is by definition the unique matrix satisfying all four of the Penrose conditions: $$AA^+A = A, \qquad A^+AA^+ = A^+, \qquad (AA^+)^\ast = AA^+, \qquad (A^+A)^\ast = A^+A.$$
So there is nothing to show. In the case where $A$ has linearly independent columns, we get the more compact formula that you used: $$A^+ = (A^*A)^{-1}A^*.$$
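As a quick sanity check, the four defining conditions can be verified numerically with NumPy's `np.linalg.pinv` (a sketch; the rank-deficient example matrix is made up):

```python
import numpy as np

# Made-up rank-1 matrix: neither A*A nor AA* is invertible here,
# but the Moore-Penrose pseudoinverse still exists.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])
Ap = np.linalg.pinv(A)

assert np.allclose(A @ Ap @ A, A)               # A A+ A = A
assert np.allclose(Ap @ A @ Ap, Ap)             # A+ A A+ = A+
assert np.allclose((A @ Ap).T.conj(), A @ Ap)   # (A A+)* = A A+
assert np.allclose((Ap @ A).T.conj(), Ap @ A)   # (A+ A)* = A+ A
```

In particular, the fourth condition is exactly the claim in question, and it holds even when neither compact formula applies.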
In this case the fact is trivial, since $$A^+A = (A^*A)^{-1}A^*A = I.$$ Note that your final expression $A^*A(A^*A)^{-1}$ also equals $I$, so your computation in fact confirms the claim rather than contradicting it.
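A numerical illustration of the full-column-rank case (the $3\times 2$ example matrix is made up; in the real case $^*$ is just the transpose):

```python
import numpy as np

# Made-up 3x2 matrix with linearly independent columns.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])

Ap = np.linalg.inv(A.T @ A) @ A.T   # A+ = (A*A)^{-1} A*

assert np.allclose(Ap, np.linalg.pinv(A))  # agrees with the general pseudoinverse
assert np.allclose(Ap @ A, np.eye(2))      # A+ A = I, hence trivially self-adjoint
```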
Linear independence of the columns is necessary and sufficient for the invertibility of $A^*A$. A similar formula can be derived in the case of linearly independent rows, as then $AA^*$ is non-singular and $$A^+ = A^*(AA^*)^{-1}.$$
In this case $$(A^+A)^* = \bigl(A^*(AA^*)^{-1}A\bigr)^* = A^*\bigl((AA^*)^{-1}\bigr)^*A = A^*(AA^*)^{-1}A = A^+A,$$ where we used that $AA^*$, and hence $(AA^*)^{-1}$, is self-adjoint.
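In this row-independent case $A^+A$ is self-adjoint but, unlike before, not the identity; it is the orthogonal projection onto the row space. A numerical check (the $2\times 3$ example matrix is made up):

```python
import numpy as np

# Made-up 2x3 matrix with linearly independent rows.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0]])

Ap = A.T @ np.linalg.inv(A @ A.T)    # A+ = A* (AA*)^{-1}

P = Ap @ A                           # A+ A: a 3x3 projection matrix here
assert np.allclose(Ap, np.linalg.pinv(A))
assert np.allclose(P.T, P)           # (A+ A)* = A+ A, as derived
assert not np.allclose(P, np.eye(3)) # but A+ A != I when there are more columns than rows
```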