I'm following the proof in Gordon and McNulty's *Matroids: A Geometric Introduction*.
The theorem is: if $\mathcal{M}$ is the set of isomorphism classes of matroids and $f:\mathcal{M}\rightarrow\mathbb{Z}[x, y]$ satisfies

1) $M_1\cong M_2\Rightarrow f(M_1)=f(M_2)$,

2) $f(M)=af(M-e)+bf(M/e)$ whenever $e$ is neither a loop nor an isthmus, for fixed $a, b\in\mathbb{Z}$,

3) $f(M)=f(I)f(M/e)$ whenever $e$ is an isthmus, where $I$ is the matroid consisting of a single isthmus,

4) $f(M)=f(L)f(M-e)$ whenever $e$ is a loop, where $L$ is the matroid consisting of a single loop,

then $f(M)=a^{|E|-r(E)}b^{r(E)}\,t\left(M, \frac{f(I)}{b}, \frac{f(L)}{a}\right)$.
In the text they take $a=b=1$, so let's just stick with that for simplicity.
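Since the whole argument runs on the deletion-contraction recursion with $a=b=1$, here is a minimal computational sketch of it (the name `tutte` and the rank-oracle representation are my own choices, not from the text): a matroid is given by its ground set and rank function, and the three cases (loop, isthmus, neither) mirror conditions 2)–4).

```python
from itertools import combinations

def subsets(E):
    """All subsets of the iterable E, as frozensets."""
    E = list(E)
    return [frozenset(S) for k in range(len(E) + 1) for S in combinations(E, k)]

def tutte(E, r):
    """Tutte polynomial t(M; x, y) by deletion-contraction (the a = b = 1 case),
    returned as {(i, j): coeff} meaning coeff * x^i * y^j.
    The matroid M is given by its ground set E and rank function r."""
    E = frozenset(E)
    if not E:
        return {(0, 0): 1}  # t of the empty matroid is 1
    e = next(iter(E))
    rest = E - {e}
    contract = lambda S: r(S | {e}) - r(frozenset({e}))  # rank function of M/e
    if r(frozenset({e})) == 0:
        # e is a loop: t(M) = y * t(M - e)
        return {(i, j + 1): c for (i, j), c in tutte(rest, r).items()}
    if all(r(S | {e}) == r(S) + 1 for S in subsets(rest)):
        # e is an isthmus: t(M) = x * t(M / e)
        return {(i + 1, j): c for (i, j), c in tutte(rest, contract).items()}
    # otherwise: t(M) = t(M - e) + t(M / e)
    out = dict(tutte(rest, r))
    for key, c in tutte(rest, contract).items():
        out[key] = out.get(key, 0) + c
    return out

# Single isthmus: t = x; single loop: t = y; uniform matroid U_{1,2}: t = x + y.
assert tutte([1], lambda S: len(S)) == {(1, 0): 1}
assert tutte([1], lambda S: 0) == {(0, 1): 1}
assert tutte([1, 2], lambda S: min(len(S), 1)) == {(1, 0): 1, (0, 1): 1}
```

Note that deletion just restricts $r$, while contraction replaces it with $S\mapsto r(S\cup\{e\})-r(\{e\})$, which is why the same oracle can be threaded through the recursion.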
My proof attempt begins as follows.
Let $M$ be a matroid with ground set $E$.
If $|E|=1$, write $E=\{e\}$; then $e$ is either a loop or an isthmus.
To see this, let $F\subseteq E$ with $e\notin F$.
Since $|E|=1$ and $e\in E$, we must have $F=\emptyset$, and thus $F=\emptyset$ is the only subset of $E$ not containing $e$.
Thus if $e$ is not a loop, then $r(\{e\})\neq 0$, i.e. $r(F\cup\{e\})\neq r(F)$.
Now adding a single element to a set can increase the rank by at most one, so $r(F\cup\{e\})=r(F)+1$. Since $F=\emptyset$ is the only set not containing $e$, this holds for every set not containing $e$, and so $e$ is by definition an isthmus.
Since $|E|=1$, the ground sets of $M-e$ and $M/e$ are both empty.
If $e$ is a loop, then $f(M)=f(L)f(M-e)$ by assumption 4).
Now $t(M, x, y)=y\,t(M-e, x, y)=y$, since the Tutte polynomial of the empty matroid is $1$, and thus $t(M, f(I), f(L))=f(L)$.
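The value $t(M, x, y)=y$ for the single loop can be double-checked against the corank-nullity (Whitney rank) expansion $t(M; x, y)=\sum_{S\subseteq E}(x-1)^{r(E)-r(S)}(y-1)^{|S|-r(S)}$, which avoids the recursion entirely. A minimal sketch (the helper name is mine), with the matroid again given by its rank function:

```python
from itertools import combinations

def tutte_corank_nullity(E, r, x, y):
    """Evaluate t(M; x, y) via the corank-nullity expansion:
    t(M; x, y) = sum over S subset of E of
                 (x-1)^(r(E)-r(S)) * (y-1)^(|S|-r(S))."""
    E = frozenset(E)
    rE = r(E)
    total = 0
    for k in range(len(E) + 1):
        for S in combinations(E, k):
            S = frozenset(S)
            total += (x - 1) ** (rE - r(S)) * (y - 1) ** (len(S) - r(S))
    return total

# Single loop (rank identically 0): t(L; x, y) = y, independent of x.
for x, y in [(2, 3), (0, 5), (7, -1)]:
    assert tutte_corank_nullity([1], lambda S: 0, x, y) == y
```

For the loop, the two summands are $1$ (from $S=\emptyset$) and $y-1$ (from $S=\{e\}$), giving $y$ as claimed.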
Here lies my problem: we have $t(M, f(I), f(L))=f(L)$, while $f(M)=f(L)f(M-e)$, so the equality $f(M)=t(M, f(I), f(L))$ only holds if $f(M-e)=1$. But this is not part of the hypothesis, and it's not at all clear why this would have to be the case for matroid invariants.
EDIT/UPDATE: $M-e$ has ground set $\emptyset$, and it seems reasonable that any matroid invariant should satisfy $f(N)=1$ when $N$ is the matroid with empty ground set, but it's not clear to me that this has to be the case.
Figured out my error: if $M$ is itself a single isthmus, then $f(I)=f(M)$, and by assumption 3), $f(M)=f(I)f(M/e)$. Thus $f(I)=f(I)f(M/e)$, so either $f(I)=0$ or $f(M/e)=1$; since $M/e$ is the matroid with empty ground set, this gives $f(\emptyset)=1$ and the result follows.
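A concrete invariant illustrating why $f(\emptyset)=1$ falls out of the axioms: take $f(M)$ to be the number of independent sets, which satisfies $f(M)=f(M-e)+f(M/e)$ whenever $e$ is not a loop (independent sets avoiding $e$ live in $M-e$, those containing $e$ correspond to independent sets of $M/e$). A minimal sketch, with the helper name my own:

```python
from itertools import combinations

def independent_set_count(E, r):
    """f(M) = number of independent sets: subsets S with r(S) = |S|."""
    E = list(E)
    return sum(
        1
        for k in range(len(E) + 1)
        for S in combinations(E, k)
        if r(frozenset(S)) == len(S)
    )

# Empty matroid: only the empty set is independent, so f = 1.
assert independent_set_count([], lambda S: 0) == 1
# Single isthmus I: independent sets are {} and {e}, so f(I) = 2.
assert independent_set_count([1], lambda S: len(S)) == 2
# Single loop L: only {} is independent, so f(L) = 1.
assert independent_set_count([1], lambda S: 0) == 1
# U_{1,2}: e is neither a loop nor an isthmus; f(M) = f(M-e) + f(M/e)
# reads 3 = 2 + 1 here, with f(M-e) = f(I) and f(M/e) = f(L).
assert independent_set_count([1, 2], lambda S: min(len(S), 1)) == 3
```

For this $f$, the empty matroid really does evaluate to $1$, consistent with $f(I)=f(I)f(M/e)$ forcing $f(M/e)=1$ when $f(I)\neq 0$.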