Trace monoids are exactly the free partially commutative monoids. In other words, a trace monoid is the set of words over some alphabet, where two words are considered equivalent if one can be obtained from the other by repeatedly swapping certain adjacent pairs of letters.
Formally, take a set of symbols $\Sigma$ and a relation (called the independence relation) $I \subseteq \Sigma \times \Sigma$ which is symmetric. Define a relation $\sim$ on the set of words $\Sigma^*$, so that $x \sim y$ if there exists $(a, b) \in I$ and $u_1, u_2 \in \Sigma^*$ such that $x = u_1 a b u_2$ and $y = u_1 b a u_2$. Let $\equiv$ be the transitive reflexive closure of $\sim$. We define the traces on $\Sigma$ induced by $I$ to be the quotient of $\Sigma^*$ under $\equiv$.
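To make the definition concrete, here is a small executable sketch (the alphabet $\{a, b, c\}$ with only $a$ and $b$ independent is my own illustrative choice, as are the function names): it enumerates the $\equiv$-class of a word by closing under adjacent swaps of independent letters.

```python
# Illustrative example (not from the post): Sigma = {a, b, c},
# with independence relation I = {(a, b), (b, a)}, i.e. only a and b commute.
I = {("a", "b"), ("b", "a")}

def trace_class(word):
    """Enumerate the equivalence class of `word` under the closure of ~:
    repeatedly swap adjacent pairs of independent letters."""
    seen, frontier = {word}, [word]
    while frontier:
        w = frontier.pop()
        for i in range(len(w) - 1):
            if (w[i], w[i + 1]) in I:
                swapped = w[:i] + w[i + 1] + w[i] + w[i + 2:]
                if swapped not in seen:
                    seen.add(swapped)
                    frontier.append(swapped)
    return seen

def equiv(x, y):
    """x ≡ y iff y is reachable from x by such swaps."""
    return y in trace_class(x)
```

For example, `trace_class("abc")` is `{"abc", "bac"}`, while `"ac"` and `"ca"` are not equivalent, since $c$ commutes with nothing here.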
One can show that concatenation on $\Sigma^*$ is compatible with $\equiv$ (that is, $\equiv$ is a congruence), and thus $\Sigma^*/{\equiv}$ has a natural monoid structure.
I want to show that this monoid structure is cancellative. So, I want to show that $xy \equiv xz \implies y \equiv z$.
According to the discussion near (1.7) here, this is clear for $\sim$ and hence clear for $\equiv$. I do not understand why the latter part of their claim is straightforward.
Any help with this proof is appreciated.
After some thought, I found an elementary proof of this myself.
Let us focus on left cancellativity; right cancellativity follows by a symmetric argument (reverse all words).
It is enough to prove the following:
$$\forall a \in \Sigma, x, y \in \Sigma^*, a \cdot x \equiv a \cdot y \implies x \equiv y \qquad (1)$$
Left cancellativity can then be proved from this lemma by a straightforward induction on the word $x$ on the left.
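For small alphabets, claim (1) can also be sanity-checked by brute force. A throwaway, self-contained sketch (the alphabet, independence relation, and helper names are my own illustrative choices):

```python
from itertools import product

# Illustrative example: only a and b commute.
I = {("a", "b"), ("b", "a")}

def trace_class(word):
    """All words reachable from `word` by adjacent independent swaps."""
    seen, frontier = {word}, [word]
    while frontier:
        w = frontier.pop()
        for i in range(len(w) - 1):
            if (w[i], w[i + 1]) in I:
                s = w[:i] + w[i + 1] + w[i] + w[i + 2:]
                if s not in seen:
                    seen.add(s)
                    frontier.append(s)
    return seen

def equiv(x, y):
    return y in trace_class(x)

# Check (1): a.x == a.y (up to equivalence) implies x == y, for short words.
# The length filter is sound because swapping letters preserves word length.
words = ["".join(p) for n in range(4) for p in product("abc", repeat=n)]
for a in "abc":
    for x in words:
        for y in words:
            if len(x) == len(y) and equiv(a + x, a + y):
                assert equiv(x, y)
```

Of course this is only evidence, not a proof; the induction below is what actually establishes the claim.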
To prove (1), we will show the following decomposition lemma:

$$\forall a \in \Sigma,\ x', y \in \Sigma^*,\quad a \cdot x' \equiv y \implies \exists x_0, x_1 \in \Sigma^* \text{ such that:} \qquad (2)$$

(a) $y = x_0 \cdot a \cdot x_1$;
(b) $a$ does not occur in $x_0$;
(c) $x' \equiv x_0 \cdot x_1$;
(d) $(a, b) \in I$ for every letter $b$ occurring in $x_0$.
(Note that (d) readily means that $a$ commutes with the word $x_0$ itself.)
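To see lemma (2) in action: in any word equivalent to $a \cdot x'$, splitting at the first occurrence of $a$ produces the promised decomposition. Here is a brute-force check under the same illustrative assumptions as before (alphabet, relation, and names are mine):

```python
# Illustrative example: only a and b commute.
I = {("a", "b"), ("b", "a")}

def trace_class(word):
    """All words reachable from `word` by adjacent independent swaps."""
    seen, frontier = {word}, [word]
    while frontier:
        w = frontier.pop()
        for i in range(len(w) - 1):
            if (w[i], w[i + 1]) in I:
                s = w[:i] + w[i + 1] + w[i] + w[i + 2:]
                if s not in seen:
                    seen.add(s)
                    frontier.append(s)
    return seen

def decompose(a, y):
    """Split y = x0 . a . x1 at the first occurrence of a, as in (2)."""
    i = y.index(a)
    return y[:i], y[i + 1:]

# Verify conditions (a)-(d) of (2) for every y equivalent to a.x'.
a, xp = "a", "bcb"
for y in trace_class(a + xp):
    x0, x1 = decompose(a, y)
    assert y == x0 + a + x1                # (a)
    assert a not in x0                     # (b)
    assert (x0 + x1) in trace_class(xp)    # (c)
    assert all((a, b) in I for b in x0)    # (d)
```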
Now, let us see why (2) implies (1). Suppose $a \cdot x \equiv a \cdot y$ as in the hypothesis of (1). Since $\equiv$ is symmetric, we also have $a \cdot y \equiv a \cdot x$, so applying (2) (with $x' := y$) gives a way to decompose $a \cdot x$ into $x_0 \cdot a \cdot x_1$ following the conditions. Since $x_0$ cannot have $a$ in it but $a \cdot x$ starts with $a$, it must be empty, which means $x_1 = x$. By condition (c), we have $y \equiv x_0 \cdot x_1 = x_1 = x$, like we wanted.
Lemma (2) looks straightforward by visual inspection. Indeed, it can be proven directly by induction on the derivation of $a x' \equiv y$.
I have, however, found that choosing the correct induction principle here can be tricky. I recommend the following one: to prove $\forall x, y \in \Sigma^*,\ x \equiv y \implies P(x, y)$, it suffices to show

(*) $\forall x \in \Sigma^*,\ P(x, x)$, and
(**) $\forall x, y, z \in \Sigma^*,\ P(x, y) \wedge y \sim z \implies P(x, z)$.

This is the induction principle for the reflexive-transitive closure that extends chains one $\sim$-step at a time on the right.
Showing (**) in our case requires us to analyze the situation where we have $x_0 \cdot a \cdot x_1 \sim z$. This involves a bit of case work, inspecting where the transposed pair is located: entirely inside $x_0$, entirely inside $x_1$, or overlapping the distinguished occurrence of $a$. This is slightly tedious, but can be done.
I have checked this proof in Coq, so I believe it is correct. You may need to assume that the independence relation is irreflexive, or that equality on $\Sigma$ is decidable (i.e., for all $x, y \in \Sigma$, either $x = y$ or $x \neq y$).