We know that
$$\text{Cov}(x,z) = E(xz) - E(x)E(z)$$
Can it be the case that $\text{Cov}(x,z)>0$ but $E(xz)<0$?
Or vice versa: $\text{Cov}(x,z)<0$ but $E(xz)>0$?
In other words, does the first term of this decomposition, $E(xz)$, determine the sign of the covariance with certainty?
No. Since $E(x)E(z)$ can take either sign, both cases can occur.
For example, with $x$ and $z$ taking values $\pm 1$, consider the joint probabilities given by the following tables:
$$ \left[ \begin {array}{c|cc} x \backslash z &-1& 1 \\\hline -1& 0.6 & 0.2 \\ 1 & 0.2 & 0 \end {array} \right] $$
where $E(x) = E(z) = -0.6$, $E(xz) = 0.2$, $\text{Cov}(x,z) = -0.16$, and
$$ \left[ \begin {array}{c|cc} x \backslash z &-1& 1 \\\hline -1& 0.2 & 0.6 \\ 1 & 0 & 0.2 \end {array} \right] $$
where $E(x) = -0.6$, $E(z) = 0.6$, $E(xz) = -0.2$, $\text{Cov}(x,z) = +0.16$.
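The moments for both tables are easy to verify numerically. A minimal sketch in NumPy (the helper `moments` is just for illustration):

```python
import numpy as np

# Both variables take values in {-1, +1}.
vals = np.array([-1.0, 1.0])

def moments(p):
    """Return E(x), E(z), E(xz), Cov(x,z) for a 2x2 joint table p,
    where rows index x in (-1, 1) and columns index z in (-1, 1)."""
    Ex = (p.sum(axis=1) * vals).sum()       # marginal expectation of x
    Ez = (p.sum(axis=0) * vals).sum()       # marginal expectation of z
    Exz = (p * np.outer(vals, vals)).sum()  # E(xz) over the joint table
    return Ex, Ez, Exz, Exz - Ex * Ez

# First table: E(xz) > 0 but Cov(x,z) < 0.
p1 = np.array([[0.6, 0.2],
               [0.2, 0.0]])

# Second table: E(xz) < 0 but Cov(x,z) > 0.
p2 = np.array([[0.2, 0.6],
               [0.0, 0.2]])

print(moments(p1))  # E(x)=E(z)=-0.6, E(xz)=0.2,  Cov=-0.16
print(moments(p2))  # E(x)=-0.6, E(z)=0.6, E(xz)=-0.2, Cov=+0.16
```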