Let $(\Xi,\mathcal{E})$ be a measurable space and $\xi$ and $\xi'$ random variables with distributions $\mu$ and $\vartheta$ respectively in this space.
We say that a measure $\Pi$ on $\Xi^{2}$ is a coupling of the measures $\mu$ and $\vartheta$ if $\Pi$ is a joint distribution of $\xi$ and $\xi'$ with marginals $\mu$ and $\vartheta$ respectively.
What is the difference between a coupling and the product measure?
What does the law of total probability look like for couplings?
How can I justify equation (1)? [This is the most important for me]
Remark: This question arises because I am reading an article (in this link, page 12) that uses the law of total probability, but for a coupling. In short, what I do not understand is how the following can be inferred from the law of total probability:
If we suppose $\xi_{i}\in \Xi$ for $i=1,2,\ldots, N$ and ${\displaystyle \vartheta=\frac{1}{N}\sum_{i=1}^{N}\delta_{\xi_{i}} }$, then by the law of total probability it follows that $$\Pi=\frac{1}{N}\sum_{i=1}^{N}\delta_{\xi_{i}}\otimes \mathbb{Q}_{i} \tag{1}$$ where $\mathbb{Q}_{i}$ is the conditional distribution of $\xi$ given $\xi'=\xi_{i}$.
The product measure is one particular coupling, namely the one where the two random variables (associated with each marginal) are independent. In general there are many other distributions that have the same marginals.
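To make this concrete, here is a minimal numerical sketch (the two specific couplings below are illustrative choices, not from the question): two different joint distributions on $\{0,1\}^2$ with identical uniform marginals, only one of which is the product measure.

```python
import numpy as np

# Two couplings of the same pair of marginals: both marginals uniform on {0, 1}.
product = np.array([[0.25, 0.25],
                    [0.25, 0.25]])   # independent coupling = product measure
comonotone = np.array([[0.5, 0.0],
                       [0.0, 0.5]])  # X = Y almost surely: a different coupling

for Pi in (product, comonotone):
    # Row sums give the first marginal, column sums the second.
    assert np.allclose(Pi.sum(axis=1), [0.5, 0.5])
    assert np.allclose(Pi.sum(axis=0), [0.5, 0.5])
```

Both tables pass the marginal check, yet they are different measures; this is exactly the sense in which the product measure is just one coupling among many.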
The law of total probability for discrete random variables looks like $$P(X=i, Y=j) = P(X=i) \cdot P(Y=j \mid X=i).$$ The one you have written is just a more general version. Note that the conditional distribution $P(Y \mid X=i)$ (or, $\mathbb{Q}_i$ in your example) will depend on the joint distribution (the coupling) of $X$ and $Y$. In the special case of the product measure, we just have $P(Y=j \mid X=i) = P(Y = j)$, and thus you get the "product" in "product measure": $P(X=i, Y=j) = P(X=i) \cdot P(Y=j)$.
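A quick numerical check of this factorization (the joint table here is an arbitrary randomly generated example, purely for illustration): starting from any finite joint distribution, dividing by the first marginal recovers the conditional rows $\mathbb{Q}_i$, and multiplying back reproduces the joint.

```python
import numpy as np

rng = np.random.default_rng(0)
joint = rng.random((3, 4))
joint /= joint.sum()                 # an arbitrary joint distribution of (X, Y)

p_x = joint.sum(axis=1)              # marginal P(X = i)
cond = joint / p_x[:, None]          # conditional P(Y = j | X = i); row i is Q_i

# Law of total probability: the joint factors as marginal times conditional.
assert np.allclose(joint, p_x[:, None] * cond)
```

Note that `cond` was computed *from* the joint table: a different coupling with the same marginals would give different conditional rows, which is the dependence of $\mathbb{Q}_i$ on the coupling mentioned above.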
Edit: let $X$ be a random variable that takes values $\xi_1,\ldots,\xi_N$ each with probability $1/N$. Then $\vartheta(A) = P(X \in A)$ for any measurable set $A$. Let $Y$ be a random variable (possibly dependent on $X$) such that its marginal distribution is given by $\mu$, that is, $P(Y \in B) = \mu(B)$ for any measurable set $B$.
Let $\mathbb{Q}_i$ be the conditional distribution defined by $\mathbb{Q}_i(B) = P(Y \in B \mid X=\xi_i)$.
The law of total probability states (assuming the points $\xi_1,\ldots,\xi_N$ are distinct, so that $P(X=\xi_j)=1/N$) $$P(X=\xi_j, Y \in B) = P(X = \xi_j) \cdot P(Y \in B \mid X=\xi_j) = \frac{1}{N} \delta_{\xi_j}(\{\xi_j\}) \, \mathbb{Q}_j(B) = \frac{1}{N} \sum_{i=1}^N \delta_{\xi_i}(\{\xi_j\}) \, \mathbb{Q}_i(B),$$ where the last equality holds because $\delta_{\xi_i}(\{\xi_j\}) = 0$ for $i \neq j$. The right-hand side is exactly $\left(\frac{1}{N}\sum_{i=1}^N \delta_{\xi_i} \otimes \mathbb{Q}_i\right)(\{\xi_j\} \times B)$, and since sets of the form $\{\xi_j\} \times B$ determine the measure, this is equation (1).
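As a sanity check on equation (1), here is a small numerical sketch (the number of atoms $N$, the finite support of $Y$, and the randomly generated conditionals $\mathbb{Q}_i$ are all illustrative assumptions): representing $\Pi$ as an $N \times M$ table, the first marginal comes out as $\vartheta$ (uniform on the atoms) and the second as $\mu = \frac{1}{N}\sum_i \mathbb{Q}_i$.

```python
import numpy as np

rng = np.random.default_rng(1)
N, M = 4, 5                                # N atoms xi_i; Y takes M values
Q = rng.random((N, M))
Q /= Q.sum(axis=1, keepdims=True)          # row i plays the role of Q_i

# Equation (1): Pi = (1/N) * sum_i delta_{xi_i} (x) Q_i.
# In table form, row j of Pi is (1/N) * Q_j, since delta_{xi_i}({xi_j}) = [i == j].
Pi = Q / N

assert np.allclose(Pi.sum(axis=1), np.full(N, 1 / N))  # first marginal = theta
assert np.allclose(Pi.sum(axis=0), Q.mean(axis=0))     # second marginal = average of the Q_i
```

So the mixture in (1) is indeed a coupling: its first marginal is the empirical measure $\vartheta$ and its second marginal is the average of the conditionals, which by the law of total probability equals $\mu$.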