Suppose we define two random variables by specifying their joint law ($\Pi(\mu, \nu)$ denotes the set of measures on $\mathbb R^2$ with marginals $\mu$ and $\nu$), with the following additional condition on their conditional expectation:
$$ \pi = \text{Law}(X, Y) \in \Pi(\mu, \nu) : \quad \mathbb E(Y \mid X = x) = x \ \text{ for } \mu\text{-almost every } x \in \mathbb R. $$
How does one interpret $\mathbb E(Y \mid X = x) = x$ here? I struggle to understand what it means physically and geometrically, and whether it is related to some kind of projection. Any help is welcome.
My idea so far is that the mean of the slice $\{X = x\}$ equals $x$, but it seems an odd condition to impose. The setting is optimal transport and the theory of curtain couplings.
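To make the "mean of the slice" reading concrete, here is a minimal discrete sketch (the measures $\mu = \delta_0$ and $\nu = \tfrac12(\delta_{-1} + \delta_1)$ are my own illustrative choice, not from the question): the coupling sends the mass at $x = 0$ half to $-1$ and half to $+1$, so the conditional mean of $Y$ given $X = 0$ is $0$, i.e. equal to $x$.

```python
import numpy as np

# Illustrative example: mu = delta_0, nu = (delta_{-1} + delta_1)/2.
xs = np.array([0.0])           # support of mu (the X marginal)
ys = np.array([-1.0, 1.0])     # support of nu (the Y marginal)
pi = np.array([[0.5, 0.5]])    # pi[i, j] = P(X = xs[i], Y = ys[j])

# Marginals of the coupling
mu = pi.sum(axis=1)            # mass of each atom of mu
nu = pi.sum(axis=0)            # mass of each atom of nu

# Martingale condition: E[Y | X = x] = x for every atom x of mu
cond_mean = (pi @ ys) / mu     # conditional mean of Y given X = xs[i]
print(np.allclose(cond_mean, xs))  # True: the slice at x = 0 has mean 0
```

Geometrically, each vertical slice of the coupling is balanced around the diagonal $y = x$, which is exactly the "odd" condition in the display above.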
The convincing answer I found after studying the topic actively is that $X$ is the best prediction we have of $Y$: knowing $X$ does not shift the expected value of $Y$. This is exactly what appears in martingales, where $$\mathbb E [ X_{n+1} \mid X_n, X_{n-1}, \cdots ] = X_n.$$
In this case, we are looking at a two-step sequence of random variables $\{X, Y\}$. Another important point is that the condition says nothing further about the relation between the marginals. If we would like to order them (in convex order, for example), we need additional constraints. I am thinking for example of Strassen's theorem:
Define, for $\mu \in \mathcal M(\mathbb R)$, the potential (call) function $C_{\mu}(x) = \int (y - x)^+ \, d\mu(y)$. Then \begin{align} & \mu \preceq_C \nu \\ &\iff \notag \\ &\mathbb E (\mu) = \mathbb E ( \nu ) \text{ and } \forall x \in \mathbb R, \ C_{\mu} (x) \leq C_{\nu} (x), \end{align} and Strassen's theorem states that $\mu \preceq_C \nu$ if and only if there exists a martingale coupling $\pi \in \Pi(\mu, \nu)$ as above.
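The call-function criterion is easy to check numerically on discrete measures. A small sketch, reusing the illustrative pair $\mu = \delta_0$ and $\nu = \tfrac12(\delta_{-1} + \delta_1)$ (my own example, not from the answer): both have mean $0$, and $C_\mu \leq C_\nu$ pointwise, so $\mu \preceq_C \nu$.

```python
import numpy as np

def call_fn(atoms, weights, x):
    # Potential function C_mu(x) = sum_i w_i * (y_i - x)^+
    return np.sum(weights[:, None] * np.maximum(atoms[:, None] - x, 0.0), axis=0)

# mu = delta_0, nu = (delta_{-1} + delta_1)/2: nu is a mean-preserving spread of mu
mu_atoms, mu_w = np.array([0.0]), np.array([1.0])
nu_atoms, nu_w = np.array([-1.0, 1.0]), np.array([0.5, 0.5])

grid = np.linspace(-3.0, 3.0, 601)
same_mean = np.isclose(mu_atoms @ mu_w, nu_atoms @ nu_w)
dominated = np.all(call_fn(mu_atoms, mu_w, grid) <= call_fn(nu_atoms, nu_w, grid) + 1e-12)
print(bool(same_mean and dominated))  # True: mu <=_C nu
```

This is consistent with the first example: the martingale coupling spreading the mass at $0$ onto $\{-1, 1\}$ is exactly the coupling whose existence Strassen's theorem guarantees from $\mu \preceq_C \nu$.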