Clearly, if $g$ is a deterministic function, then for two random variables $X, Y$ we have $X \rightarrow Y \rightarrow g(Y)$, i.e. they form a Markov chain.
If $g$ is a one-to-one mapping we can further state $X \rightarrow g(Y) \rightarrow Y$, since $g(Y)$ is then a sufficient statistic.
I am wondering about the converse of the above statement. If $X \rightarrow g(Y) \rightarrow Y$ holds, can we conclude that $g$ is injective?
I tried proving it but am a bit uncertain about the steps taken. Could somebody have a look at my argument and point out any flaws?
Here it is:
Let $X \rightarrow g(Y) \rightarrow Y$. This implies that $X$ and $Y$ are conditionally independent given $g(Y)$. We can thus write: $p(X=x|g(Y)=g(y), Y=y) = p(X=x|g(Y) = g(y))$. However: $p(X=x|g(Y)=g(y), Y=y) = p(X=x|Y = y)$.
Therefore: $p(X=x|g(Y) = g(y)) = p(X=x|Y = y)$. Assume $g$ is not injective. Then there exists at least one pair $y_1 \neq y_2$ such that $g(y_1) = g(y_2)$. Then:
$p(X=x|g(Y) = g(y_1)) = \sum_{y:g(y)=g(y_1)} p(X=x|Y = y) = p(X=x|Y = y_1) + p(X=x|Y = y_2) \neq p(X=x|Y = y_1)$
By contradiction, $g$ is injective.
Just consider $X=f(g(Y))$ for a deterministic function $f$; then you have your Markov chain without any need for $g$ to be injective. The flaw in your proof is in the expansion of the conditional probability. If only $y_1$ and $y_2$ satisfy $g(y_1)=g(y_2)=z$, then with $A_z=\{y_1,y_2\}$ $$ \mathbb P(X=x|g(Y) = z) =\frac{\mathbb P(X=x,Y\in A_z)}{\mathbb P(Y\in A_z)}=\sum_{y\in A_z} \frac{\mathbb P(X=x|Y=y)\,\mathbb P(Y=y)}{\mathbb P(Y\in A_z)}, $$ so the conditional probabilities $\mathbb P(X=x|Y=y)$ are averaged with weights, not simply summed. With $X=f(g(Y))$ as above, all those probabilities are simultaneously one or zero, and hence we have $$ \mathbb P(X=x|g(Y) = z)=\mathbb P(X=x|Y = y) \text{ for all } y\in A_z. $$
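The counterexample can be checked numerically. Here is a minimal sketch (the specific choices of $Y$, $g$, and $f$ are my own, not from the post): $Y$ uniform on $\{1,2,3\}$, a non-injective $g$ with $g(1)=g(2)$, and $X=f(g(Y))$ with $f$ the identity. The script verifies that $\mathbb P(X=x\mid g(Y)=z, Y=y) = \mathbb P(X=x\mid g(Y)=z)$ everywhere, i.e. $X \rightarrow g(Y) \rightarrow Y$ holds even though $g$ is not injective.

```python
from fractions import Fraction

# Assumed toy setup: Y uniform on {1, 2, 3}, g non-injective with
# g(1) = g(2) = 0, g(3) = 1, and X = f(g(Y)) with f the identity.
g = {1: 0, 2: 0, 3: 1}
f = lambda z: z
p_y = {y: Fraction(1, 3) for y in g}  # distribution of Y

# Joint distribution of (X, Z, Y) where Z = g(Y) and X = f(Z).
joint = {}
for y, py in p_y.items():
    z = g[y]
    x = f(z)
    joint[(x, z, y)] = joint.get((x, z, y), 0) + py

def cond(x, z, y=None):
    """P(X=x | Z=z) if y is None, else P(X=x | Z=z, Y=y)."""
    if y is None:
        num = sum(p for (xx, zz, yy), p in joint.items() if xx == x and zz == z)
        den = sum(p for (xx, zz, yy), p in joint.items() if zz == z)
    else:
        num = sum(p for (xx, zz, yy), p in joint.items()
                  if xx == x and zz == z and yy == y)
        den = sum(p for (xx, zz, yy), p in joint.items() if zz == z and yy == y)
    return num / den if den else None

# X and Y are conditionally independent given g(Y): at every point of
# positive probability, conditioning additionally on Y changes nothing.
for (x, z, y), p in joint.items():
    assert cond(x, z, y) == cond(x, z)
```

As the answer notes, every $\mathbb P(X=x\mid Y=y)$ with $y \in A_z$ is simultaneously one or zero here, which is exactly why the averaging over $A_z$ is harmless.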