I'm trying to get a better understanding of conditional independence. Specifically, I am interested in what is sufficient to condition on. I have read this thread and this thread but they do not explain the particular aspect that I am trying to understand better.
Let $X$ be a random variable, $X\in \mathbb{R}$, and
$Y=h(X)$,
where $h$ is some measurable function. It trivially follows that
$Y \perp\!\!\!\perp X |X $
and
$Y \perp\!\!\!\perp X |h(X) $.
However, there are clearly other functions of $X$ for which this also holds, i.e. functions $g$ such that $Y \perp\!\!\!\perp X |g(X)$.
My question is: What are the (measure theoretic?) conditions on $g(X)$ for this to hold?
My attempt: Intuitively, as long as $g(X)$ contains at least as much "information" about $X$ as $h(X)$ does, it should hold. For example, if $h(X)=|X|$ and $g(X)=X^2$, I would suspect that
$Y \perp\!\!\!\perp X |g(X)$
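My own (not yet rigorous) reasoning in this particular example is that $|X|$ and $X^2$ are measurable functions of each other, since
$|X| = \sqrt{X^2} \quad\text{and}\quad X^2 = |X|^2,$
so I would expect the generated $\sigma$-algebras to coincide, $\sigma(|X|) = \sigma(X^2)$, and hence conditioning on either to give the same result.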
However, with my limited measure-theoretic knowledge it is difficult to confirm or reject this formally. I am tempted, based on the results in van Putten and van Schuppen (1985), to write something like
$Y \perp\!\!\!\perp X |g(X)$
iff
$\sigma(g(X)) \supseteq \sigma(h(X))$
But then I do not really know what $\sigma(g(X))$ and $\sigma(h(X))$ are in the example above, although I think I have watched every YouTube clip on the definition of a $\sigma$-algebra by now.
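For what it is worth, here is a small sanity check I put together (a toy discrete example of my own, not taken from the paper): with $X$ uniform on $\{-2,-1,1,2\}$, $Y=|X|$ and $g(X)=X^2$, the factorization $P(X,Y\mid g(X)) = P(X\mid g(X))\,P(Y\mid g(X))$ seems to hold at every point.

```python
from fractions import Fraction

# Toy discrete example of my own (not from van Putten & van Schuppen):
# X uniform on {-2, -1, 1, 2}, Y = h(X) = |X|, and Z = g(X) = X^2.
xs = [-2, -1, 1, 2]
p = Fraction(1, 4)  # P(X = x) for each x

h = abs                  # h(X) = |X|
g = lambda x: x * x      # g(X) = X^2

# Joint distribution of (X, Y, Z); Y and Z are deterministic given X.
joint = {(x, h(x), g(x)): p for x in xs}

def prob(pred):
    """Probability of the event {(x, y, z) : pred(x, y, z)}."""
    return sum(q for (x, y, z), q in joint.items() if pred(x, y, z))

# Check P(X=x, Y=y | Z=z) == P(X=x | Z=z) * P(Y=y | Z=z) everywhere.
ok = True
for z in sorted({g(x) for x in xs}):
    pz = prob(lambda a, b, c: c == z)
    for x in xs:
        for y in sorted({h(v) for v in xs}):
            lhs = prob(lambda a, b, c: (a, b, c) == (x, y, z)) / pz
            rhs = (prob(lambda a, b, c: a == x and c == z) / pz
                   * prob(lambda a, b, c: b == y and c == z) / pz)
            ok = ok and lhs == rhs

print(ok)  # True: conditional independence holds in this toy example
```

Of course, a finite check like this says nothing about the general measure-theoretic conditions, which is exactly what I am asking about.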
Any help is much appreciated!