Decomposition of Random Variable (Information)?


I am wondering whether the following idea, or something similar, appears in a field such as statistics or information theory.

Take a random variable $Y$ that takes the value $1$ or $2$ with equal probability. Now consider a pair of independent random variables, $X_1$ and $X_2$. $X_1$ is either $A$ or $B$ with equal probability. $X_2$ is either "$(A,B)=(1,2)$" or "$(A,B)=(2,1)$" with equal probability. In other words, $X_2$ is needed to "interpret" $X_1$.

Now, observing $X_1$ or $X_2$ separately conveys no information about $Y$, but observing both tells us the realization of $Y$. (We can make this more formal by coupling $(X_1, X_2)$ with $Y$ appropriately, e.g. by setting $Y$ equal to the value that $X_2$ assigns to $X_1$.)
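One way to make this coupling concrete is a short enumeration of the joint distribution. The specific construction below ($X_1$ and $X_2$ independent and uniform, with $Y$ defined as the value $X_2$ assigns to $X_1$) is my assumption about how to formalize the setup, not something stated in the question:

```python
from itertools import product
from math import log2

# Hypothetical formalization (an assumption): X1 uniform on {'A', 'B'},
# X2 uniform over the two labelings, and Y the value X2 assigns to X1.
labelings = {0: {'A': 1, 'B': 2}, 1: {'A': 2, 'B': 1}}

# Joint distribution over (x1, x2, y): each (x1, x2) pair has probability 1/4.
joint = {}
for x1, x2 in product('AB', (0, 1)):
    y = labelings[x2][x1]
    joint[(x1, x2, y)] = 0.25

def marginal(idx):
    """Marginal distribution of the coordinates at positions idx."""
    p = {}
    for outcome, pr in joint.items():
        key = tuple(outcome[i] for i in idx)
        p[key] = p.get(key, 0.0) + pr
    return p

def mutual_information(idx_a, idx_b):
    """I(A; B) in bits, where A and B are groups of coordinates."""
    pa, pb, pab = marginal(idx_a), marginal(idx_b), marginal(idx_a + idx_b)
    return sum(pr * log2(pr / (pa[k[:len(idx_a)]] * pb[k[len(idx_a):]]))
               for k, pr in pab.items())

print(mutual_information((2,), (0,)))    # I(Y; X1)     -> 0.0
print(mutual_information((2,), (1,)))    # I(Y; X2)     -> 0.0
print(mutual_information((2,), (0, 1)))  # I(Y; X1, X2) -> 1.0
```

Each variable alone carries zero mutual information with $Y$, while the pair carries a full bit, which is exactly the "decomposition" described above.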

Does this kind of "decomposition" appear in some field? Any related topic is much appreciated (my background is marketing, but I thought it would be valuable to post this here). Thanks!

1 Answer

I would rather post this as a comment and present what I know is possible using information theory, but I cannot just yet, so here is a possible answer:

Using information theory, you can show that (given that $X_1$ and $Y$ are correlated) they provide the same "amount" of information. If I understand correctly, $X_2$ is a random variable that is true (let's say $1$) if $X_1 = Y$ and false ($0$) if $X_1 \neq Y$. So $X_2 \mid X_1, Y$ (the $\mid$ stands for "given") is actually deterministic!

In information theory, the mutual information is

...a measure of the mutual dependence between the two variables...

In our case, $I(X_2; X_1, Y)$, which denotes the mutual information between $X_2$ and the pair $(X_1, Y)$, attains its maximum value, $\log_2(2) = 1$ bit (since $X_1$ and $Y$ are both binary RVs with equal probabilities, $X_2$ is also binary with equal probabilities).
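This value can be checked directly via the chain $I(X_2; X_1, Y) = H(X_2) - H(X_2 \mid X_1, Y)$; the sketch below assumes, as above, that $X_1$ and $Y$ are independent and uniform on $\{1, 2\}$ with $X_2 = \mathbf{1}\{X_1 = Y\}$:

```python
from math import log2

# Assumed model: X1 and Y independent uniform on {1, 2}, X2 = 1 if X1 == Y else 0.
# Then I(X2; X1, Y) = H(X2) - H(X2 | X1, Y).
joint = {}
for x1 in (1, 2):
    for y in (1, 2):
        x2 = 1 if x1 == y else 0
        joint[(x2, x1, y)] = 0.25

# Marginal of X2: it is uniform on {0, 1}.
p_x2 = {}
for (x2, _, _), pr in joint.items():
    p_x2[x2] = p_x2.get(x2, 0.0) + pr

h_x2 = -sum(p * log2(p) for p in p_x2.values())  # H(X2) = 1 bit
h_x2_given = 0.0  # X2 is deterministic given (X1, Y), so H(X2 | X1, Y) = 0
print(h_x2 - h_x2_given)  # I(X2; X1, Y) -> 1.0
```

Since a binary variable can carry at most one bit, this is indeed the maximum possible value.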

Is this the kind of "decomposition" you were looking for?