Information entropy of sum of two discrete random variables


Is it true that $H(X_1+X_2, X_1) = H(X_1) + H(X_2)$, where $X_1$ and $X_2$ are independent discrete random variables and $H(X)$ denotes the Shannon entropy?

**Accepted answer**

Yes. Suppose $X_1$ and $X_2$ are random variables taking values in an additive group $G$ with at most countably many elements. By the chain rule,
$$
H(X_1+X_2,X_1)=H(X_1)+H(X_1+X_2\mid X_1).
$$
For the conditional term, note that
$$
H(X_1+X_2\mid X_1)=\mathbb E\bigl(-\log\mathbb P(X_1+X_2\mid X_1)\bigr),
$$
and, by the independence of $X_1$ and $X_2$,
$$
\mathbb P(X_1+X_2=z\mid X_1=x)=\mathbb P(X_2=z-x\mid X_1=x)=\mathbb P(X_2=z-x).
$$
Therefore
$$
H(X_1+X_2\mid X_1)=\mathbb E\bigl(-\log\mathbb P(X_2)\bigr)=H(X_2),
$$
and combining this with the chain rule gives $H(X_1+X_2,X_1)=H(X_1)+H(X_2)$.
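The identity is easy to check numerically. The sketch below (the two pmfs are arbitrary examples, not taken from the post) builds the joint distribution of $(X_1+X_2, X_1)$ for two independent discrete variables and compares its entropy to $H(X_1)+H(X_2)$:

```python
# Numerical check of H(X1 + X2, X1) = H(X1) + H(X2) for independent
# discrete random variables. The pmfs p1, p2 are hypothetical examples.
from math import log2

def entropy(probs):
    """Shannon entropy (in bits) of a probability mass function."""
    return -sum(p * log2(p) for p in probs if p > 0)

# X1 and X2 take integer values with these (made-up) pmfs.
p1 = {0: 0.5, 1: 0.3, 2: 0.2}
p2 = {0: 0.6, 1: 0.4}

# Joint pmf of (X1 + X2, X1); independence gives P = p1(x) * p2(z - x).
joint = {}
for x, px in p1.items():
    for y, py in p2.items():
        key = (x + y, x)
        joint[key] = joint.get(key, 0.0) + px * py

lhs = entropy(joint.values())
rhs = entropy(p1.values()) + entropy(p2.values())
print(abs(lhs - rhs) < 1e-12)  # prints True
```

The check works because $(x, y) \mapsto (x+y, x)$ is a bijection, so the joint pmf of $(X_1+X_2, X_1)$ is just a relabeling of the product pmf of $(X_1, X_2)$, and entropy is invariant under relabeling.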