How to calculate this entropy $H(X)=H(Y)+(p_{m-1}+p_{m})\cdot H(\frac{p_{m-1}}{p_{m-1}+p_{m}},\frac{p_{m}}{p_{m-1}+p_{m}})$


I'm given two random variables $X$ and $Y$ such that $X$ takes the values $1,\dots,m$ with probabilities $p_1,\dots,p_m$, and $Y$ takes the values $1,\dots,m-1$ with probabilities $p_1,\dots,p_{m-2},\,p_{m-1}+p_m$. (The original image with the definitions is missing; this is what the identity below requires.)

That is, $X$ and $Y$ assign the same probabilities to the outcomes $1,\dots,m-2$, and $Y$ merges the last two outcomes of $X$.

I'm trying to show that $H(X)=H(Y)+(p_{m-1}+p_{m})\cdot H(\frac{p_{m-1}}{p_{m-1}+p_{m}},\frac{p_{m}}{p_{m-1}+p_{m}})$.

Comparing the two entropies term by term, I get $H(X)=H(Y)-(p_{m-1}+p_{m})\log\frac{1}{p_{m-1}+p_{m}}+p_{m-1}\log\frac{1}{p_{m-1}}+p_{m}\log\frac{1}{p_{m}}$, but from here I can't figure out how to factor and reach my goal.

On BEST ANSWER

I have reached an answer thanks to @Stelios.

Since $H(Y)$ replaces the two terms $p_{m-1}\log\frac{1}{p_{m-1}}+p_{m}\log\frac{1}{p_{m}}$ of $H(X)$ with the single term $(p_{m-1}+p_{m})\log\frac{1}{p_{m-1}+p_{m}}$, the difference $H(X)-H(Y)$ is exactly the pair of log-ratio terms below, and factoring out $p_{m-1}+p_{m}$ gives the binary entropy:

$$
\begin{aligned}
H(X)&=H(Y)+p_{m-1}\log\frac{p_{m-1}+p_{m}}{p_{m-1}}+p_{m}\log\frac{p_{m-1}+p_{m}}{p_{m}}\\
&=H(Y)+(p_{m-1}+p_{m})\left(\frac{p_{m-1}}{p_{m-1}+p_{m}}\log\frac{p_{m-1}+p_{m}}{p_{m-1}}+\frac{p_{m}}{p_{m-1}+p_{m}}\log\frac{p_{m-1}+p_{m}}{p_{m}}\right)\\
&=H(Y)+(p_{m-1}+p_{m})\cdot H\!\left(\frac{p_{m-1}}{p_{m-1}+p_{m}},\frac{p_{m}}{p_{m-1}+p_{m}}\right).
\end{aligned}
$$
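As a sanity check, the grouping identity can be verified numerically. The distribution `p` and the helper `H` below are illustrative choices, not from the original post:

```python
import math

def H(probs):
    """Shannon entropy in bits, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical distribution of X on m = 4 outcomes.
p = [0.1, 0.2, 0.3, 0.4]
pm1, pm = p[-2], p[-1]
s = pm1 + pm  # p_{m-1} + p_m

# Y merges the last two outcomes of X into one.
HY = H(p[:-2] + [s])

# Right-hand side of the identity:
# H(Y) + (p_{m-1}+p_m) * H(p_{m-1}/s, p_m/s)
rhs = HY + s * H([pm1 / s, pm / s])

print(abs(H(p) - rhs) < 1e-12)  # the two sides agree up to rounding
```

The identity holds for any distribution, so swapping in a different `p` (summing to 1) should give the same agreement.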