Prove the equivalence between two definitions of graph entropy


I am trying to prove that the following two definitions are equivalent.

Let $G=(V,E)$ be a graph. Let $\mathcal{A}$ denote the collection of all maximal independent sets of $G$, and let $\mathcal{B}$ denote the collection of all independent sets of $G$. Let $X$ be a random variable taking values in $V$ with a fixed distribution $P$, and let $Y$ be a random variable taking values in a collection of independent sets, subject to the constraint that $X \in Y$ with probability $1$; that is, if $Y=J$ for some independent set $J$, then $P\left(X \in J \mid Y=J\right)=1$. Prove that $$ \min_{X,Y:\,Y\in \mathcal{A}} I \left(X;Y\right)=\min_{X,Y:\,Y\in \mathcal{B}} I \left(X;Y\right). $$

Can anyone guide me in proving this? Why would the minimum on the R.H.S. be achieved at a maximal independent set?


1 Answer


It comes down to the minimisation. Rough idea: first recall that graph entropy is defined given the graph and the probability distribution of $X$. Let $Y_1$ take values on independent sets, and let $Y_2$ take values on maximal independent sets. Since every independent set is contained in some maximal independent set, there is a (many-to-one) deterministic function $f:\mathcal{B} \to \mathcal{A}$ mapping each independent set to a maximal independent set containing it; set $Y_2 = f(Y_1)$. The constraint is preserved, because $X \in Y_1 \subseteq f(Y_1) = Y_2$ with probability $1$. Since $Y_2$ is a deterministic function of $Y_1$, conditioning on it can only reveal less information, so $H(X \mid Y_2) = H(X \mid f(Y_1)) \ge H(X \mid Y_1)$, which leads to the inequality $I(X;Y_1) \ge I(X;Y_2)$.
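To make the map $f$ concrete, here is a minimal Python sketch (the function names and the toy path graph are my own illustration, not from the argument above) that greedily extends a given independent set to a maximal independent set containing it:

```python
def is_independent(vertices, edges):
    """True iff no edge has both endpoints inside `vertices`."""
    return not any(u in vertices and v in vertices for u, v in edges)

def extend_to_maximal(ind_set, vertex_list, edges):
    """A concrete choice of f: greedily add vertices (in list order)
    while independence is preserved, yielding a maximal independent
    set that contains `ind_set`."""
    current = set(ind_set)
    for v in vertex_list:
        if v not in current and is_independent(current | {v}, edges):
            current.add(v)
    return frozenset(current)

# Toy example: the path 0-1-2-3.
edges = [(0, 1), (1, 2), (2, 3)]
vertices = [0, 1, 2, 3]

print(extend_to_maximal({1}, vertices, edges))  # {1} extends to {1, 3}
```

Any rule for breaking ties gives a valid $f$; the greedy order above is just one deterministic choice, which is all the argument needs.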

Now for any $Y_1$ taking values on independent sets, there exists a $Y_2$ taking values only on maximal independent sets with $I(X;Y_2) \le I(X;Y_1)$. This means the minimum over $\mathcal{B}$ is attained by a random variable supported on maximal independent sets. Combined with the trivial direction, $\mathcal{A} \subseteq \mathcal{B}$ (so the minimum over $\mathcal{B}$ is at most the minimum over $\mathcal{A}$), the two minima coincide. Voila.
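In symbols, the two directions of the argument combine as follows:

```latex
\begin{align*}
\mathcal{A} \subseteq \mathcal{B}
  &\;\Longrightarrow\;
  \min_{Y \in \mathcal{B}} I(X;Y) \;\le\; \min_{Y \in \mathcal{A}} I(X;Y), \\
Y_2 = f(Y_1),\ f:\mathcal{B}\to\mathcal{A}
  &\;\Longrightarrow\;
  I(X;Y_2) \le I(X;Y_1)
  \;\Longrightarrow\;
  \min_{Y \in \mathcal{A}} I(X;Y) \;\le\; \min_{Y \in \mathcal{B}} I(X;Y),
\end{align*}
```

and the two inequalities together force equality of the minima.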