I am having trouble getting some intuition as to what graph entropy measures. The definition that I have is that given a graph $G$, $H(G) = \min_{X,Y}I(X ;Y)$, where $X$ is a uniformly random vertex in $G$ and $Y$ is an independent set containing $X$. Also, $I(X; Y)$ is the mutual information between $X$ and $Y$ defined by: $I(X; Y) = H(X) - H(X|Y)$, where $H$ is the regular entropy function.
First, could anyone provide some intuition as to what this is actually measuring? Also, if we are minimizing over both $X$ and $Y$, why does $X$ have to be a uniformly random vertex? If we are minimizing $I$, wouldn't there be some fixed vertex $X$ and independent set $Y$ that achieve the minimum? So why is there a notion of a uniformly random vertex at all?
Thanks!
EDIT: I am using these notes as a reference for a reading group: http://homes.cs.washington.edu/~anuprao/pubs/CSE533Autumn2010/lecture4.pdf
$X$ is the source with maximal entropy (the uniform distribution over the vertices), and the edges mark pairs of symbols that must be distinguished from one another. An independent set $Y$ is therefore a set of mutually indistinguishable symbols, which can all share a single codeword. Graph entropy quantifies the per-symbol encoding rate of such a system, minimized over the choice of $Y$.
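To make the definition concrete, here is a small sketch (in Python, using only the standard library) that evaluates $I(X;Y)$ for two extreme cases rather than performing the full minimization: the complete graph $K_3$, where the only independent set containing $x$ is $\{x\}$ so $Y$ reveals $X$ and $I(X;Y) = H(X) = \log 3$, and the empty graph on three vertices, where $Y$ can always be the whole vertex set and $I(X;Y) = 0$. The function and variable names are my own choices for illustration.

```python
import math

def mutual_information(joint):
    """I(X;Y) in nats from a joint distribution given as {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

def joint_from_conditional(vertices, cond):
    """Build the joint distribution of (X, Y): X is uniform over the
    vertices, and cond[x] maps independent sets containing x (as
    frozensets) to conditional probabilities P(Y = S | X = x)."""
    n = len(vertices)
    return {(x, s): q / n for x in vertices for s, q in cond[x].items()}

V = [0, 1, 2]

# Complete graph K3: the only independent set containing x is {x},
# so Y determines X and I(X;Y) = H(X) = log 3.
cond_complete = {x: {frozenset([x]): 1.0} for x in V}
I_complete = mutual_information(joint_from_conditional(V, cond_complete))

# Empty graph on 3 vertices: Y = V is an independent set containing
# every x, so Y is constant and I(X;Y) = 0.
cond_empty = {x: {frozenset(V): 1.0} for x in V}
I_empty = mutual_information(joint_from_conditional(V, cond_empty))

print(I_complete, math.log(3))  # these agree: H(K3) = log 3
print(I_empty)                  # 0.0: the empty graph has entropy 0
```

This also suggests an answer to the second question: the minimization is over joint distributions of $(X, Y)$ subject to $X$ being uniform, not over individual vertex/set pairs, which is why the uniform marginal is part of the definition rather than something that gets minimized away.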