Information content of a function on graph vertices


Consider a graph $\mathcal{G}(V,E)$ and a function $F: V \rightarrow \mathbb{R}$ on the vertices $V$ of this graph that maps each vertex to a scalar value. I have the intuition that, from an information theoretic perspective, $F$ will have very low information content if the mapping is trivially to a single scalar, i.e. $F: V \rightarrow i \in \mathbb{R}$; conversely, the information content will be maximally high if the mapping is essentially random with respect to the structure of the underlying graph $\mathcal{G}$. However, I do not have any concrete arguments to justify this intuition. I am hoping for some resource (or answer) that points me in the right direction regarding, ideally, rigorous proofs for this intuition; additionally, methods to calculate the information content of the mapping $F$ will be greatly appreciated.


For simplicity, let's consider the case of Bernoulli random variables. You may consider a different model, but you then have to specify a probability distribution for the random variable.

The entropy of a random variable $X$ that takes on the states $x_1, x_2, \ldots, x_n$ with probabilities $p_1, p_2, \ldots, p_n$ is the expected amount of information needed to transmit its value:

$$I(X) = \sum_{i=1}^n p_i\log_2\left(\frac{1}{p_i}\right)$$

If the function $F$ maps every vertex to a single value, then some $p_i = 1$ and $I(X) = 0$. Conversely, if $X$ is Bernoulli$\left(p=\frac{1}{2}\right)$, then $$I(X) = \frac{1}{2}\log_2\left(\frac{1}{\frac{1}{2}}\right)+\left(1-\frac{1}{2}\right)\log_2\left(\frac{1}{1-\frac{1}{2}}\right) = 1$$ bit for a single vertex. For multiple independent vertices, the information is additive.
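As a concrete sketch of the formula above (the function name `entropy_bits` is my own), you can estimate the empirical entropy of a vertex labeling $F$ by treating the relative frequency of each value as its probability $p_i$:

```python
import math
from collections import Counter

def entropy_bits(values):
    """Empirical Shannon entropy (in bits) of a list of vertex labels,
    using relative frequencies as the probabilities p_i."""
    counts = Counter(values)
    n = len(values)
    # sum over states: p_i * log2(1 / p_i)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

# Constant labeling (F maps every vertex to one value): zero information.
print(entropy_bits([3.0] * 8))              # 0.0

# Balanced two-valued labeling, as with Bernoulli(1/2): 1 bit.
print(entropy_bits([0, 1, 0, 1, 0, 1, 0, 1]))  # 1.0
```

Note that this measures only the distribution of label values; capturing how $F$ relates to the structure of $\mathcal{G}$ (your "random with respect to the graph" case) would require conditioning on the graph, e.g. on neighboring labels.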
