Information content of a function on graph vertices

Consider a graph $\mathcal{G}(V,E)$ and a function $F: V \rightarrow \mathbb{R}$ that maps each vertex of $V$ to a scalar value. I have the intuition that, from an information-theoretic perspective, $F$ will have very low information content if the mapping is trivial, i.e. $F(v) = c$ for some constant $c \in \mathbb{R}$ and every $v \in V$; conversely, the information content will be maximally high if the mapping is essentially random with respect to the structure of the underlying graph $\mathcal{G}$. However, I do not have any concrete argument to justify this intuition. I am hoping for a resource (or an answer) that points me in the right direction, ideally toward rigorous proofs of this intuition; additionally, methods to calculate the information content of the mapping $F$ will be greatly appreciated.
Answer:
For simplicity, let's consider the case of Bernoulli random variables; you may use a different model, but you then have to supply a probability distribution for the random variable.
The entropy of a random variable $X$ that takes on the states $x_1, x_2, \ldots, x_n$ with probabilities $p_1, p_2, \ldots, p_n$ is the expected amount of information needed to transmit its value:
$$I(X) = \sum_{i=1}^n p_i\log_2\left(\frac{1}{p_i}\right)$$
If the function $F$ maps every vertex to a single value, then $p_i = 1$ for that value and $I(X) = 0$. At the other extreme, if $X$ is Bernoulli$\left(p=\frac{1}{2}\right)$, then $I(X) = \frac{1}{2}\log_2\left(\frac{1}{1/2}\right)+\left(1-\frac{1}{2}\right)\log_2\left(\frac{1}{1-1/2}\right) = 1$ bit for a single vertex. For multiple vertices with independent values, the information is additive, so $|V|$ such vertices carry $|V|$ bits.
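As a sanity check on these two extremes, here is a minimal Python sketch (the function name `entropy_of_labeling` is illustrative, not from any library) that estimates $I(X)$ from the empirical distribution of the values $F$ takes over the vertices:

```python
import math
from collections import Counter

def entropy_of_labeling(values):
    """Empirical Shannon entropy (in bits) of a vertex labeling.

    `values` is the multiset {F(v) : v in V}; the probability p_i of
    each distinct value is estimated by its relative frequency, and
    the entropy is sum_i p_i * log2(1 / p_i).
    """
    counts = Counter(values)
    n = len(values)
    # p_i = c / n, so log2(1 / p_i) = log2(n / c)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

# Constant labeling: a single value with p_1 = 1, so I(X) = 0 bits.
print(entropy_of_labeling([3.0] * 8))                 # 0.0

# Balanced binary labeling, like Bernoulli(1/2): 1 bit per vertex.
print(entropy_of_labeling([0, 1, 0, 1, 0, 1, 0, 1]))  # 1.0
```

Note that this empirical estimate treats the vertex values as i.i.d. samples and ignores the graph structure entirely; quantifying how random $F$ is with respect to $\mathcal{G}$ would additionally require a model of how values depend on those at neighboring vertices.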