What is the condition for a guaranteed reduction in entropy when shrinking the sample space?

As an example, if we have a fair die with 6 faces, each face corresponding to a number 1-6, its entropy is $H(X) = -\sum_{i=1}^{6} \frac{1}{6} \log_2\left(\frac{1}{6}\right) \approx 2.58$. If we have a fair die with only 5 faces, each face corresponding to a number 1-5, its entropy is $H(X') = -\sum_{i=1}^{5} \frac{1}{5} \log_2\left(\frac{1}{5}\right) \approx 2.32$. In this case, shrinking the sample space reduced the random variable's entropy. However, this is not true in general: think of an extremely unfair die with $P(1) = 0.995$ and $P(2) = \cdots = P(6) = 0.001$, and then remove face 1; the original entropy is only about $0.057$ bits, while the remaining faces form a uniform distribution on 5 outcomes with entropy $\log_2 5 \approx 2.32$ bits, so removing an outcome increased the entropy. Is there an existing result in information theory that gives a condition under which this reduction in entropy is guaranteed for discrete random variables? My intuition is that we need to place a constraint on the most probable outcome: its probability shouldn't exceed the probability of the least probable outcome by more than some factor that scales with the size of the sample space.
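A quick numerical check of the figures above (a minimal sketch in Python; the `entropy` helper is only for illustration and not part of the question):

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector p."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# Fair dice with 6 and 5 faces.
print(entropy([1/6] * 6))   # ~2.585 bits
print(entropy([1/5] * 5))   # ~2.322 bits

# Extremely unfair die: here removing the dominant face 1 *increases* entropy.
unfair = [0.995] + [0.001] * 5
print(entropy(unfair))      # ~0.057 bits

# Distribution of the remaining faces after removing face 1 (renormalized):
rest = [q / (1 - 0.995) for q in unfair[1:]]
print(entropy(rest))        # uniform on 5 faces: ~2.322 bits
```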
Suppose WLOG that $X$ takes $n$ values in $\{1,2,\cdots, n\}$, and let $Y$ be the variable that results from removing value $n$ (i.e. $X$ conditioned on $X \neq n$). Let $p_n = P(X=n)$.
Then $X$ can be considered as a non-overlapping mixture of two distributions: one on $n-1$ values (the rv $Y$) and one concentrated on the single value $n$. By the grouping property of entropy:
$$H(X) = h(p_n) + p_n \times 0 + (1-p_n) H(Y)=h(p_n) + (1-p_n) H(Y) \tag1$$
where $h()$ is the binary entropy function, and hence
$$H(X) > H(Y) \iff H(X) < \frac{h(p_n)}{p_n} =- \log(p_n) - \frac{1-p_n}{p_n}\, \log(1-p_n) \tag2$$
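As a sanity check, the grouping identity $(1)$ and the criterion $(2)$ can be verified numerically against a direct computation of $H(Y)$. A minimal Python sketch (the example distribution and helper names are my own, chosen only for illustration):

```python
import math

def H(p):
    """Shannon entropy in bits of a probability vector p."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def h(p):
    """Binary entropy function."""
    return H([p, 1 - p])

p = [0.4, 0.3, 0.2, 0.1]            # example distribution of X; last entry plays the role of p_n
pn = p[-1]
Y = [q / (1 - pn) for q in p[:-1]]  # distribution of X conditioned on not taking value n

# Identity (1): H(X) = h(p_n) + (1 - p_n) * H(Y)
print(H(p), h(pn) + (1 - pn) * H(Y))   # both ~1.8465

# Criterion (2): H(X) > H(Y)  iff  H(X) < h(p_n) / p_n
print(H(p) > H(Y), H(p) < h(pn) / pn)  # True True for this example
```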
In particular, if $X$ is uniform, $p_n = 1/n$ and the inequality takes the form
$$ \log n < \log n - (n-1)\log(1-1/n) $$
which is of course true, since $\log(1-1/n) < 0$.
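A quick numerical check of this for a few values of $n$ (again only a sketch):

```python
import math

# Check  log n < log n - (n - 1) * log(1 - 1/n)  for a few n (base 2; the base is irrelevant).
for n in range(2, 10):
    lhs = math.log2(n)
    rhs = math.log2(n) - (n - 1) * math.log2(1 - 1 / n)
    print(n, lhs < rhs)   # True for every n, since log2(1 - 1/n) < 0
```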
I'm not sure what more can be said of $(2)$ in general.