I am trying to prove that if the entropy $H$ of a random variable with $n$ possible outcomes is less than one bit, then one of the outcomes must have probability greater than 0.5. Intuitively this seems to be the case, and I checked it by hand for several distributions of the random variable. For $n = 2$ it can easily be seen from the graph of the entropy function. However, I couldn't prove it analytically for arbitrary $n$, and I could not find a proof online.
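For concreteness, here is the kind of brute-force check I mean, as a minimal Python sketch (it only searches for counterexamples, so it proves nothing):

```python
import math
import random

def entropy(probs):
    """Shannon entropy in bits; terms with p = 0 contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Search for a counterexample: a distribution with H < 1 bit
# in which no outcome has probability greater than 0.5.
for _ in range(50_000):
    n = random.randint(2, 6)
    weights = [random.random() for _ in range(n)]
    total = sum(weights)
    probs = [w / total for w in weights]
    if entropy(probs) < 1 and max(probs) <= 0.5:
        print("counterexample:", probs)
        break
else:
    print("no counterexample found")
```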
Proof for $H < 1$ (bits) $\implies$ $p_i > 0.5$ for some $i$
We'll prove the contrapositive: if $p_i \le 0.5$ for all $i$, then $H(p_1, \ldots, p_n) \ge 1$.
Say that you have $p_1, \ldots, p_n$ with $p_1 + \cdots + p_n = 1$ and $p_i \le 0.5$ for all $i$. Then you have
$$ H(p_1, \ldots, p_n) = \sum_i -p_i \log_2 p_i $$
and since $p_i \le 0.5$ and the function $f(x) = -\log_2 x$ is decreasing, you have $-\log_2 p_i \ge -\log_2 0.5$ for every $i$. So you get
$$ H(p_1, \ldots, p_n) = \sum_i -p_i \log_2 p_i \ge \sum_i -p_i \log_2 0.5 = -\log_2 0.5 \sum_i p_i = \log_2 2 = 1.$$
Informally, the entropy $H(p_1, \ldots, p_n)$ is a weighted average of the information contents $-\log_2 p_i$ of the individual events. If $p_i \le 0.5$ for all $i$, then the information content of each event is at least one bit, and so the entropy is at least one bit.
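If it helps, the bound can also be sanity-checked numerically. Here is a minimal Python sketch; the rejection sampling is just one convenient way to generate distributions with every $p_i \le 0.5$ and is not part of the proof:

```python
import math
import random

def entropy(probs):
    """Shannon entropy in bits; terms with p = 0 contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Check the contrapositive numerically: whenever every p_i <= 0.5,
# the entropy should come out at least 1 bit.
min_h = entropy([0.5, 0.5])  # for n = 2 the only such distribution; H = 1 exactly
trials = 0
while trials < 20_000:
    n = random.randint(3, 8)
    weights = [random.random() for _ in range(n)]
    total = sum(weights)
    probs = [w / total for w in weights]
    if max(probs) > 0.5:
        continue  # rejection step: keep only distributions with all p_i <= 0.5
    trials += 1
    min_h = min(min_h, entropy(probs))

print(f"smallest entropy observed: {min_h:.6f} bits (proved lower bound: 1 bit)")
```

Note that the bound is tight: under the constraint $p_i \le 0.5$ for all $i$, equality $H = 1$ holds exactly when two outcomes have probability $0.5$ each and the rest have probability $0$.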