Entropy of $f(x)=1$

Let $f(x)$ be the probability density function $f(x) = 1$ on $[0,1]$, and define entropy as $$H(p(x)) = -\int p(x) \log_2(p(x)) \, dx$$ where $p(x)$ is a pdf. Unless I've made an arithmetic error, the entropy of $f$ is zero: $$H(f(x)) = -\int_0^1 1\cdot\log_2(1) \,dx = 0.$$ Given that uniform distributions are supposed to maximize entropy, this seems counter-intuitive. Other than the mathematical definition, is there an intuitive explanation for why this is true? (Assuming it is true.)
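A quick numerical check of this computation (my own sketch, assuming NumPy and SciPy are available; `differential_entropy` is just a helper name, not from the original post):

```python
# Numerical sanity check of H(f) = -integral of f * log2(f) for f(x) = 1 on [0, 1].
# Not from the original post; assumes NumPy and SciPy.
import numpy as np
from scipy.integrate import quad

def differential_entropy(pdf, a=0.0, b=1.0):
    """Differential entropy in bits: -integral of pdf(x) * log2(pdf(x)) over [a, b]."""
    value, _ = quad(lambda x: -pdf(x) * np.log2(pdf(x)), a, b)
    return value

print(differential_entropy(lambda x: 1.0))  # ~0.0, matching the hand computation
```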
Differential entropy can actually be negative, which is one of its drawbacks. It just so happens that on $[0,1]$ every continuous distribution has nonpositive entropy, with equality exactly for the uniform distribution. Let $h(x)$ be any continuous density on $[0,1]$ and let $u(x)=1$ be the uniform density. Here is the proof in KL-divergence notation:
$$0\leq D_{KL}(h\,\|\,u)=\int_0^1 h(x) \log\!\left(\frac{h(x)}{u(x)}\right)dx=\int_0^1 h(x)\log(h(x))\,dx-\int_0^1 h(x)\log(u(x))\,dx=-H(h(x)),$$
since $u(x)=1$ makes the second integral vanish, and the first integral is $-H(h(x))$ by definition. So
$$H(h(x))\leq 0.$$
(The base of the logarithm only rescales the entropy, so the sign argument works equally well in base 2.)
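To make the claim concrete, here is a small numerical sketch (mine, not part of the original answer) checking both facts for a non-uniform density on $[0,1]$, the Beta(2,2) pdf $h(x)=6x(1-x)$: its entropy is negative, and $D_{KL}(h\,\|\,u)$ equals $-H(h)$.

```python
# Check H(h) < 0 and D_KL(h || u) = -H(h) for h = Beta(2, 2) on [0, 1].
# A sketch assuming NumPy and SciPy; everything is computed in bits.
import numpy as np
from scipy.integrate import quad

h = lambda x: 6.0 * x * (1.0 - x)  # Beta(2, 2) density, zero only at the endpoints

# Differential entropy H(h); quad samples interior points, where h(x) > 0.
H_h, _ = quad(lambda x: -h(x) * np.log2(h(x)), 0.0, 1.0)

# D_KL(h || u) with u(x) = 1, so log(h/u) = log(h).
D_hu, _ = quad(lambda x: h(x) * np.log2(h(x)), 0.0, 1.0)

print(H_h)   # about -0.18 bits: negative, as the proof predicts
print(D_hu)  # about +0.18 bits: nonnegative and equal to -H(h)
```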
By the way, nonnegativity of the KL divergence is a consequence of Jensen's inequality, applied to the convex function $-\log$:
$$D_{KL}(f\|g)=\int\log\!\left(\frac{f(x)}{g(x)}\right)f(x)\,dx=\int-\log\!\left(\frac{g(x)}{f(x)}\right)f(x)\,dx\geq -\log\!\left(\int \frac{g(x)}{f(x)}\,f(x)\,dx\right)=-\log\!\left(\int g(x)\,dx\right)=-\log(1)=0.$$
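A numerical spot check of this inequality (again my own sketch, with `scipy.stats` supplying two arbitrary densities on $[0,1]$):

```python
# Spot check D_KL(f || g) >= 0 for two Beta densities on [0, 1].
# A sketch assuming NumPy and SciPy; computed in nats to match the proof above.
import numpy as np
from scipy.integrate import quad
from scipy.stats import beta

f = beta(2, 2).pdf  # Beta(2, 2) density
g = beta(2, 5).pdf  # Beta(2, 5) density

kl, _ = quad(lambda x: f(x) * np.log(f(x) / g(x)), 0.0, 1.0)
print(kl)  # about 0.89 nats here; >= 0 for any pair by Jensen's inequality
```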